Friday, 4 November 2011

Contextualising Design Lecture 3rd November 2011

Innovation by Dan Berry


What Fingers Do

Smartphones
Dan Berry took us through a sort of timeline of different phones and how they developed. For example, he started off with his very first phone; I can't remember the name of it, but it was so old that it had an antenna! He then went on to the launch of smartphones and pointed out the keypad, and how you controlled and used those phones using only buttons. Finally he came to the iPhone and commented on how, instead of having just buttons, you now had the whole screen and lots of different interfaces, and that the phone was more than a phone because it had all these different apps on it. 



The Book That You Read

Comic on an iPad
Dan then went on to iPads and how they are used. He showed a number of images of news and blog web pages displaying headlines such as 'iPad and Kindle are killing books'. He showed us a video of someone looking at a comic on the iPad, and how they could zoom into a particular image to look at it, then scroll across or down, or look at the whole thing at once. I personally think that this won't destroy the publishing industry and books, because there will still be people who want to have a physical book in their hands and turn real pages over to read the next page. Myself being one of them. 



The Book That Reads You

Kindle 2
This is where the lecture got interesting. Dan started to tell us about all the information and data that iPads, Kindles and iPhones are gathering. For example, the iPad and Kindle both have GPS, so if you are reading an article or book they know where you are reading it; they can also find out what time you read and what the lighting conditions are like. iPads have a tiny camera at the top so you can call your friends and family and have a video conversation...right? Yes, but that camera can also be used to pick up micro-expressions on your face, to tell how you feel about something you are reading. Another example is the video of the person looking at a comic on their iPad: Marvel can gather information about how long someone looks at a certain picture and how long it takes them to read a whole comic. Most people would read about all of this and feel uncomfortable, and I did too, but when you think about it, all of this information they are gathering can be used to improve certain aspects of the whole experience of reading a book, a comic, and so on. 

Another thing that came up was Facebook and how the site stores absolutely loads of information about just one person. I can't remember any of the examples and I didn't have enough time to write them down...mainly because there were so many! But it was incredible. 

Machines of Loving Grace

And then Dan ended the lecture with a poem, 'All Watched Over by Machines of Loving Grace' by Richard Brautigan.



"The best way to predict the future is to invent it"
-Alan Kay

Overall I enjoyed this lecture and found it very interesting. Imagine if, in the future, I create an animation and people watch it through something like the iPad: I could gather information about how the audience reacts to it, where they watch it and what time they watch it. There is also some new technology just coming out where a camera/computer can estimate what age a person is and what ethnicity they are...which would be even better for finding out who your audience is! Obviously not everyone is going to like the idea of their actions being watched, but I think that if you use this technology in moderation then it should be fine. 


