robots – EDUC 342: Child Development & New Technologies
https://ed342.gse.stanford.edu

“Her,” Robots, and Child Development
https://ed342.gse.stanford.edu/her-robots-and-child-development/
Thu, 28 Jan 2016 07:15:04 +0000

What I found particularly interesting, and terrifying, about the piece on robots and child development is how often we are blind to the potentially detrimental effects of technology and how easily we write them off. When I first read the article, I wasn’t convinced. From a technological point of view, the possibility of developing social robots is so fascinating and exciting that it is easy to focus on the good and believe that critics are overanalyzing the effects of a piece of technology. However, once I came across this paragraph:

“What will happen if children grow up interacting with robots as peers and even friends because of the robots’ sociality, but also objectifying if not dominating the robots because the children understand that the robots are a human technological creation? Imagine, for example, if an 8-year-old’s best friend is a social robot. The child plays with the robot every day and goes to it for comfort and companionship. The robot always does what the child wants and the child never needs to accommodate to the social interests and needs of the robot. Does that situation put into motion a master–servant relationship that we would not want to reify?”

I was taken back to the film “Her” and to the reality that we may well reach a point where the line between robot and human is so blurred that a child’s treating a robot as less than human could affect the way that child interacts with humans. It is a truly fascinating idea, and a terrifying one, because such a possibility is unlikely to stop developers from building the technology that could lead to it. So I guess I am pointing to a question we keep confronting: technology is changing, and not always for the better. How do we keep up and prevent its detrimental effects from harming children in the process?

Goochee Week 4 Discussion Post
https://ed342.gse.stanford.edu/goochee-week-4-discussion-post/
Thu, 28 Jan 2016 04:11:33 +0000

Wowzers, week four’s readings have been the creepiest so far, by far. I am responding to the child near-future-robot scenario article.

I found the postulation that children will form moral and social relationships with their toys completely ridiculous, until the authors invoked their first example, the Tamagotchi, which I did love and interact with in a social and moral capacity! Scary.

I think the scariest part of the article was when the authors described socializing children with AIBO and a real dog at the same time, then asking the children questions about the qualities of each. The fact that over 60% of the children said that AIBO had “mental states and sociality” is pretty terrifying.

The exploration of Robovie wasn’t as shocking to me because I imagine parents and society will step in before robots become enslaved to children. The scenario in which, in the words of the authors, the child “never needs to accommodate to the social interests and needs of the robot” sounds like a really extravagant and expensive humanoid playmate that most children will never have. I see robot pets as a greater threat than humanoid robots because they simply seem more likely. I think society will be more careful in the dissemination of, and engagement with, humanoid robots, but who knows! It makes sense that Kurzweil is referenced in this piece.

DQC Week 4 – Robots in our future?
https://ed342.gse.stanford.edu/dqc-week-4-robots-in-our-future/
Wed, 27 Jan 2016 21:11:12 +0000

Kahn et al.’s argument that robots will be the future of our society and will be detrimental to childhood development was not very compelling to me. Though I agree that technological advancement has accelerated in recent years and that our increasingly urban lives have shifted us toward a “technological nature,” I do not agree that robots will have the capacity to replace “real life” animals and people in our everyday social lives. I am not convinced by Kahn’s hypothesis that children growing up will categorize social robots as a unified entity rather than as a combinatorial set of constituent properties. This ignores the impact of the child’s social environment, which provides a framework for what is human and what is not. Kahn used the color orange as an example of a new ontological category: children see it as its own entity and not as a combination of yellow and red. This is true only to an extent, because children do eventually learn that it is such a combination. I think the same holds for the social robot; even if children do see it as its own entity, they will eventually learn that it is a machine with “human” features engineered by humans.

Additionally, no matter how advanced these robots are, they can only engage with and respond to children to a limited extent. The human brain is incredibly complex, and if we do not yet completely understand how it works, how can we create artificial intelligence that emulates and replaces living creatures?

Thoughts on emotional robots
https://ed342.gse.stanford.edu/thoughts-on-emotional-robots/
Sun, 24 Jan 2016 02:30:04 +0000

The Kahn et al. article really had an impact on me. It made me think about how our logic can be hijacked by our emotions, about how vulnerable young children’s emotions are, and about where the world is heading. It makes complete sense that the children would feel an emotional attachment to the robots, but what is it about the robots that causes this attachment? Is it the fact that they are communicating with the robots? We are very social beings, and the fact that we interact with these machines on a social level, communicating with them and getting feedback, is perhaps the trigger that hijacks our logic (they ARE machines and not “alive” after all, and we know it). The world seems to be heading in this direction, whether we like it or not. How will our emotional growth be affected by the advent of this new ontological category? In short, I think this article does a great job of shedding light on what I believe will be a very important issue in the not-too-distant future.
