Research

Our research covered a variety of topics, including child development and psychology and current trends in human interaction with artificial intelligence. We found that children need to be taught empathy and how to take criticism constructively at a young age in order to interact well with others. We also studied modern parenting trends and methods, which showed that parents tend to spend less time with their children because of the demands of work and other obligations, leading children to form stronger bonds with hired caretakers. In addition, we found that children face growing pressure to get into prestigious colleges, which places increased emphasis on learning social skills, networking, and taking on as many extracurriculars or jobs as possible alongside earning good grades in school.

As for children’s interactions with artificial intelligence, AI toys and responsive chatbots for children already had several precedents. Mattel had begun developing AI toys such as a holographic Barbie doll and Aristotle, an AI nanny for children. However, focus groups protested the mass release of these products over parents’ concerns about data collection and the monitoring of their children. There were other concerns about children interacting with AI at an early age, such as becoming emotionally attached to an inanimate object. Children were already greeting Alexa and Google Home like human friends. These AIs also did not respond to children the way humans would, which encouraged kids to treat them with impatience and without empathy. One mother wrote about her son flipping his “pet” robot onto its back and gleefully watching it struggle to right itself. The robot expressed no pain and made no request for help, which enabled the child to keep pushing it around. Chatbots also cannot truly understand the nuances of a situation or a person’s inner feelings. Woebot, for example, an AI designed to help redirect negative emotions, cannot detect issues such as abuse or eating disorders from a conversation, so it cannot offer the care or empathy that a child or other user would need.
