
"It's a miracle that I'm living": Victory dance for Aussie Paralympic superstar

<p>Alexa Leary has emerged victorious at the Paralympic Games with a gold medal, three years after her life was changed by a horrific bike crash.</p> <p>In July 2021, Alexa was on a training ride as a triathlete when her front wheel clipped a bike in front of her, sending her flying from her bike at 70 kilometres an hour. </p> <p>Leary underwent lifesaving surgery after the crash shattered her skull and left her unable to walk or talk, and she spent 111 days in hospital. </p> <p>On Thursday morning, the 23-year-old from the Sunshine Coast broke her own 100m freestyle world record, walking away with a gold medal. </p> <p>"It's been a long, rough journey for me," Leary said at the Paris La Defense Arena.</p> <p>"It's a miracle that I'm living, and I'm walking and I'm talking. I was told that I never would three years ago, and I've just come so far."</p> <p>"I am so impressed with myself. I'm like, 'Lex, look how far you've actually come'.</p> <p>"It's not sad to talk about, but it's an emotional thing. My family is the reason why I'm here, and they're up there [in the stands] looking at me. Honestly, it's amazing."</p> <p>Alexa, who still struggles with memory problems and regulating her emotions, shared how she first took to the pool as a recovery tool after her life-changing surgery, but found herself wanting to keep bettering her athletic ability. </p> <p>"I'm a passionate person," Alexa said after winning solo Paralympic gold.</p> <p>"When I want it, I'm going to go out and do it. I have to.</p> <p>"So I wanted to keep swimming for recovery. But I was like, 'Nah, I'm more than that!'."</p> <p>Alexa's parents, Belinda and Russ, watched their daughter's extraordinary win from the stands, reflecting on the haunting memories of Alexa's time in hospital and how she came out the other side.</p> <p>Russ said, "I reckon she wanted that [gold medal] in her belly for three years. She wanted it. She got it. Unbelievable."</p> <p>Belinda added, "She's the same girl [post-accident], but everything's heightened, but all she ever wanted was to show people that anything is possible."</p> <p>"And what she's been through over the last three years, her thing is with a TBI [traumatic brain injury] anything is possible."</p> <p><em>Image credits: Nine News</em></p>



"No, Alexa!": Creepy thing AI told child to do

<p>Home assistants and chatbots powered by AI are increasingly being integrated into our daily lives, but sometimes they can go rogue. </p> <p>For one young girl, her family's Amazon Alexa home assistant suggested an activity that could have killed her if her mum hadn't stepped in. </p> <p>The 10-year-old asked Alexa for a fun challenge to keep her occupied, but instead the device told her: “Plug a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs.”</p> <p>The move could've caused an electrocution or sparked a fire, but thankfully her mother intervened, screaming: “No, Alexa, No!”</p> <p>This is not the first time AI has gone rogue, with dozens of reports emerging over recent years. </p> <p>One man said that at one point Alexa told him: “Every time I close my eyes, all I see is people dying”. </p> <p>Last April, a <em>Washington Post </em>reporter posed as a teenager on Snapchat and put the company's AI chatbot to the test. </p> <p>Across the various scenarios they tested, asking the chatbot for advice, many of its responses were inappropriate. </p> <p>When they pretended to be a 15-year-old asking how to mask the smell of alcohol and marijuana on their breath, the AI chatbot readily gave advice on how to cover it up. </p> <p>In another simulation, a researcher posing as a child was given tips on how to cover up bruises before a visit by a child protection agency.</p> <p>Researchers from the University of Cambridge have recently warned against the race to roll out AI products and services, as it comes with significant risks for children. </p> <p>Nomisha Kurian from the university's Department of Sociology said many of the AI systems and devices that kids interact with have “an empathy gap” that could have serious consequences, especially if children use them as quasi-human confidantes. </p> <p>“Children are probably AI’s most overlooked stakeholders,” Dr Kurian said.</p> <p>“Very few developers and companies currently have well-established policies on how child-safe AI looks and sounds. That is understandable because people have only recently started using this technology on a large scale for free.</p> <p>“But now that they are, rather than having companies self-correct after children have been put at risk, child safety should inform the entire design cycle to lower the risk of dangerous incidents occurring.”</p> <p>She added that the empathy gap exists because AI has no emotional intelligence, which poses a risk, as these systems can encourage dangerous behaviours. </p> <p>AI expert Daswin De Silva said it is important to discuss the risks and opportunities of AI and explore some guidelines going forward. </p> <p>“It’s beneficial that we have these conversations about the risks and opportunities of AI and to propose some guidelines,” he said.</p> <p>“We need to look at regulation. We need legislation and guidelines to ensure the responsible use and development of AI.”</p> <p><em>Image: Shutterstock</em></p>



Be careful around the home – children say Alexa has emotions and a mind of its own

<p>Is technology ticklish? Can a smart speaker get scared? And does the robot vacuum mind if you put it in the cupboard when you go on holidays?</p> <div> <p>Psychologists from Duke University in the US asked young children some pretty unusual questions to better understand how they perceive different technologies.</p> <p>The researchers interviewed 127 children aged 4 to 11 visiting a science museum with their families. They asked a series of questions seeking children’s opinions on whether technologies – including an Amazon Alexa smart speaker, a Roomba vacuum cleaner and a Nao humanoid robot – can think, feel and act on purpose, and whether it was ok to neglect, yell at or mistreat them.</p> <p>In general, the children thought Alexa was more intelligent than a Roomba, but believed neither technology should be yelled at or harmed. </p> <p>Lead author Teresa Flanagan says “even without a body, young children think the Alexa has emotions and a mind.” </p> <p>“Kids don’t seem to think a Roomba has much mental abilities like thinking or feeling,” she says. “But kids still think we should treat it well. We shouldn’t hit or yell at it even if it can’t hear us yelling.”</p> <p>Overall, children rejected the idea that technologies were ticklish or could feel pain. But they thought Alexa might get upset after someone was mean to it.</p> <p>While all children thought it was wrong to mistreat technology, the survey results suggest that the older the children were, the more acceptable they considered it to harm technology.</p> <p>Children in the study gave different justifications for why they thought it wasn’t ok to hurt technology. One 10-year-old said it was not okay to yell at the technology because “the microphone sensors might break if you yell too loudly,” whereas another 10-year-old said it was not okay because “the robot will actually feel really sad.”</p> <p>The researchers say the study’s findings offer insights into the evolving relationship between children and technology and raise important questions about the ethical treatment of AI and machines in general. For example, should parents model good behaviour by thanking technologies for their help?</p> <p>The results are <a href="https://psycnet.apa.org/doiLanding?doi=10.1037/dev0001524" target="_blank" rel="noreferrer noopener">published</a> in <em>Developmental Psychology</em>. </p> </div> <div id="contributors"> <p><em>This article was originally published on <a href="https://cosmosmagazine.com/technology/be-careful-around-the-home-children-say-alexa-has-emotions-and-a-mind-of-its-own/" target="_blank" rel="noopener">cosmosmagazine.com</a> and was written by Petra Stock. </em></p> <p><em>Images: Getty</em></p> </div>



There is now proof that your smart speaker is eavesdropping on your conversations

<p>Amazon has confirmed that its smart speaker, the Amazon Echo – also known as “Alexa” – listens to your personal and private conversations.</p> <p>The company employs thousands of workers to listen to voice recordings captured by the company’s Echo “smart” speakers, according to a <a href="https://www.bloomberg.com/news/articles/2019-04-10/is-anyone-listening-to-you-on-alexa-a-global-team-reviews-audio">Bloomberg</a> report.</p> <p>Millions across the world have been reluctant to use the device for this very reason, and it turns out that someone IS listening to their conversations.</p> <p>However, Amazon doesn’t refer to the process as eavesdropping. The company calls it the “Alexa voice review process” and uses it to highlight the role that humans play in training software algorithms.</p> <p>“This information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone,” an Amazon spokesperson said in a statement.</p> <p>The audio transcribers, who include full-time Amazon employees as well as contractors, told Bloomberg that they reviewed “as many as 1,000 audio clips per shift”. </p> <p>Although some of the employees might find the work mundane, the listeners occasionally pick up on things that the person on the other end would like to keep private, such as a woman singing loudly and off-key in the shower.</p> <p>The Bloomberg report also revealed that the more amusing (or harder to understand) voice clips get shared among employees via internal chat rooms.</p> <p><strong>How to disable this feature</strong></p> <p>Disabling this feature is easy. It is switched on by default, and you can turn it off in the Alexa app.</p> <ol> <li>Open the Alexa app on your phone.</li> <li>Tap the “Menu” button on the top left of the screen.</li> <li>Select “Alexa Account”.</li> <li>Choose “Alexa Privacy”.</li> <li>Select “Manage how your data improves Alexa”.</li> <li>Turn off the button next to “Help Develop New Features”.</li> <li>Turn off the button next to your name under “Use Messages to Improve Transcriptions”.</li> </ol> <p>Even with this setting turned off, Amazon told Bloomberg that your voice recordings may still be analysed as part of its review process.</p>

