NSTA partnered with Amazon Future Engineer and Massachusetts Institute of Technology (MIT) on the Alexa for Astronauts program to create the Alexa for Astronauts: Using AI to Monitor Health NSTA Daily Do Playlist for high school life science courses. This playlist (lesson set) and other new STEM curricula designed for the Alexa for Astronauts program allow high school educators and their students to dive deeper into computer science learning and the NASA Artemis I mission.
Alexa for Astronauts: Using AI to Monitor Health is a series of three lessons that require students to build and/or apply ideas about life science, artificial intelligence (AI), and computer science* using Amazon Alexa** and MIT App Inventor to solve a problem new to astronauts: monitoring their physical health with no Earth-based support.
*No prior coding experience is necessary
**Amazon Alexa devices are not required
The goals of this STEM in the High School Life Science Classroom - Implementing the Alexa for Astronauts: Using AI to Monitor Health NSTA Daily Do Playlist (Part 1) PLU are as follows:
Participation in this PLU requires the use of MIT App Inventor. Register for a free account on the MIT App Inventor (https://space.appinventor.mit.edu/login) web page.
The “Focus on Learning” questions are intended to prime your thinking and provide an opportunity to track your thinking over time (in this case, over the course of the PLU). Take the time to reflect and articulate your current thinking.
Let's prime our thinking as students in the lesson are asked to do:
Navigate to the PLU: STEM in the High School Life Science Classroom Jamboard.
Follow the directions provided in the upper right corner of the Jamboard frame. Note posts similar to and different from your own.
Consider what you might do if you were physically unwell. This is the question students are asked right before guidance is provided to shift students’ thinking from themselves to a problem facing astronauts.
“Space missions are being planned that will take humans farther away from Earth than past missions. Immediate communication between spacecraft and Earth might not always be possible at these distances.” (Lesson Plan 1, page 5)
Answer the next question as a high school student might after considering and sharing ideas about feeling physically well or unwell.
Navigate to Lesson Plan 1: How do we know when we are physically well or unwell? Can we use AI to input data and receive output to help monitor our physical wellness? (Lesson Plan 1) on the NSTA website. Find the Download PDF link near the top of the page to access the lesson plan (see image below).
Read Lesson Plan 1, pages 1-6. Compare the questions you entered for Question 1.e with the anticipated student questions presented on page 5.
Anticipated student questions are used to design instructional sequences that are coherent from the students’ perspective.
(Source: Thinking Like A Kid, nextgenstorylines.org, and NextGen)
For additional information and guidance on coherence from the students’ perspective, see Resources 2 and 3 in the PLU: STEM in the Life Science Classroom Parts 1 and 2 Collection.
Think about your own experience with 1. Focus on Learning (previous learning segment) and then answer the following question.
Watch the Canadian Space Agency video David Saint-Jacques explains how the Bio-Monitor smart shirt keeps an eye on astronauts’ vital signs. You may also choose to read the companion article Bio-Monitor: Keeping an eye on astronauts’ vital signs.
At this point in the lesson, teachers are guided to provide an opportunity for students to return to the class-generated list of ways physical health is monitored on Earth and the student-generated questions about how astronauts monitor their health in space, and then make connections with the information shared in the video.
In the Canadian Space Agency video, David Saint-Jacques shares that all the data collected by the Bio-Monitor smart shirt “can be quickly sent back to scientists on Earth.”
This piece of information brings students’ thinking back to the engineering challenge: How can we use AI to monitor and provide feedback on astronauts’ health when communication with mission control (scientists on Earth) is not possible?
Before continuing to read Lesson Plan 1, pause to consider your own ideas about AI.
In this lesson (especially the first time teaching it), the teacher and students together build and revise ideas about AI and computer science. Learning and figuring things out together supports a “We Culture” in the science classroom, in keeping with the vision of science teaching and learning shared in A Framework for K-12 Science Education.
(Source: Questions to Guide the Development of a Classroom Culture that Supports “Figuring Out,” nextgenstorylines.org and NextGen)
Read Lesson Plan 1, pages 10-11 (Steps 1-3). Use the lesson plan and ideas presented in the “We Culture” image above to answer the following question.
By the end of Step 3 (page 11), the class most likely will not have a complete understanding of AI. The lesson developers use this uncertainty to motivate students to investigate AI further, in this case by building an Alexa Skill.
What is an Alexa Skill? Skills are like apps for Alexa. Alexa uses an interactive voice interface to provide users a hands-free way to interact with a skill. For more detailed information, visit the Amazon Alexa Developer webpage “Alexa Skills Kit”.
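To make the skill concept concrete, here is a minimal, hypothetical sketch of the request/response cycle behind a simple voice skill: a spoken utterance is matched to an intent, and an intent handler returns the text Alexa would speak back. All names here (GREETING_UTTERANCES, match_intent, handle_intent) are illustrative assumptions, not part of the Alexa Skills Kit API.

```python
# Illustrative sketch of a skill's utterance -> intent -> response flow.
# This is NOT Alexa Skills Kit code; names and structure are hypothetical.

GREETING_UTTERANCES = {"hello", "hi", "hey"}  # sample utterances for one intent


def match_intent(utterance: str) -> str:
    """Map a normalized utterance to an intent name (exact match only)."""
    if utterance.strip().lower() in GREETING_UTTERANCES:
        return "GreetingIntent"
    return "FallbackIntent"


def handle_intent(intent: str) -> str:
    """Return the spoken response for a matched intent."""
    responses = {
        "GreetingIntent": "Hello! How are you feeling today?",
        "FallbackIntent": "Sorry, I didn't understand that.",
    }
    return responses[intent]


print(handle_intent(match_intent("Hello")))  # matched greeting
print(handle_intent(match_intent("Howdy")))  # falls through to fallback
```

In a real skill, the voice interface handles speech-to-text and the Alexa service performs the intent matching; this sketch only shows the shape of the interaction students will build in MIT App Inventor.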
Read Lesson Plan 1, pages 11-13 on using Alexa-MIT App Inventor to create a skill that allows the user to talk to and receive an appropriate response from an Alexa skill.
The focus of the following segment is to gain experience with Alexa-MIT App Inventor that can be used to help students troubleshoot errors they encounter while building the skill themselves.
Follow the instructions on the AFE_AforA_Day 2 Presentation to build the skill described in Lesson Plan 1 and then answer the following questions.
Run the Alexa skill using the testing window, which creates a record of your dialogue with Alexa. Take a screenshot of the completed run and save it as a .doc, .docx, or PDF file.
If after multiple attempts you are unable to successfully run the skill, click the orange Help button at the bottom right of the page and submit a help request. In the help request: type “Alexa-MIT App Inventor help”; describe the problem you are experiencing in as much detail as you can; if possible, send screenshots or screencast videos (Vimeo link); and include your name and a contact email so we can get back to you to help resolve the problem.
On slide 11 of the AFE_AforA_Day 2 Presentation, note that students are asked to consider the many different ways people say “Hello”. Student pairs or small groups are then asked to choose three of the ways identified to say “Hello” (three utterances) to build their Alexa skill.
Shifting students’ thinking to how only three utterances could limit the skill’s usability for a wider population introduces the need for AI.
Test your Alexa skill in MIT App Inventor repeatedly*, each time using utterances (for “Hello”) different from those programmed into the skill you built. Try utterances both very similar to and very different from the programmed ones.
*In rapid succession or over the course of many days.
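The testing above can be mirrored in a small, hypothetical sketch (this is not how MIT App Inventor or Alexa implements matching) showing why three fixed utterances fail for a wider population: an exact-match skill rejects even a near-miss greeting, while a simple string-similarity check, standing in for the generalization a trained AI model provides, tolerates it. The names PROGRAMMED, exact_match, and fuzzy_match are assumptions for illustration.

```python
import difflib

# Three programmed utterances, as in the lesson's skill-building activity.
PROGRAMMED = ["hello", "hi there", "hey"]


def exact_match(utterance: str) -> bool:
    """A rigid skill: only the programmed utterances succeed."""
    return utterance.strip().lower() in PROGRAMMED


def fuzzy_match(utterance: str, threshold: float = 0.75) -> bool:
    """Accept utterances sufficiently similar to a programmed one
    (a toy stand-in for the flexibility AI-based matching provides)."""
    u = utterance.strip().lower()
    return any(
        difflib.SequenceMatcher(None, u, p).ratio() >= threshold
        for p in PROGRAMMED
    )


print(exact_match("helo"))   # False: one typo breaks the rigid skill
print(fuzzy_match("helo"))   # True: similarity-based matching tolerates it
```

The contrast motivates the lesson’s pivot: hand-listing every way people say “Hello” does not scale, which is exactly the gap AI-based language understanding fills.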
Reread Lesson Plan 1 and look for the design choices that support coherence from the students’ perspective (many of which are called out in this PLU).
Upon completion and submission of the task associated with this self-guided professional learning unit, you will receive a certificate awarding two credit hours of professional learning (also known as continuing education credits).
Please allow up to 30 days for your certificate to be awarded. You will receive an e-mail from NSTA as soon as your certificate is available for download.