In EME6465 (Interactive Learning Technologies), I had the opportunity to code in several distinct programming environments designed for learners of various ages.
Using Scratch, I designed a Trauma Transport game that could be used to help new EMS providers identify areas of our diverse county that may or may not be appropriate for air ambulance transport. Admittedly, the exact coordinates were not verified for accuracy, but the general idea is to recognize that certain outlying regions deserve consideration. The interactivity provides audio and text feedback (good or bad) depending upon the decisions made, which follows the Universal Design for Learning (UDL) principles suggested by CAST.
Using Tynker, I took the interactivity a step further by designing the Chef Mathmo’s Pizza game, which uses scaffolding to introduce basic math concepts to elementary school students. Again, principles of UDL were incorporated by providing simultaneous audio and visual feedback.
After covering the basics, we moved on to experimenting with VisualEyes. Using the unique programming language created by the University of Virginia, I developed a training tool for familiarization with emergency resources within the community. This tool can be used for community education or for new employee orientation. Part of this work involved creating an XML schema definition (.xsd) for the elements used in an extensible markup language (.xml) document referenced by the program.
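As an illustration, a minimal schema definition of this kind might look like the following. The element and attribute names here are hypothetical examples, not those from the actual project:

```xml
<!-- Hypothetical minimal .xsd; element names are illustrative only -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <!-- A list of community emergency resources -->
  <xs:element name="resources">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="resource" maxOccurs="unbounded">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="name" type="xs:string"/>
              <xs:element name="address" type="xs:string"/>
            </xs:sequence>
            <!-- e.g. "hospital", "fire station", "shelter" -->
            <xs:attribute name="type" type="xs:string"/>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
```

A referencing .xml document would then be validated against this schema, ensuring each resource entry supplies a name and address.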
I was also able to gain some experience in storytelling through interactive learning environments by developing a 9-1-1 game using Alice 2.0. Due to the limited time for creation, only one scene was completed. In this scene, the learner’s responses receive feedback based on principles of consequence remediation. One overlooked opportunity was providing simultaneous audio and visual feedback, which would have been more in line with UDL principles.
We completed our series of mini-projects for interactive learning environments by creating an application using LiveCode. I developed the framework for an application to record a patient’s entered vital signs, symptoms, allergies, and medical history. The interactive learning operates on two levels. The patient receives education about any potentially life-threatening symptoms that are reported, as well as any negative trend in vitals that meets an algorithm’s criteria for generating a warning message. Healthcare professionals, in turn, can save time by reviewing medical files to synthesize findings that cumulatively result in a “high risk alert” for patients who meet specified criteria for heart disease, stroke, or other medical emergencies.
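To give a sense of how such a warning algorithm might work, here is a minimal sketch in Python. The threshold values, function name, and use of systolic blood pressure are illustrative assumptions, not the app’s actual logic:

```python
# Hypothetical sketch of a vital-sign trend check; thresholds and
# field choices are illustrative, not the app's actual algorithm.

def check_trend(systolic_readings, low=90, drop=20):
    """Warn if systolic BP is critically low or trending sharply down.

    systolic_readings: chronological list of systolic BP values (mmHg).
    Returns a list of warning messages (empty if nothing is flagged).
    """
    warnings = []
    latest = systolic_readings[-1]
    # Flag an absolute reading below the safe threshold.
    if latest < low:
        warnings.append(f"Systolic BP {latest} mmHg below safe threshold")
    # Flag a sharp downward trend across the recorded readings.
    if len(systolic_readings) >= 2 and systolic_readings[0] - latest >= drop:
        delta = systolic_readings[0] - latest
        warnings.append(f"Systolic BP dropped {delta} mmHg over recent readings")
    return warnings

print(check_trend([128, 118, 104]))  # falling trend triggers a warning
print(check_trend([122, 124, 120]))  # stable readings, no warnings
```

In a full application, checks like this would run over each vital sign as it is entered, and the accumulated flags would feed the “high risk alert” logic described above.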
Here’s the basic framework for the app that is still in development: