Last year’s Stanford Medicine X conference explored ways in which technology could be used to augment the attendees’ experiences. During breaks between sessions, organizers used specially developed software to transform television screens set up in the lobby outside the main auditorium into interactive spaces where participants could exchange ideas. On one screen, attendees used their mobile phones to text their reflections on previous sessions or respond to prompts such as: “What’s your dream for health care?” The texts appeared as yellow sticky notes on a virtual corkboard. Another screen served as a digital journal where participants could text comments about what they learned and have them displayed to a wider audience. As people walked up to the screen to read the contextually relevant content, they naturally started conversations. In an effort to bridge the divide between the people who were physically present at the conference and those who were watching the live-stream from other locations, an additional screen broadcast tweets from around the world in real time.
This year, conference organizers have developed three iPhone apps for Medicine X based on Apple iBeacon, a Bluetooth-powered location system. “When we heard about the iBeacon technology, it was clear that it would fit really well into a conference setting as well as being useful for allowing people to interact with the large-screen displays,” said Michael Fischer, a PhD student in computer science in the MobiSocial Lab at Stanford, who helped develop the app. “We brainstormed all the possible ways that the iBeacon technology could help people participate in the conference and came up with some ideas that we are excited to test out at the upcoming conference.”
In anticipation of this year’s conference, I reached out to Fischer to learn more about how the apps will further enhance attendees’ experience at Medicine X. Below he explains how they will facilitate networking among participants, allow them to provide feedback or rate speakers and serve as a sort of “flight-attendant call button.”
Can you briefly explain how the apps work?
One app allows us to extend the Wellness Room, so that people can request items without having to go to the room and miss part of a session. The Wellness Room provides special amenities, such as warm blankets or a place to rest, to assist patients in managing their conditions during the conference. The room was designed to help patients attend the conference in person who might otherwise not have been able to. For example, a previous ePatient attendee had a medical condition called cryoglobulinemia, which causes proteins known as cryoglobulins to thicken if the ambient temperature drops too low. If this were to occur, it could lead to kidney failure and would be life threatening, so it's crucial for this patient to keep warm. Using the iBeacon technology, we were able to develop a system that allows people to use an iPhone to request that a blanket or other item be delivered to their seat. There will be iBeacons on all the tables in the room, so the phone will automatically know where you are sitting. All requests will be forwarded to a volunteer, who will bring the item directly to the table.
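The logic Fischer describes — one beacon per table, with the phone inferring its table from the nearest beacon and attaching it to the request — can be sketched roughly as follows. This is an illustrative sketch only, not the actual Medicine X app code; the table map, identifiers, and function names are all assumptions.

```python
# Hypothetical sketch of the table-localization logic: each table has its
# own iBeacon, identified by a (major, minor) pair, and the phone picks
# the nearest known beacon from a ranging callback's distance estimates.

# Assumed mapping of (major, minor) beacon IDs to table labels.
TABLE_BEACONS = {
    (1, 1): "Table A",
    (1, 2): "Table B",
    (1, 3): "Table C",
}

def nearest_table(ranged_beacons):
    """Given (major, minor, estimated_distance_m) tuples from a
    beacon-ranging callback, return the table of the closest known
    beacon, or None if no known beacon is in range."""
    known = [b for b in ranged_beacons if (b[0], b[1]) in TABLE_BEACONS]
    if not known:
        return None
    major, minor, _ = min(known, key=lambda b: b[2])
    return TABLE_BEACONS[(major, minor)]

def request_item(item, ranged_beacons):
    """Build the request that would be forwarded to a volunteer."""
    table = nearest_table(ranged_beacons)
    if table is None:
        return None
    return {"item": item, "table": table}
```

With this shape, pressing the request button only needs to send `request_item("warm blanket", ...)` using whatever beacons the phone currently sees; the volunteer's view receives the item and the table, never the attendee's exact coordinates.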
Another app will be used during the breaks to help people get to know each other. The application works by displaying short bios on a nearby TV screen. In this way, the screen acts as a type of watering hole that people can gather around. When new people approach, their bios will be added to the screen. When a person leaves the proximity of the screen, the bio will be removed. We’ll have multiple screens set up around the conference. Our hope is that people can find a group that they might not yet be familiar with. The service is opt-in and people can switch to and from stealth mode at any time. Conference-goers will also have the option to forgo this app altogether.
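The bio-screen behavior described above — bios appear when an opted-in attendee enters a screen's beacon region, and disappear when they leave or go into stealth mode — amounts to maintaining a small set of visible attendees per screen. The sketch below is a guess at that logic, not the real implementation; the class and method names are invented for illustration.

```python
# Illustrative sketch of the proximity bio screen: one instance per
# physical screen, fed enter/exit events from beacon region monitoring.

class BioScreen:
    def __init__(self):
        self.visible = {}  # attendee id -> bio text currently on screen

    def on_enter(self, attendee_id, bio, stealth=False):
        # Only attendees who are not in stealth mode are displayed;
        # the service itself is opt-in, so non-participants never
        # generate enter events at all.
        if not stealth:
            self.visible[attendee_id] = bio

    def on_exit(self, attendee_id):
        # Leaving the screen's beacon region removes the bio.
        self.visible.pop(attendee_id, None)

    def on_stealth_toggled(self, attendee_id, stealth, bio=None):
        # Switching to stealth hides the bio immediately; switching
        # back restores it if the attendee is still nearby.
        if stealth:
            self.on_exit(attendee_id)
        elif bio is not None:
            self.visible[attendee_id] = bio

    def displayed_bios(self):
        return list(self.visible.values())
```

The design choice worth noting is that the screen holds no history: the display is purely a function of who is near it right now, which is what makes it work as a "watering hole" rather than a directory.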
Lastly, we have developed a feature that will be used at check-in. We want to create an experience that will surprise and delight people from the moment they step into the conference. There is a tradition at Stanford that when freshmen first arrive at their dorm, the dorm staff yell out their names. It is a pretty big surprise and makes you feel part of the community instantly. We wanted to replicate that experience as best we could for the conference.
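A minimal sketch of how such a beacon-triggered greeting could work, assuming the phone reports entering the check-in beacon region and a registry maps devices to attendee names (all identifiers here are hypothetical, not from the actual app):

```python
# Hypothetical check-in greeter: when a registered attendee's device is
# detected entering the check-in beacon region, produce a one-time
# greeting to display or announce.

class CheckInGreeter:
    def __init__(self, registry):
        self.registry = registry  # device id -> attendee name
        self.greeted = set()      # devices already welcomed

    def on_region_enter(self, device_id):
        """Return the greeting to announce, or None if the device is
        unknown or has already been greeted."""
        name = self.registry.get(device_id)
        if name is None or device_id in self.greeted:
            return None
        self.greeted.add(device_id)
        return f"Welcome to Medicine X, {name}!"
```

The `greeted` set matters: beacon region events can fire more than once as a phone lingers near the entrance, and the surprise only works the first time.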
What challenges did you encounter in developing the apps?
The main challenge was figuring out how to organize the placement of the iBeacons in the room so that when a person requested an item, all they would have to do was press a button. We arranged the beacons so that when someone presses a button, we can localize them in the room and a volunteer knows which region they are in. Another challenge was refining each of the ideas enough that it would be useful and support the mission of the conference. Overall, I hope the technology creates connections and interactions that otherwise would not have occurred.
In working on this project, what excites you about the potential of the apps?
What excites me the most about the potential of the people locator [function of the] app is its ability to help people connect with each other between sessions. Proximity-sensing screens blend well with the patient-to-patient mission of the conference. Finding the right person to connect with can be half the challenge. This feature will spark face-to-face interactions, both expected and unexpected. Additionally, the screens will provide everyone in the venue with ambient intelligence.
With the blanket app, I’m excited to see how it will be used and in what directions we can take it in the future. The app will make the conference more appealing to a number of people. Additionally, we’ll get many suggestions about what other types of things people would want to have delivered to them at the table. A wide swath of applications can now be built based around the phone’s ability to sense hyperlocality.
iBeacon today is where GPS was 10 years ago. It will be a platform for innovation that enables a new layer of services. For example, services such as Uber were an unanticipated application of GPS, one that probably wasn't even considered when the technology was being developed. I anticipate the same thing will happen with iBeacon: many unanticipated applications will be developed as it matures. What we'll be doing with it in a decade will be something we can barely imagine now.
More news about Stanford Medicine X is available in the Medicine X category.
Previously: Countdown to Medicine X: Global Access Program provides free webcast of plenary proceedings, Countdown to Medicine X: How to engage with the “no smartphone” patient, Medicine X symposium focuses on how patients, providers and entrepreneurs can ignite innovation and Medicine X spotlights mental health, medical team of the future and the “no-smartphone” patient
Photo by Stanford Medicine X
From August 11-25, Scope will be on a limited publishing schedule. During that time, you may also notice a delay in comment moderation. We’ll return to our regular schedule on August 25.