Every year, LUDLAB hosts its Demo Day, an event that showcases all the projects it has undertaken during the year. The second Demo Day was recently held at Universidad Austral’s Pilar Campus, in Buenos Aires Province. There, Mavek, a mobile app that provides a voice for people who cannot speak for themselves and whose mobility is severely impaired, was first introduced to the public. In 2017, two students will continue working on a device that enables quadriplegics and people with severely limited mobility to issue commands and move around autonomously in a wheelchair.
At this year’s Demo Day, the following games and developments were showcased:
Developed by two third-year IT Engineering program students, 21-year-old twins Michele and Brian Re, Mavek is an application for Android smartphones and tablets. Connected to a Myo device, an armband that reads very subtle arm movements, the app provides a voice for people with significantly restricted mobility and speech. Thus, for example, patients lying in a hospital bed can communicate with their nurses, relatives, and physicians.
Through a user-friendly interface suitable for people of all ages, users can choose from pre-set phrases, such as “I need some water”, or build new words by selecting letters; the resulting text is then played back as audio through a smartphone or tablet.
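As an illustration only (this is a hypothetical sketch, not Mavek’s actual code), the two input paths the article describes, pre-set phrases and letter-by-letter word building feeding text-to-speech, could look like this:

```python
# Hypothetical sketch of Mavek's two input paths: picking a pre-set
# phrase, or composing a word letter by letter before it is spoken.

PRESET_PHRASES = ["I need some water", "I am in pain", "Please call the nurse"]

class MessageBuilder:
    def __init__(self):
        self.letters = []

    def select_preset(self, index):
        """Return a pre-set phrase chosen by its position in the list."""
        return PRESET_PHRASES[index]

    def add_letter(self, letter):
        """Append one letter chosen via an armband gesture."""
        self.letters.append(letter)

    def finish_word(self):
        """Return the composed word and clear the buffer for the next one."""
        word = "".join(self.letters)
        self.letters = []
        return word

def speak(text):
    # In the real app this would call the device's text-to-speech engine;
    # here we simply return the text that would be spoken aloud.
    return text

builder = MessageBuilder()
print(speak(builder.select_preset(0)))  # pre-set phrase path
for ch in "water":
    builder.add_letter(ch)
print(speak(builder.finish_word()))     # letter-by-letter path
```

The names (`MessageBuilder`, `speak`) and the phrase list are invented for illustration; only the overall flow comes from the article.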
The device has already been tested with a patient at the Austral University Hospital who had undergone a tracheotomy. Among other words and short sentences, he managed to say his son’s name.
RAP (Pokémon Augmented Reality)
Nicolás Rudolph, 23, and Luciano Sartor, 22, both fourth-year students in UA’s IT Engineering program, came up with the idea for RAP after the stir caused by the Pokémon GO game. Unlike Pokémon GO, RAP focuses on interactive battles between Pokémon: its developers found ways to let players choose specific Pokémon and attacks.
Players log onto the app and focus their smartphone’s camera on a physical token placed on a table, and a Pokémon appears immediately. Another player does the same, with the same smartphone and a new token, and the two Pokémon engage in a fight. The game was developed in three weeks.
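The token-to-battle flow described above can be sketched in a few lines. This is purely illustrative (the token identifiers, Pokémon stats, and function names are assumptions, not RAP’s actual implementation):

```python
# Hypothetical sketch of RAP's flow: each recognized physical token spawns
# a Pokémon, and two spawned Pokémon then trade attacks until one faints.

SPAWN_TABLE = {
    "token_A": ("Pikachu", 35),     # token id -> (name, hit points)
    "token_B": ("Charmander", 39),
}

def spawn(token_id):
    """Return the Pokémon tied to a recognized token."""
    name, hp = SPAWN_TABLE[token_id]
    return {"name": name, "hp": hp}

def attack(attacker, defender, damage):
    """Apply one chosen attack; return True if the defender faints."""
    defender["hp"] -= damage
    return defender["hp"] <= 0

p1 = spawn("token_A")
p2 = spawn("token_B")
fainted = attack(p1, p2, damage=10)
print(p2["hp"], fainted)  # defender loses hit points but keeps fighting
```

In the real app the `spawn` step would be driven by camera-based marker recognition rather than a lookup table.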
Drawing inspiration from a 3-D Super Mario Bros video, Nicolás Burdoni and Federico Ruiz, both 21 and fourth-year students in UA’s IT Engineering program, set about bringing the classic Italian plumber’s game to a new technology using the Oculus Rift virtual reality headset. The name “Itsame” comes from Mario’s famous greeting, “It’s-a me, Mario!”: when players enter this virtual world, they really feel they have become the game’s character.
These two future engineers believe that VR technologies will become mainstream thanks to their potential contributions to fields ranging from medicine to video gaming.
As in the case described above, Tomás Najún (24, a fifth-year IT Engineering student) and Nicolás Rudolph chose to take a traditional game into the VR domain. They worked six hours a week for a month to develop their game.
For Najún and Rudolph, both VR and AR are well on their way to worldwide consolidation, although VR is moving forward more slowly because it requires more specialized hardware. VR and AR apps can and will bring many benefits, even changing the way people interact with each other.
JetLag is a work of art: a sculpture built around dynamic images and multiphonic sound, intended for display in places like building lobbies, airport lounges, and train station concourses. The piece imitates the split-flap electro-mechanical technology used in clocks and signs at train stations and airports since the 1960s.
JetLag looks like a sign of that kind, but instead of electro-mechanical modules it combines 85 smartphones programmed to work together. Each phone runs dedicated software that plays an animation simulating the mechanism of split-flap signs. Images are displayed and switched across the grid of phones, while each one plays the typical sound of split-flap modules.
Brian and Michele Re have also developed the Orb-Smart project. The twins learned about Orb-Slam 2, software developed at Spain’s Universidad de Zaragoza, and decided to add a new functionality to it. Orb-Slam 2 produces a map, a spatial graphic description built from dots plotted over collected images, while Orb-Smart adds location recognition by incorporating a mobile device’s camera.
Forthcoming: LUDLAB 2017
Students Juan Pablo Bastidas (21, third-year IT Engineering program) and Ignacio Berdiñas (20, second-year IT Engineering program) will continue working on a headset for quadriplegics and people with severe mobility impairment. The headset will let them move their wheelchairs on their own via commands captured from the users’ neural and electrical signals: if the user blinks twice, for example, the chair turns to the right. Currently, Bastidas and Berdiñas are recording these electrical and neural signals and storing them on a computer in order to build a set of commands.
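The signal-to-command idea, with the “blink twice to turn right” example, can be sketched roughly as follows. This is a hypothetical simplification: real systems must classify noisy neural and electrical signals, whereas here blink detection is reduced to counting timestamped blink events in a time window, and the command table is invented for illustration:

```python
# Hypothetical sketch: translating blink patterns into wheelchair commands.
# Only the "two blinks -> turn right" mapping comes from the article.

BLINK_COMMANDS = {
    1: "forward",
    2: "turn_right",   # the example given: blink twice, turn right
    3: "turn_left",
    4: "stop",
}

def blinks_in_window(events, window_start, window_end):
    """Count blink timestamps (in seconds) falling inside a time window."""
    return sum(1 for t in events if window_start <= t < window_end)

def command_for(events, window_start, window_end):
    """Translate the blink count in a window into a wheelchair command."""
    count = blinks_in_window(events, window_start, window_end)
    return BLINK_COMMANDS.get(count, "no_op")

# Two blinks within the first second -> turn right
print(command_for([0.2, 0.7, 3.1], 0.0, 1.0))
```

Grouping blinks into windows matters because the same signal stream must distinguish deliberate command patterns from isolated, involuntary blinks.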
In a way, this project builds on an earlier development by Pablo Cereltano, now a graduate, who created a device with two sensors to capture the electrical signals of relaxed or concentrated states. A smartphone game was also developed on top of that device: focused thoughts (doing mental math calculations, for instance) heat up a barrel until it explodes, while relaxed thoughts make a little mushroom take off and fly until it starts to drop when the relaxed state is interrupted.