No one at our frog Milan studio knew much about Black Friday until we saw a heavily discounted Parrot Mini-Drone on Amazon. We bought it with the idea of teaching our kids how to fly it, but their interest didn’t last long. So we brought it back to the studio with a vision. At first glance, the drone is beautiful and well assembled. It works flawlessly, but flying it becomes a little pointless once you get used to it. We wanted to make it fun again, so we thought about ways to transform a single-user drone into a multi-user game. It took us, the Design Technologists at frog, an afternoon to find a source of inspiration and decide how to proceed. We landed on the idea of creating a sort of augmented reality version of an iconic Nintendo game: Super Mario Kart.
The first thing we needed was an app: a drone remote-controller app that would apply the game rules, handle the sensor input, and deliver the augmented reality experience. It would be an Android app, since the openness of the platform lends itself to quicker development iterations and complex hacks (if you can master its subtleties).
The first step was the networking configuration: the drone’s Linux-based OS needed to connect to a 5GHz Wi-Fi network so that multiple drones could share the same network. As it turned out, the drones put that network under stress, since they simply transmit uncompressed frames of their onboard camera’s video stream. That is a lot of real-time data to handle, and we had to carefully tune the Wi-Fi configuration parameters.
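To get a feel for why the network needed careful tuning, here is a back-of-envelope bandwidth calculation. The resolution, color depth, and frame rate below are our own illustrative assumptions, not figures from the drone’s spec sheet:

```java
// Back-of-envelope bandwidth estimate for an uncompressed video stream.
// Assumed numbers (illustrative only): a 640x480 RGB feed at 30 fps.
public class StreamBudget {
    static long bitsPerSecond(int width, int height, int bytesPerPixel, int fps) {
        return (long) width * height * bytesPerPixel * fps * 8L;
    }

    public static void main(String[] args) {
        long bps = bitsPerSecond(640, 480, 3, 30);
        // ~221 Mbit/s per drone before any protocol overhead
        System.out.println(bps / 1_000_000 + " Mbit/s per drone");
    }
}
```

Even one such stream eats a large slice of a Wi-Fi channel’s real-world throughput, which is why a less congested 5GHz network and tuned parameters mattered once several drones were flying at once.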
After securing connectivity from our remote app to the drone, the next challenge was enabling the drones to “sense” each other. A quick survey of current state-of-the-art indoor positioning technologies pushed us towards a computer vision solution leveraging the onboard camera.
Using the popular OpenCV library, we scouted a couple of approaches, such as shape recognition and color blob recognition, but they were lacking in either accuracy or performance, as we needed them to work in the fast-paced environment of a flashy racing course. The best approach we found was to rely on fiducial markers, which are quite robust to varying light conditions and changes in the field of view. The ARToolkit framework could be bent to our purposes, and finally we were able to “see” the drones (as opposed to just a matrix of pixels on the screen).
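One nice property of fiducial markers of a known physical size is that their apparent size in the frame tells you roughly how far away the other drone is, via the standard pinhole-camera relation. The numbers below are illustrative assumptions, not values from the actual drone camera or our ARToolkit calibration:

```java
// Pinhole-camera sketch: estimate the distance to a marker from its
// apparent width in the image. distance = focalLength * realWidth / pixelWidth.
// All concrete numbers here are illustrative assumptions.
public class MarkerDistance {
    // focalPx: camera focal length expressed in pixels
    // markerMm: real-world marker width (the markers we used were 45 mm)
    // widthPx: width of the detected marker in the frame, in pixels
    static double distanceMm(double focalPx, double markerMm, double widthPx) {
        return focalPx * markerMm / widthPx;
    }

    public static void main(String[] args) {
        // A 45 mm marker seen 90 px wide by a camera with a 700 px focal
        // length would be roughly 350 mm away.
        System.out.println(distanceMm(700, 45, 90) + " mm");
    }
}
```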
One by one the pieces of the puzzle fell into place, and the codebase of the Android app grew into what was essentially a real-time data processing pipeline. For each frame we would get the video from the drone, process it, find the markers, and check the aim, while also processing the input events (touches and phone tilt) to both pilot the drone and implement the shooting mechanics.
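The tilt-to-pilot part of that input handling can be sketched as a simple mapping from phone tilt to a drone command, with a dead zone so the drone hovers when the phone is roughly level. This is a minimal sketch under assumed ranges, not the app’s actual control code:

```java
// Sketch of mapping phone tilt (in degrees) to a roll/pitch command in
// [-100, 100]. The dead zone keeps the drone steady around level, and the
// max-tilt clamp caps the command. Both thresholds are assumptions.
public class TiltMapper {
    static final double DEAD_ZONE_DEG = 5.0;
    static final double MAX_TILT_DEG = 45.0;

    static int toCommand(double tiltDeg) {
        if (Math.abs(tiltDeg) < DEAD_ZONE_DEG) return 0; // hover
        double clamped = Math.max(-MAX_TILT_DEG, Math.min(MAX_TILT_DEG, tiltDeg));
        return (int) Math.round(clamped / MAX_TILT_DEG * 100);
    }
}
```

A 2° wobble maps to 0 (hover), 22.5° maps to half power, and anything past 45° saturates at full command.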
Writing this kind of logic with the tools of standard Android programming would be cumbersome and error prone. Java is a language we love, but it is starting to show its age, and a callback-based programming paradigm quickly spins out of control in terms of asynchronous complexity and pipeline stage handling, especially since we were targeting a 60 fps experience.
Reactive Extensions is a paradigm we embraced years ago in the context of mobile app development, and it shines in challenges like this. We paired it with the emerging Kotlin, a modern language that fits particularly well with Android, and got a very expressive platform for our codebase: one that deals with different sources and sinks of asynchronous data and performs complex calculations, all without stalling the graphics pipeline or dropping the framerate. All of this translates into an enjoyable gaming experience.
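One concrete idea behind keeping the framerate high is backpressure: when frames arrive faster than they can be processed, drop the stale ones rather than queue them. Here is a minimal stdlib-only sketch of that “latest frame wins” strategy, as plain Java rather than the Rx operators the actual codebase used:

```java
import java.util.concurrent.atomic.AtomicReference;

// A single-slot "latest frame wins" buffer: the camera thread overwrites
// the slot as fast as frames arrive, and the processing thread always
// takes the newest one, silently dropping stale frames instead of letting
// a queue (and latency) build up. A sketch of the idea, not the real app.
public class LatestFrameSlot<T> {
    private final AtomicReference<T> slot = new AtomicReference<>();

    public void publish(T frame) { slot.set(frame); }  // producer: overwrite
    public T take() { return slot.getAndSet(null); }   // consumer: drain
}
```

If the producer publishes two frames before the consumer wakes up, `take()` returns only the newer one; a second `take()` returns null, signalling there is nothing fresh to process.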
In the end, we had a performant network, a drone remote control app, and a computer vision algorithm, all tied together by a reactive-first codebase. All that was left was the real, pure fun of coding the game rules and triggers! Targeting, shooting, track bonuses and maluses: it was time to get real.
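As a flavor of what a targeting rule can look like, here is a minimal sketch: a shot counts as a hit when the detected marker’s center falls inside a crosshair circle in the middle of the frame. The frame size and crosshair radius are illustrative assumptions, not the game’s real tuning:

```java
// Sketch of a hit test: the shot lands if the marker center is within
// radiusPx of the frame center. Uses squared distances to avoid a sqrt.
// Frame dimensions and radius are illustrative assumptions.
public class AimCheck {
    static boolean isHit(int frameW, int frameH, int markerX, int markerY, int radiusPx) {
        int dx = markerX - frameW / 2;
        int dy = markerY - frameH / 2;
        return dx * dx + dy * dy <= radiusPx * radiusPx;
    }

    public static void main(String[] args) {
        // Marker near the center of a 640x480 frame, 60 px crosshair: hit.
        System.out.println(isHit(640, 480, 330, 250, 60));
    }
}
```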
Once the Android remote app was functional, we started working on the drone hardware. We had two goals: to differentiate the drones so they would be easily recognizable, and to create a design that could accommodate the fiducial markers in an elegant way. The frog Milan Industrial Design team got involved and, after a couple of iterations, came up with a solution that met both goals: a 3D-printed “shield” we could easily attach and remove (to allow quick battery swaps), with a large surface on the back of each drone to accommodate the 45mm x 45mm markers without changing the flying experience.
After a few tests we realized that improving the game experience meant helping the pilots recognize the circuit. The ID team helped us by designing and building a cardboard race track that guides the pilots’ eyes around the course.
After a few weeks of hacking, developing, and testing, we finally finished the project and were ready to host the first frog drone race. In one evening we hosted more than 23 lap races, giving around 100 people the chance to try our game. As with all remarkable achievements, it took a collaborative effort: frog’s Technology, Industrial Design, and Visual Design disciplines together transformed a nerdy toy and a fuzzy game concept into a multiplayer augmented reality gaming experience.
All the code is available on frog’s GitHub.
Emanuele pushes pixels and data, in and out of boxes that he draws on whiteboards. He then builds teams to iteratively craft those drawings into computer programs.
Simone started writing software when COBOL was a hype word. Now his interests span from web bots to data visualization to embedded applications and IoT.