I helped Seeper design software solutions for the Alef Education Control Room from October 2019 to March 2020. The control room is a mix of audiovisual elements, lights and an LED wall.
I worked on designing a Knowledge Graph visualisation in Unity, which pulls live data from an API or a CSV file and displays it as a live graph.
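The actual implementation was in Unity/C#, but the CSV path can be illustrated with a minimal Python sketch. The column layout and sample data here are invented, purely to show the idea of parsing rows into an adjacency list that a graph view could then render:

```python
import csv
import io

def load_graph(csv_text):
    """Parse 'source,target' rows into an adjacency-list graph."""
    graph = {}
    for row in csv.reader(io.StringIO(csv_text)):
        if len(row) < 2:
            continue
        src, dst = row[0].strip(), row[1].strip()
        graph.setdefault(src, set()).add(dst)
        graph.setdefault(dst, set())   # ensure leaf nodes exist too
    return graph

# Invented sample data, not the real Alef dataset:
data = "Maths,Algebra\nMaths,Geometry\nAlgebra,Equations\n"
g = load_graph(data)
```

The same structure works whether the rows come from a CSV file or from an API response flattened into edges.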
I also worked on finishing a tablet app to control the room's different components.
The original Unity code structure and back-end API calls were handled by Gavin Woods.
Due to client requirements, no images of the actual installation can be shared.
I helped Tuur Van Balen and Revital Cohen with their exhibition at the Stanley Picker Gallery in May 2019. They took over the gallery with mixed-media pieces.
I helped with three Raspberry Pis running various programs at boot (via systemd), and developed further animations for the LED screen.
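A typical way to launch a program at boot on a Raspberry Pi is a systemd service unit along these lines. The paths and names below are illustrative placeholders, not the actual project files:

```ini
[Unit]
Description=LED animation app (example)
After=network.target

[Service]
ExecStart=/home/pi/apps/ledApp/bin/ledApp
Restart=on-failure
User=pi

[Install]
WantedBy=multi-user.target
```

Dropped into `/etc/systemd/system/` and enabled with `systemctl enable`, the unit restarts the app if it crashes and starts it on every boot.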
The project was really interesting and Tuur was awesome.
The code was mostly written using openFrameworks, mplayer, bash scripting and networking through OSC.
Technology used: openFrameworks, Raspberry Pi, FadeCandy, networking, bash.
As part of the Genuine X team, I worked on an interactive game. The installation consisted of a video projection on the floor, with overhead tracking of the user's position.
The game has three modes: Attract Mode (waiting), Gaming Mode (no help) and AI Mode (AI help). The goal of the game is to find the correct pattern to cross the grid from one side to the other by walking on the shapes.
The work involved creating a simple tracking algorithm, then designing the game engine and incorporating assets (videos / images).
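The real tracking used openFrameworks blob tracking on a depth image; as a hedged illustration of the core idea, here is a toy Python sketch that treats a thresholded depth frame as a binary grid and takes the centroid of the active cells to locate a person:

```python
def blob_centroid(mask):
    """Return the (x, y) centre of all 'on' cells in a binary grid,
    a toy version of locating a person in a thresholded depth image."""
    pts = [(x, y) for y, row in enumerate(mask)
                  for x, v in enumerate(row) if v]
    if not pts:
        return None  # nobody in frame
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

# A 4x4 frame with one "person" blob in the middle:
mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
```

Mapping the centroid back onto the projected grid is then enough to know which shape the player is standing on.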
Due to client requirements, pictures of the actual installation cannot be shared.
Technology used: openFrameworks, blob tracking, depth sensor (D435 / Kinect v2).
The first experiential multimedia installation of its kind, using an interactive music composition & technology platform created by emerging arts & experiential practice Aeon Industries.
Modelled on psychologist John Bowlby’s Theory of Attachment, the pulsating, musical organ has a fluctuating self-esteem, algorithmically generated by audience attention. Interact with the organ, ignore it or become intimate, and learn how it responds.
I designed the "organ" for Aeon Industries by coding differents physical states (grid of 16 states). The shape then takes different forms depending on which states it is the closest to. Each state is stored in a json file which can be changed in the future for a quick diversification of the orgsan look. At any given time, the state is an average of 4 different state, making it pretty versatile.
I worked with the music producer to make the piece react to the music. The interactivity of certain aspects of the music depends on the organ's current state.
It was featured in It's Nice That.
Technology used: openFrameworks, LEDs, Raspberry Pi, networking, Ableton Live.
As part of the Genuine X team, I worked on an installation for a conference in America for BCG. The installation consisted of a table with an RFID reader and a few tags embedded in pieces of acrylic. Placing an acrylic piece on the table triggered video content on the screen.
The deadline was short, but the work was already in good shape; I only had to refresh Henry's initial work.
The work involved building a simple electronic circuit with an RFID reader and linking it to Resolume for fast playback.
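The trigger logic on the microcontroller side is simple to sketch. The real version ran on an Arduino and sent signals into Resolume; the Python below only illustrates the tag-to-clip lookup with a basic debounce (the UIDs and clip slots are invented):

```python
class TagTrigger:
    """Fire a clip only when a new tag appears on the reader (debounce)."""

    def __init__(self, mapping):
        self.mapping = mapping  # tag UID -> clip slot
        self.last = None

    def scan(self, uid):
        if uid == self.last:
            return None          # same tag still sitting on the reader
        self.last = uid
        return self.mapping.get(uid)

# Hypothetical tag UIDs mapped to Resolume clip slots:
trigger = TagTrigger({
    "04:A3:5B:22": 1,
    "04:7F:19:C8": 2,
    "04:DE:01:55": 3,
})
```

Without the debounce, a tag resting on the reader would retrigger its clip on every poll instead of playing through once.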
Technology used: Fritzing, Arduino, Resolume.
I worked on a performance / research project using augmented reality.
A key feature of the project was the use of non-physical sound sources in space. The dancer performed contemporary dance whilst the audience explored the space and the soundscape surrounding him. ARKit seemed like a good technology to fulfil the needs of the project.
The user walks through the space and triggers sound sources as they approach the cubes. Certain sources are only triggered once, while others can change over time. Technology used: Swift 3, contemporary dance performance.
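The proximity-trigger behaviour above, including the one-shot sources, can be sketched as follows. The production code was Swift/ARKit; this Python version only illustrates the distance check, with positions and radii as invented examples:

```python
import math

class SoundSource:
    """Trigger a sound when the listener comes within 'radius' metres.
    One-shot sources fire only the first time they are approached."""

    def __init__(self, pos, radius, one_shot=False):
        self.pos = pos
        self.radius = radius
        self.one_shot = one_shot
        self.fired = False

    def update(self, listener_pos):
        dist = math.dist(self.pos, listener_pos)
        if dist <= self.radius and not (self.one_shot and self.fired):
            self.fired = True
            return True          # play (or keep playing) the sound
        return False

# A one-shot source anchored two metres in front of the origin:
cube = SoundSource((0.0, 0.0, 2.0), radius=1.0, one_shot=True)
```

Evolving sources would simply keep returning `True` inside their radius while advancing an internal state each frame, rather than latching after the first hit.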
As part of the UNIT9 team, I worked on an augmented reality spacesuit that lets children step into Tim Peake's suit.
Inspired by Tim Peake’s out-of-this-world journey to space, the augmented reality spacesuit combines face-tracking and body-tracking technology to mirror the user’s actions in real time. It then projection-maps the user into the shell of Tim Peake’s 200 lb spacesuit. Users can then take selfies and create a shareable video for social media.
The augmented reality spacesuit worked by capturing a live video feed of the user’s face and projecting it onto a 3D model of a human head inside the helmet. Using a high-quality webcam we captured the user’s image, extracted it using face-tracking technology and applied it onto a 3D mesh from the Kinect. Technology used: Kinect, Unity3D.
As part of the UNIT9 team, I worked on the creation of an escape room using voice recognition with Amazon's Alexa.
What better way to solve the intricate puzzles of an escape room than to have a highly intelligent know-all assistant helping you along the way? When AKQA Portland asked UNIT9 to create an escape room for the Amazon Echo, to be revealed at Comic-Con New York, we immediately knew that Alexa had to be part of the experience.
The Amazon Echo experience was the world’s first voice-activated escape room. Teams navigated through a series of five rooms, solving intricate puzzles along the way. They had to conquer a laser maze without tripping up and put mind over matter in an interrogation room. But each team had one very superior advantage: Alexa was a member of the team. The world we created was a perfect blend of physical challenges, technology, and of course, artificial intelligence, essentially gamifying the smart home and exposing the wide range of Alexa’s abilities to the difficult-to-reach influential geek audience.
Experiencing the experimental electronic genre of 'concrete music' within an industrial concrete landscape during the Water Tower Festival 2017.
Using hardware sound-processing electronics (Bela) embedded in a sheep skull, we created a custom-conceived instrument. The nuanced vibrations generated when any part of the surface of the skull is touched by the visitor are processed to render a unique live audio response.
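The actual processing ran as real-time audio code on the Bela board; as a hedged illustration of one common first step in turning contact-mic vibration into control data, here is a one-pole envelope follower in Python (the smoothing factor is an arbitrary example value):

```python
def envelope(samples, alpha=0.1):
    """One-pole envelope follower: smooths the absolute signal level,
    so touch intensity becomes a slowly varying control value."""
    level, out = 0.0, []
    for s in samples:
        level += alpha * (abs(s) - level)
        out.append(level)
    return out

# A burst of vibration after silence:
env = envelope([0.0, 1.0, 1.0, 1.0])
```

On an embedded audio platform the same recurrence runs per sample inside the audio callback, and the resulting level can drive synthesis parameters.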
The tactile experience of touching the natural sculptural forms of the skull to unlock its distinct aural language, coupled with the audio and visual impact of the robust industrial surrounds, coalesce towards a unique interactive encounter of multi-layered soundscapes.
Volkswagen – official partner of the French Football Federation – wanted to bring sports fans closer to their idols by offering an innovative new gameplay experience. So UNIT9 teamed up with DDB Paris to develop a new model of entertainment: the first-ever connected foosball table.
Revolutionising the traditional foosball table for the digital age, we connected a social audience with the physical gameplay, so an online community can influence live games and launch takeovers. The rules of foosball are turned on their head in the digital world. Gameplay becomes even more fast-paced and unpredictable.
To bring to life the first connected foosball table we brought together a crack team of engineers, 3D artists, developers and designers. And then we invited stars from the French national football team – the likes of Paul Pogba and Olivier Giroud – to beta test the gameplay. Bringing the online community and football fans closer to their idols.
Foosball Arcade is a standalone piece of product design due to appear at the FIFA 2018 World Cup as well as the UEFA Women’s Euro Tournament in 2017. Offering fans a new type of connected experience.
Technology used: Arduino, servo motors, Max/MSP (Romain Meunier).
DDAANN Studio (ddaann.co.uk) unveiled and demoed the Dreambooth for the first time at the Protein Block Party 2014.
In October 2014, DDAANN was commissioned to build an interactive video booth for the Bacardi Triangle party in Puerto Rico. My role was to help them automate the process using Xcode (GPUImage) and Arduino.