One thing we know about technology is that it never stays the same. As an IT company that strives to provide our customers with the best possible solutions, we make it a priority to stay on top of the latest technology trends. This year, we collaborated with students from Thomas More to explore and evaluate new techniques and technologies to improve our existing AIVI platform. So, what are the results of this year’s collaboration between Mediaan Conclusion and Thomas More?
AIVI
Our state-of-the-art platform, AIVI, provides a standardized set of reusable components for many intelligent capabilities. These components are accelerators, built on lessons learned over years of implementation experience. We support features for various use cases, for example, a Computer Vision project that requires video-based automatic detection, recognition and tracking of people or objects. But from the beginning, we knew we didn’t want to be limited to a single data source. As humans, we use multiple senses to make decisions, so why not introduce a new source, such as audio?
The project
Our goal for this collaboration project was to explore the use of audio to detect specific events and “trigger” appropriate responses to those events. Let’s take security at a large festival as an example. During such an event, there is a lot of noise and people are having a good time. Suddenly you hear a loud bang or a flare going off, changing the mood of the crowd. With this particular project, we wanted to detect specific sounds, such as sirens, thunderstorms and breaking glass, determine where each sound is coming from, gauge the mood of the crowd and, when needed, initiate the right actions to alert security with the right information.
The students began by conducting market research and outlining the technical requirements needed to achieve the desired goals. Database schematics were drawn, costs were calculated, and AI services were tested and compared until the next phase began: creating a Proof of Concept.
In the POC phase, sound spectrograms were fed to AI models: some to identify specific sounds, others to classify the mood of the crowd. This was then combined into a fully working application in which public events such as festivals could be created, security teams could be added, and an automatic alert system directed agents to hot zones. Everything was visualized in a sleek, user-friendly interface. During this project, the following technologies were used:
- .NET
- Vue.js
- Tailwind CSS
- Microsoft Azure
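To make the spectrogram-based detection a little more concrete, here is a minimal Python sketch of the general idea: an audio clip is converted into a log-scaled mel spectrogram, which is then passed to a classifier that labels the sound event. This is only an illustration of the approach, not the students’ implementation; the actual POC was built on the .NET and Azure stack listed above, and the use of librosa, the dummy classify_sound threshold and the festival_clip.wav file are assumptions made for the example.

```python
# Illustrative sketch only: the real POC was built with .NET and Azure AI services.
# librosa is assumed here purely to demonstrate the spectrogram step.
import librosa
import numpy as np


def audio_to_mel_spectrogram(path: str, sr: int = 22050) -> np.ndarray:
    """Load an audio clip and convert it into a log-scaled mel spectrogram."""
    waveform, sample_rate = librosa.load(path, sr=sr)
    mel = librosa.feature.melspectrogram(y=waveform, sr=sample_rate, n_mels=128)
    return librosa.power_to_db(mel, ref=np.max)


def classify_sound(spectrogram: np.ndarray) -> str:
    """Placeholder for a trained sound-event model (siren, breaking glass, ...).

    In the POC this role was filled by an AI model; here a trivial energy
    threshold stands in so the sketch runs end to end.
    """
    return "siren" if spectrogram.mean() > -40.0 else "background"


if __name__ == "__main__":
    spec = audio_to_mel_spectrogram("festival_clip.wav")  # hypothetical clip
    label = classify_sound(spec)
    if label != "background":
        print(f"Detected '{label}', alerting the nearest security team")
```

In the application itself, a detection like this would feed the alert system described above, so that the nearest security team receives the event type and location instead of a console message.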
A win-win situation
The project with Thomas More gave us even more confidence in expanding the AIVI platform with additional sources of information to enrich its capabilities. The students from Thomas More had a great time and also received additional training in critical thinking, presentation skills and collaboration, not only within their own group but across multiple groups, resulting in an improvement in both their technical and soft skills.