We all know what Artificial Intelligence (AI) is, but many of us are unaware of just how real the technology is today.
Showcasing what AI can and will achieve was the focus of many of the keynote speakers at Microsoft’s Future Decoded event in London last month. In this post I will share the insights of one of those speakers: Joseph Sirosh, Corporate Vice President, Data Group at Microsoft.
Sirosh wanted to break down the different types of application design and behaviour that can have intelligence embedded into them. By moving data analysis and intelligence from the application into the database – the Intelligent DB pattern – it becomes possible to perform incredibly fast manipulations and calculations on real-time data.
Sirosh introduced us to Microsoft SQL Server 2016: an intelligent database with R functionality built into its core.
Readers looking for what has been added to SQL Server 2016 should start here, whilst readers looking to get their hands dirty with R services for SQL Server 2016 should direct themselves here. Sirosh has also written a very in-depth (and free) eBook on how to develop intelligent applications using SQL Server which can be found here.
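The mechanism behind this R integration is the `sp_execute_external_script` stored procedure, which SQL Server 2016 uses to run an R script against the results of a T-SQL query without the data ever leaving the database. As a rough illustration of the pattern, here is a minimal Python sketch that composes such a call; the table and column names are invented for illustration, and actually executing it would require a SQL Server 2016 instance with R Services enabled plus the `pyodbc` driver.

```python
def build_r_scoring_batch(r_script: str, input_query: str) -> str:
    """Compose a T-SQL batch that runs an R script inside SQL Server
    via sp_execute_external_script (available from SQL Server 2016).

    InputDataSet / OutputDataSet are the default data frame names SQL
    Server binds the query results and R output to.
    """
    return (
        "EXEC sp_execute_external_script\n"
        "    @language = N'R',\n"
        f"    @script = N'{r_script}',\n"
        f"    @input_data_1 = N'{input_query}';"
    )

# Hypothetical example: summarise a sales table with R, inside the database.
batch = build_r_scoring_batch(
    r_script="OutputDataSet <- data.frame(mean_amount = mean(InputDataSet$amount))",
    input_query="SELECT amount FROM dbo.Sales",
)
print(batch)

if __name__ == "__main__":
    # To execute for real (the connection string below is a placeholder):
    # import pyodbc
    # conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=...;DATABASE=...")
    # rows = conn.execute(batch).fetchall()
    pass
```

The point of the pattern is that the R computation runs next to the data, so only the small summary result crosses the wire, not the raw rows.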
Sirosh then moved on to the Intelligent Lake, introducing Microsoft Cognitive APIs and showing how intelligent functionality can be added to existing applications with just a few lines of code.
One powerful demonstration showed an app being fed a YouTube video in real time: the Face API would detect a face, log its gender, age and emotion attributes, and then attempt to identify the face against a local database of reference images.
The most interesting part of the demonstration was that the entire functionality could be added in a few lines of code thanks to the API. Uber currently uses this functionality as a security measure, checking a driver’s identity before a shift starts to ensure the safety of both driver and passenger.
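For a sense of why so little code is needed: the Face API exposed detection through a simple REST endpoint (`/face/v1.0/detect`) authenticated with a subscription key. The Python sketch below assembles such a call; the region and key are placeholders, and the actual network request is left commented out so the snippet stays self-contained.

```python
def build_face_detect_request(region: str, subscription_key: str):
    """Assemble the pieces of a Face API detect call: URL, auth header,
    and the face attributes the demo logged (age, gender, emotion)."""
    url = f"https://{region}.api.cognitive.microsoft.com/face/v1.0/detect"
    headers = {
        "Ocp-Apim-Subscription-Key": subscription_key,
        "Content-Type": "application/json",
    }
    params = {"returnFaceAttributes": "age,gender,emotion"}
    return url, headers, params

url, headers, params = build_face_detect_request("westus", "YOUR_KEY_HERE")

if __name__ == "__main__":
    # To run for real, supply a valid key and an image URL, e.g.:
    # import requests
    # resp = requests.post(url, headers=headers, params=params,
    #                      json={"url": "https://example.com/video-frame.jpg"})
    # print(resp.json())  # a list of detected faces with faceAttributes
    pass
```

Identification against a known set of faces is a second call to the same service; the demo's real-time feel comes from repeating these cheap requests frame by frame.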
The final focus of Sirosh’s talk was Deep Intelligence: applications and machines that have intelligence built into their core. He spoke of self-driving and automated cars that can recognise hazards in their surroundings and react just as skilfully as, or more skilfully than, a human. Modern cars are increasingly aware of their surroundings and can even distinguish between different hazards, such as humans, animals, or inanimate objects like a parked car or kerb, and act accordingly.
Tore Lie from eSmart Systems in Norway took this idea a step further. He came to the stage to showcase their drone project and how Microsoft Cognitive APIs and the Azure web infrastructure have been combined with drones to make inspections of power line infrastructure in rural Norway much safer.
The drones fly over power lines, capturing hundreds of images of pylons and other equipment. Drawing on both historical and simulated data, the system processes those images incredibly fast to determine whether a piece of equipment is defective or whether the image needs human review. The model constantly learns as new images are added, so, thanks to this intelligent design, the software’s ability to recognise and analyse equipment improves on an ongoing basis.
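The triage step described here (act on a confident verdict, otherwise route the image to a person) is essentially a confidence threshold on the model’s score. Here is a minimal Python sketch of that routing logic; the threshold values and labels are invented for illustration.

```python
def triage(defect_score: float, hi: float = 0.9, lo: float = 0.1) -> str:
    """Route an inspection image based on the model's defect probability.

    Confident predictions are acted on automatically; everything in
    between is queued for human review. Once a reviewer labels those
    uncertain images, they can be fed back as training data, which is
    how the model keeps improving over time.
    """
    if defect_score >= hi:
        return "defective"
    if defect_score <= lo:
        return "healthy"
    return "human-review"

# Hypothetical model scores for three drone images:
print([triage(s) for s in (0.97, 0.02, 0.55)])
```

Tuning the two thresholds trades off automation against reviewer workload: widening the middle band sends more images to humans but catches more borderline defects.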
It is hard to do these projects justice in so few words. While these technologies and concepts may seem focused on augmenting business applications, there was far more to the Future Decoded keynotes, and I encourage you to delve into the links below to see how varied the talks were. Artificial intelligence, machine learning and augmented reality are going to be a major focus for Microsoft, and it is exciting to think about what the developer community will build with it all.