Using Cognitive Computing to Humanize Computer Interaction
Every so often there is a fundamental shift in how we interact with “machines”. The era of cognitive computing is about to trigger a new shift.
It is easy to forget how far input devices have evolved since the first automated computing devices were introduced just over a century ago. Today we are all used to touching, swiping and pinching screens with our fingers in order to interact with machines.
This move to touch was a radical shift from typing on a computer keyboard or, more recently, using a mouse, which had been the norm for decades.
It is pretty amazing to think that today we use touch to interact with machines daily, and we rarely give it a second thought. Touch has undoubtedly augmented existing interaction models and opened new possibilities. There are now many tasks for which touch is simply the best way to interact with a machine.
Despite all this innovation, we have still not managed to truly humanize the experience of working with computers and machines.
Innovation in agribusiness is not a new phenomenon. It has been going on for many years, often without many people outside the industry being aware of it. That continuous innovation has made today's agribusinesses far more productive than previous generations.
Some studies put the average agribusiness's ability to produce at 262% better than it was as recently as 1950. This has been achieved with 2% fewer inputs than before, where inputs are things like labor, seeds, feed and fertilizer.
Clearly, agribusiness has come a long way from where it started in the 18th century, when it was driven by oxen and horses for power, crude wooden plows, and a great deal of hand labor to sow seeds and cultivate crops.
If we fast-forward through the many innovations triggered by the industrial revolution (iron ploughs, the cotton gin, tractors, etc.), we eventually arrive at the point where information technology started to be used extensively in the late 1980s and early 1990s, quickly followed by the use of satellites to plan work in the late 1990s.
From the outside, agribusiness looks like a slow-moving industry, when in fact it is clear to those closer to it that it is moving at an astonishing pace!
This post is the eleventh, and final, post documenting the steps I went through on my journey to build an autonomous, voice-controlled, face-recognizing drone. There are 10 other posts building up to this one, which you can find at the end of this post.
Focus of this post
In this post I will share a video of the complete end-to-end demo and details of the architecture that sits behind it. I will also share what I bought and used to bring this all together, and relist all the different software, services and Node packages in a single place.
Pulling It All Together
A lot of what we have been doing with this project is humanizing the way we communicate with machines/computers/things. That means talking and observing to drive intelligent interaction rather than using a mouse, keyboard or touch screen.
Our autonomous, voice-controlled, face-recognizing drone is a smart drone which showcases, albeit crudely, how interaction with intelligence-infused services is going to evolve. It highlights how important cognitive services will be to the success of organizations in the future.
So, with that said, take a look at the entire end-to-end demo in the video below.
This post is the tenth post documenting the steps I went through on my journey to build an autonomous, voice-controlled, face-recognizing drone. There are 9 other posts building up to this one, which you can find at the end of this post.
Focus of this post
Up until now we have mostly been working on controlling the drone, using the Microsoft Cognitive Services Face API to identify people and, lastly, using the Microsoft Cognitive Services APIs to convert text to speech and speech to text.
Ultimately we will have built an intelligent end-to-end IoT solution featuring analytics and visualization. The instructions here can also be used to understand how to get data in from other devices!
Visualizing your cloud strategy and developing cloud competency
Speaking to organizations that are embracing the cloud with a clear cloud strategy surfaces two recurring to-dos they recommend, which I will share and dig into in this post.
You should have a simple way to categorize potential cloud projects. This often requires considering numerous dimensions, shown visually so everyone can see the plan. I call this visualizing your cloud strategy.
You should have a plan in place to increase the cloud knowledge of the organization so it can best leverage the new environment and explore ALL its options. To me this is all about developing cloud competency at your organization or as an individual.
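To make the first to-do concrete, here is a minimal sketch of scoring candidate cloud projects along two dimensions and bucketing them for a visual plan (for example, a 2x2 grid). The dimension names, example projects and thresholds are my own illustrative assumptions, not a prescribed framework:

```javascript
// Illustrative only: score candidate cloud projects on two assumed
// dimensions (business impact and migration effort, each 1-5) and
// bucket them so the plan can be visualized on a simple grid.
const projects = [
  { name: 'Customer portal', impact: 5, effort: 2 },
  { name: 'Legacy ERP', impact: 4, effort: 5 },
  { name: 'Internal wiki', impact: 1, effort: 1 },
];

// Place a project in one of four buckets based on its scores.
function categorize(project) {
  if (project.impact >= 3 && project.effort <= 3) return 'quick win';
  if (project.impact >= 3) return 'strategic bet';
  if (project.effort <= 3) return 'low-hanging fruit';
  return 'defer';
}

for (const p of projects) {
  console.log(`${p.name}: ${categorize(p)}`);
}
```

The point is not the particular dimensions but that every project lands in a bucket everyone can see and discuss.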