Keynote 2.0

Vertical Innovation Hackathon, Bolzano, Nov 16th-17th, 2018

Winner of a Google Home Assistant

The main aim of this beta application is to provide some innovative ways of interacting with the computer.

 

Have you ever thought about doing away with the remote clicker altogether?

The idea is to interact with a Google Home device or the Google Assistant app installed on your phone, combined with simple gestures recognized by computer vision and deep learning algorithms.

 

Sample Interaction

Without any touch, the user can open the Keynote app with the presentation to show and, through a few voice commands, start it in full screen.

 

Here is an example of a simple voice interaction:

 

User: 'Let's start'

 

End Device: 'Starting your presentation'
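
As an illustration, here is a minimal sketch of how such a command could be wired up, assuming a Dialogflow-style webhook written in Python (Flask) that reacts to a hypothetical start_presentation intent and drives Keynote on macOS through AppleScript. The intent name, route, and script are our assumptions, not the exact code used at the hackathon.

# Hypothetical sketch: a webhook that maps a "start_presentation" intent
# (triggered by "Let's start") to an AppleScript command starting Keynote.
import subprocess
from flask import Flask, jsonify, request

app = Flask(__name__)

START_KEYNOTE = 'tell application "Keynote" to start the front document'

@app.route("/webhook", methods=["POST"])
def webhook():
    body = request.get_json(force=True)
    intent = body["queryResult"]["intent"]["displayName"]
    if intent == "start_presentation":
        subprocess.run(["osascript", "-e", START_KEYNOTE], check=False)
        return jsonify({"fulfillmentText": "Starting your presentation"})
    return jsonify({"fulfillmentText": "Sorry, I did not get that."})

if __name__ == "__main__":
    app.run(port=5000)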

 

Gesture recognition

To keep the app simple to test, we have defined two kinds of gestures so far, used to switch slides: forward and backward (a sketch follows the list below).

 

Respectively:

 

Number 4 (by hand) - to go forward

 

Number 5 (by hand) - to go backward
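
As an illustration, here is a minimal sketch of how this gesture loop could look, assuming a pre-trained CNN (the model file, label order, input size, and preprocessing are all assumptions), OpenCV for webcam capture, and simulated arrow-key presses to move through the slides.

# Hypothetical sketch: classify webcam frames as the "4" or "5" hand sign and
# switch slides accordingly. Model file, label order and input size are assumptions.
import cv2
import numpy as np
import pyautogui
from tensorflow.keras.models import load_model

model = load_model("hand_gesture_cnn.h5")    # hypothetical pre-trained CNN
LABELS = ["none", "four", "five"]            # assumed class order

cap = cv2.VideoCapture(0)
last_label = "none"
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Preprocess to the (assumed) 64x64 grayscale input the CNN was trained on.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (64, 64)).astype("float32") / 255.0
    pred = model.predict(small[np.newaxis, ..., np.newaxis], verbose=0)[0]
    label = LABELS[int(np.argmax(pred))]

    # Only act when the recognized gesture changes, so a held gesture
    # does not skip through several slides at once.
    if label != last_label:
        if label == "four":      # number 4: go forward
            pyautogui.press("right")
        elif label == "five":    # number 5: go backward
            pyautogui.press("left")
    last_label = label

    cv2.imshow("camera", frame)
    if cv2.waitKey(100) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()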

Limitations

 

The limitation we faced was that supporting different kinds of gestures, such as swiping or other numbers, would have required a much larger dataset to train the model on.

 

In addition, we would have needed a different hardware setup.

 

End of the presentation

To end the presentation, you can simply talk to your Google Home or Google Assistant device and say something like: 'Close the presentation'.
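
As with the start command, a minimal sketch of what the webhook could run when this request arrives, again assuming macOS and Keynote's AppleScript support; the exact script is an assumption.

# Hypothetical companion to the webhook sketch above: the command executed
# when a "Close the presentation" request arrives (AppleScript is assumed).
import subprocess

STOP_KEYNOTE = 'tell application "Keynote" to stop the front document'
subprocess.run(["osascript", "-e", STOP_KEYNOTE], check=False)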

 

CyberVisionTeam

 

Bruno Marafini - EIT Digital Master School - Cyber Security

 

Mirko Schicchi - EIT Digital Master School - Cyber Security

Filippo Tessaro - ICT University of Trento - Data Science

 

See also the video: https://vimeo.com/301341287
