When it comes to reviewing live human communication, there is no widely accepted protocol. We don't have a shared language for the elements of communication in the way we do for, say, sports, where even casual observers understand the difference between a "pass", a "shot", and a "block".
What is a moment?
Before we can understand communication, we therefore need to break it down into clearly defined elements and create a language around them. The first element in this new architecture is an intuitive one - the "moment". We all recognise when a moment occurs in a conversation - it may be a moment of hesitation, a mirroring of body language, a shared smile. These moments are central to what we do at Ovida. Our AI uses the same cues humans rely on to recognise moments, but can do so at a scale and fidelity that humans simply cannot match. When our AI identifies a pattern that a skilled human observer would recognise as a moment, we tag it and share it back with you.
What types of moments are there?
We are continuously expanding the list of moments our AI can reliably identify.
Right now, we have the following:
key moment - a moment defined by a specific set of patterns across both participants that marks it as particularly significant. The exact method is proprietary (we can't share all the secret sauce!), but in essence we look for changes in state across several measures of personal presence.
opening moment - most meetings start with an invitation from the host to set out the agenda for the meeting; we identify this accurately 95% of the time
open question - a question that cannot be answered with a simple yes or no
closed question - a question that calls for a yes or no answer
stacked question - several questions asked in an unbroken sequence
coach / client absolute language - the use of language that leaves no doubt about a situation or event, or that exaggerates or overstates a case. Examples include: always, never, all, none, and must.
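To make these definitions concrete, here is a toy sketch of how surface-level moments like these could be flagged with simple pattern matching. This is purely illustrative and is not Ovida's actual detection method; the word lists and function names are our own invention, based only on the examples above:

```python
import re

# Illustrative word list only, taken from the examples above.
ABSOLUTE_WORDS = {"always", "never", "all", "none", "must"}
# Typical first words of closed (yes/no) questions.
CLOSED_OPENERS = {"is", "are", "do", "does", "did", "can", "could",
                  "will", "would", "have", "has", "was", "were"}

def classify_question(question: str) -> str:
    """Label a question 'closed' if it starts with a yes/no opener,
    otherwise 'open'."""
    first_word = question.strip().lower().split()[0]
    return "closed" if first_word in CLOSED_OPENERS else "open"

def find_moments(utterance: str) -> list[str]:
    """Flag simple moment types in a single utterance."""
    moments = []
    # Split the utterance after each '?' and keep only the questions.
    questions = [s for s in re.split(r"(?<=[?])\s+", utterance.strip())
                 if s.endswith("?")]
    if len(questions) > 1:
        moments.append("stacked question")  # several questions in sequence
    for q in questions:
        moments.append(f"{classify_question(q)} question")
    words = set(re.findall(r"[a-z']+", utterance.lower()))
    if words & ABSOLUTE_WORDS:
        moments.append("absolute language")
    return moments
```

For example, "Do you always do this? What would you change?" would be flagged as a stacked question containing one closed and one open question, plus absolute language ("always"). In practice a keyword sketch like this misses tone, context, and body language, which is why richer cues are needed.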
Where do I find moments?
Moments are displayed below the share of voice timeline (1), in the Moments tab (2) and highlighted in the transcript on the right (3). You can filter which ones you would like to see. The default setting displays only key moments.
There are two possible filters:
moment type:
no type - these are moments you created manually, either during the meeting by pressing the spacebar or while reviewing it afterwards. You need to decide where each one belongs (is it an open, closed or stacked question, etc.?) and assign it to the right category from the dropdown list.
closed questions
stacked questions
coach absolute language
client absolute language
key moments
open questions
ICF markers (if you want this feature enabled, contact support@ovida.io)
more filters - you can refine your search further by filtering by the identity of the moment creator, or by searching for moments with comments (or practices... coming soon).
Moments will appear:
below the share of voice timeline. Clicking a specific moment opens its details, with options to comment on it, practice doing it better (*coming soon), save it as a favourite, or delete it.
in the transcript, when the filter for specific moments is on. When you hover over a highlighted moment, a pop-up opens with brief information about that moment and the last comment made on it.
You can learn more about how to understand, navigate and add your own moments here.