Virtual Personal Assistants Pt. 4

The Security Issues

Privacy Rights and Recorded Conversations:

As discussed in our last blog, one of the primary areas of concern with using a Virtual Personal Assistant is the conversations (or really the queries) that you are having with, for example, Siri. These queries are recorded by Apple and stored on its servers for an indefinite period of time (experts estimate about 18 months, but this is not certain).  In reality, nobody really knows where these servers are located, which adds even more mystery to how these recordings might subsequently be used.

It is important to keep in mind, though, that these recorded conversations are the ones in which we are actively engaged.  But what about those instances in which we are not actively engaged with our Smartphone, and we assume that our Virtual Personal Assistant is not activated?  Is there the distinct possibility that it could be covertly listening in on private conversations that we are having with others?  The answer to this is a resounding “Yes”.  Given the sheer sophistication of the Neural Network and other Machine Learning tools embedded into these assistants, it is quite possible that such private conversations are being picked up by the likes of Cortana or Siri and transmitted back to the servers to be stored as well.  It is the implications of this that are most worrisome.  For example, what if these private conversations are used for marketing purposes (such as creating targeted advertisements), or worse yet, used by a third party for malicious purposes (an example of this will be discussed later)?
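To make the concern concrete, here is a minimal sketch of how a hypothetical always-on, wake-word-driven assistant could end up uploading speech from before the user ever said the trigger phrase. Everything here is invented for illustration, including the frame source, wake word, and upload function; this is emphatically not how Apple or Microsoft actually implement Siri or Cortana.

```python
# Hypothetical sketch: an always-on wake-word loop with a rolling
# "pre-roll" buffer. None of this reflects any vendor's real design.
from collections import deque

WAKE_WORD = "hey assistant"          # hypothetical trigger phrase
PRE_ROLL_FRAMES = 5                  # frames of audio kept from before activation

def simulated_audio_frames():
    """Stand-in for a live microphone stream (one 'frame' per utterance chunk)."""
    yield from [
        "so about the merger",        # private remark, assistant not invoked
        "keep it quiet for now",      # another private remark
        "hey assistant",              # explicit activation
        "what is the weather today",  # the actual query
    ]

def upload_to_server(frames):
    """Stand-in for a network call to the vendor's servers."""
    print("UPLOADED:", frames)

ring_buffer = deque(maxlen=PRE_ROLL_FRAMES)  # always-on rolling buffer

for frame in simulated_audio_frames():
    ring_buffer.append(frame)
    if WAKE_WORD in frame:
        # The upload includes the pre-roll: speech captured while the
        # user assumed the assistant was dormant.
        upload_to_server(list(ring_buffer))
        ring_buffer.clear()
```

Running this prints the two private remarks alongside the actual activation, which is precisely the scenario described above: audio the user never intended to share ends up in the payload sent back to the servers.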

Conversations that turn into trails of evidence:

It should be kept in mind that although Virtual Personal Assistants are growing in sophistication, the degree to which they “learn” the behavioral patterns of their end users is still primitive.  This applies in particular to the actual context of the conversation being held, especially the words that are being spoken.  So, if an end user asks Siri or Cortana a specific question and gets no answer, there is a good chance that the VPA misunderstood what was being asked.  Remember, computers still cannot understand the contextual meaning of words at this point in time; thus, a message that Siri or Cortana failed to process could later be misconstrued as an actual criminal threat, and could even be used as evidence in a court of law.
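To illustrate why context-blind processing is risky, consider this toy keyword matcher. The word list and sample queries are invented for illustration, and no real assistant works exactly this way, but it shows how matching words without understanding their contextual meaning flags perfectly innocent, idiomatic phrases as threats.

```python
# Toy, context-blind "threat" detector: it matches keywords as substrings
# and ignores context entirely. Word list and queries are invented.
THREAT_KEYWORDS = {"shoot", "kill", "bomb"}

def naive_threat_flag(transcript: str) -> bool:
    """Flags a transcript if any keyword appears anywhere, ignoring context."""
    text = transcript.lower()
    return any(kw in text for kw in THREAT_KEYWORDS)

queries = [
    "shoot me an email about the meeting",        # idiom, completely harmless
    "that joke absolutely killed at the party",   # slang for "was funny"
    "this photo bombed my whole camera roll",     # nothing to do with explosives
]

for q in queries:
    label = "FLAGGED" if naive_threat_flag(q) else "ok"
    print(f"{label:8}| {q}")
```

All three harmless sentences come back flagged, which is exactly the kind of misconstrued “evidence” the paragraph above warns about when a recorded query is judged on its words alone.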

Conclusions

Our next blog will continue the examination of other security issues as they relate to Siri and Cortana in more detail.
