SDVoE LIVE! on demand

Episode 10 – Network Safety for AV Pros

Everyone recognizes that network security is important, but why? Ultimately, security is the foundation of privacy. But privacy is about more than just the network – it’s about behavior. Our guest Josh Srago realized early the rapidly increasing importance of these issues to pro AV, so he put a successful AV design career on hold to attend law school and become an industry expert in navigating these crucial questions for our industry. In this episode, we’ll dive into the commingled topics of security and privacy, and explore how AV pros can be ready to best serve their clients.

Catch new episodes of SDVoE LIVE!
every second Tuesday at 1 p.m. ET

The show is broadcast live in SDVoE Academy, on Twitter @sdvoe and on the SDVoE Alliance YouTube channel.

Watch Q&A in our aftershows and access
extra resources in SDVoE Academy

If you have previously visited SDVoE Academy, simply log in.
If not, you can easily create a free SDVoE Academy account.

Opt-in for reminders
for all shows

Sign up for an SDVoE Academy account
if you don’t already have one.

Add shows to your calendar

Josh Srago

Episode guest


Josh is a recent graduate of Santa Clara University School of Law and an award-winning audiovisual professional. His career has spanned all aspects of the industry – end user, manufacturer, contractor, and consultant – giving him the broad experience to understand the interests of the varied members of the community. As AV became more dependent on network technology, Srago saw that the best way he could contribute would be to focus his efforts on translating the evolving tech policy landscape, including net neutrality, privacy, and cybersecurity, for the audiovisual community and to share its needs with policy makers. He’s accomplished this through his many writings and speaking engagements around the country. You can connect with him on his LinkedIn page or see some of his more recent thoughts on Twitter (@JSrago).

Episode transcript

Justin Kennington (00:09):

Hello everybody. Hello once again, welcome to SDVoE LIVE!. I’m your host, Justin Kennington and this is TV for pro AV. Once again, we have a very special, and I think interesting episode for you. Our guest is my old friend, Josh Srago, who has taken what I think is one of the more interesting career paths in pro AV. I’ll let him tell you a little more about that later, but today we’re going to focus our discussion on network security, but also privacy. Network security quickly comes to mind when you think of AV over IP and some of the other networked technologies that we use in AV. But what’s the point of security? It really is privacy, and a lot of that is driven by behaviors and actions rather than technology. So let me not steal our own thunder. We’ll talk about that a lot with Josh later.

Thanks for joining us. Those of you watching this in SDVoE Academy will find our chat function so you can chat with everyone else who’s watching the show live. We are also broadcasting live on YouTube and Twitter.

After today’s show, we will have our aftershow. My co-host Matt and I, and our guest Josh, will interact directly with you so you can bring your questions, your comments, and tell us what you think about the show.

Matt Dodd (02:37):

Hi guys. Good to see you again. Matt here, your favorite English guy. Great show today. We need to think hard about prioritizing essential, basic security elements before we even start thinking about features of a system. Josh is going to help us with that. But before we do that, let’s take a look at this snippet from a new course in SDVoE Academy “Basic Security and Privacy Requirements of an AV over IP System”. You’ll be able to go and watch the course yourselves after the show.

Hacker (03:24):

Networks are everywhere and as a hacker, I’m extremely busy trying to find points which are weak in security and privacy. They are out there and when I find them, I make sure to steal every piece of personal, protected data I can find.

Matt Dodd (03:46):

It could be really easy to choose an AV-over-IP system based on the cool feature sets it offers. However, what’s more important is understanding how well protected the content being sent across the network is. Let’s use the car industry as an example. Over the decades, the consumer has been instrumental in making sure car manufacturers meet a minimum safety requirement, so much so that whenever we buy a new car today, we automatically expect that car to meet a minimum standard of safety without needing to research it first. Consumer demands are key to pushing manufacturers to meet standards.

By making sure we’re all aware of those minimum standards from an AV-over-IP perspective, we can really push the manufacturers of those systems to meet our basic security and privacy standards before any other features are considered. It’s completely non-negotiable that your AV-over-IP system has the following three security measures in place: encryption, authentication and vulnerability scanning. Without any of these, your AV and control data is wide open to theft. This course will help you to demonstrate the security parameters of your AV-over-IP installation to IT professionals by explaining these basic security and privacy features to ensure your data doesn’t end up going to places or people it shouldn’t.
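To make the authentication measure concrete, here is a minimal sketch in Python using only the standard library. The function names, shared key handling, and control command string are hypothetical illustrations, not part of any AV-over-IP product; a real deployment would pair this with transport encryption (such as AES-GCM or TLS) and routine vulnerability scanning.

```python
import hashlib
import hmac
import secrets


def sign_command(key: bytes, command: bytes) -> bytes:
    # Append an HMAC-SHA256 tag so the receiver can verify that the
    # command came from a holder of the shared key.
    tag = hmac.new(key, command, hashlib.sha256).digest()
    return command + tag


def verify_command(key: bytes, message: bytes) -> bytes:
    # Split off the 32-byte tag and check it in constant time;
    # reject the command if it was forged or tampered with in transit.
    command, tag = message[:-32], message[-32:]
    expected = hmac.new(key, command, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    return command


# Shared secret, provisioned out of band (hypothetical example).
key = secrets.token_bytes(32)
msg = sign_command(key, b"SET input=HDMI1 output=display4")
assert verify_command(key, msg) == b"SET input=HDMI1 output=display4"
```

Note that this sketch covers only authentication: an eavesdropper can still read the command, which is why encryption remains a separate, equally non-negotiable requirement.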

Let’s begin with the question, “Why?”. Why do we need to implement basic security measures on networks? Okay, let’s rephrase that question. Why do we lock our houses before we go out to work? The answer, of course, is the same for both questions. It stops thieves from stealing your personal things. Yes, that’s exactly right. Let’s lock the door here. Security touches everyone. Whether we’re protecting our personal possessions from the hands of thieves or protecting data streams from going to places they shouldn’t, basic security and privacy mechanisms need to be put into place.

Hello? How are you?

Justin Kennington (06:41):

Hello, Matt. Good to see you.

Matt Dodd (06:43):

It’s good to see you too. Great show planned, looking forward to this, and we’re making a point that basic security and privacy measures have to be thought about by everybody before feature sets are thought about. We used the car analogy. Do you like that?

Justin Kennington (07:28):

I think the point made in the video is exactly it. There are people who think first about a car’s safety and that’s what they’re going to prioritize and they’re going to shop based on that. But there are many people who think about how fast does it accelerate? How much technology does it have in it? They can do that because we have demanded of the car industry the baseline that the most unsafe vehicle in production is still really very safe. I think the place we have to get to in AV is to demand that, so that we can know that all AV systems are safe and secure. Then we can worry about who has the most interesting feature set.

Matt Dodd (08:11):

Absolutely. Well, on that note, let’s head into the AV news. Item number one, what was your take on this?

Justin Kennington (08:34):

The piece described the question of how we manage the information that can be, and is, captured by many digital signage systems today. Some of these kiosks have cameras built in, they have temperature checks built in, so they are capable of capturing private data – facial recognition, matching your face to your temperature. The question then becomes, “How do we treat that data sensitively?”.

What I took away from the article is everyone who was interviewed said the owner of this technology should work with their system integrator to come up with a clear and transparent privacy policy that protects individual users. I thought to myself, “Wow, on the one hand, I agree with that, but on the other hand, how do you even communicate a privacy policy when we’re talking about public installations?”.  For example, if this is a kiosk at the shopping mall, how am I, as a user, going to even receive this notice of privacy rights?  There are lots of big, big challenges here. This is something I wanted to talk about with Josh too.

Matt Dodd (09:57):

That word “transparency” ran through both of the news pieces that we’re talking about, but the public, by and large, is definitely not aware of how the tech is being used. It’s the privacy policy that’s actually very important to everybody, because it needs to reflect exactly what’s happening to the information that the tech is gathering. I think one other point I took away from this was that when people realize that there are cameras and microphones in the room, they start to create this drama about how they don’t want to be in there because all their information’s being taken away and used for nefarious purposes. In fact, it’s there for a very good reason, which is explained very clearly in the privacy policy.

In article number two, Scott Tiner is very up to speed with IT and AV. He works in an educational facility so he’s seeing this firsthand.

Justin Kennington (12:34):

The main takeaway, the thing that I snuck in here, is that that piece was six years old. The real reason I put it in here, for those of you who go to the resource link in SDVoE Academy to read the details, is to think about whether we have actually made any progress since Scott laid out these challenges six years ago.

The challenges are good ones. He says we can add a privacy button to every control panel, but then you have an education problem. How do you make sure that the user even knows the button exists and remembers to use it when they’re entering a sensitive area of their conversation?

And the flip side, you can make the camera opt in so that you have to push a button that says, “I’m going to turn on the potentially snooping technology in this classroom”. Then how do you make sure that button gets opted out at the end of class? Pushing that burden back onto the users is a valid thing to explore, but brings its own kinds of problems too.

Matt Dodd (13:49):

It goes back to that whole point about transparency. As tech managers, AV integrators and AV designers, we need to help and support our audiences to be fully aware of the sort of tech that’s in place, what it’s being used for. And more importantly, how the information is being captured and what it’s being used for as well. Some great pieces there.

It’s time to introduce our guest. Who is this guy?

Justin Kennington (14:29):

This is Josh Srago. I repeatedly insist on calling him the JD for AV. I haven’t gotten a clear answer from him, whether he likes that title or not, but we’ll hear a little bit about what that means and how he got here and why he wants to talk about security and privacy.

Josh Srago (14:55):

Hi, Matt. Hey, Justin.

Justin Kennington (14:59):

It’s good to see you.

Josh Srago (15:00):

Thank you. Thank you very much. It’s good to see you too.

Justin Kennington (15:04):

So I’ve been teasing everyone out there that you’ve had this interesting career trajectory. Why don’t you tell us what that is behind your head there and how you got to this point? Where are you at now and why?

Josh Srago (15:18):

I’ve told my story a lot in terms of my career in the AV industry. I’ve worked on all sides of the table. I’ve been a facilities manager. I’ve been a manufacturer’s rep. I’ve been a contractor, and I’ve been a consultant. The big change for me was when I decided that we needed a voice when it came to policy decisions and the way the world was changing. I saw how policy decisions were going to affect things starting in 2014 with an appellate court decision in DC – Verizon v. FCC. Verizon had sued the FCC over the net neutrality rules the agency was trying to implement at the time. Shortly after that, then-FCC-chairman Wheeler proposed a fast-lane idea where larger companies could purchase priority and isolate themselves, saying, “Hey, we have priority on this network, and we’re going to take this much bandwidth for ourselves.”

AV at that time, if you recall, was moving into 4K. I started thinking about that. I’m like, “Okay, so at 5 p.m., all of my clients’ 4K video is going to be squeezed into just a small partition of the network, because the rest of the world is going home to watch YouTube, Netflix, and Hulu.” Now you look at how many streaming services we have and how dangerous that could have been if fast lanes had been implemented. That was what led me to it, and that in turn led to looking into privacy implications, hacking violations, and what the laws were on cybersecurity. That’s how I ended up at Santa Clara. I just finished my JD in December, so I actually finished a semester early. Now I’m just looking for someone to hire me and pay me to do this kind of work.

Justin Kennington (17:19):

Well, I was going to say, I’d be happy to hire you. SDVoE Alliance relies on a lot of pro bono work, so maybe I’m not helping you yet, but look for it as we expand our audience here.

I kind of teased it in the opening, and it’s a phraseology that you taught me and put into my head: that security is really privacy and privacy is really behavior. Can you talk about that sort of privacy stack, as I’ve named it now?

Josh Srago (17:51):

Certainly. One of the examples I used to give when I was talking about this in-person back in those olden days is room-scheduling systems. Think about a room scheduling system where you have a panel on the door connected to a centralized system and that centralized system, the control system, connects to your calendars in some way. Let’s say your system gets hacked or an individual loses their phone or somehow somebody gets access to what these calendars are.

What happens now is they know where your appointments are and who’s in the room. Translating this back to a room-scheduling system, the room-scheduling system is provided by a manufacturer. If that manufacturer has, in their end-user license agreement, allowed themselves access to monitor who’s in the room at a given time, they can start to see what business arrangements are going to happen.

So let’s say the room scheduling system at Samsung got hacked while they were in negotiations with Harman, back when they purchased Harman a few years back. If you were the provider of that room-scheduling system, you were giving yourself access to know who was in that room because you were monitoring the calendar. You could see the behavior of the company and you could start to make strategic moves based on what was going to happen there.

Alternatively, one of the other options is that if you’re an individual, you’re just sharing your information, and somebody is targeting you, they can see what office you’re going to be in on a given day. So let’s say you only go into the office three days a week. They can start to monitor behavioral patterns. When are they logging in from this specific IP address? When are they in their Zoom rooms from this location versus this location? So they know where you are and they start to be able to follow you around.

The goal is making sure that you can prevent that access and making sure that you can anonymize that data. You and Matt were both hitting on this, you were dancing around the edge of it, with the discussion of the last article, which is data management. It becomes looking at specifically what access to data you actually need. Do you need everything or just a small piece of data? How long are you keeping it? How long are you storing that information and who has access to it? When are you deleting it? What access and controls are you giving to users to determine if they get to say they don’t want you to have that data anymore? Do they have to opt in? Do they have to opt out? What are your default parameters? What’s the structure you put in place to protect that information?

Justin Kennington (20:47):

Whose job should it be to answer those questions? Is this down a hundred percent to the end user, that corporation, or that university, in their policies? Is it down to a system integrator or designer working with them to establish this? Or, and I know in some cases it already is, is it down to government and legislation that needs to dictate this? What’s your view?

Josh Srago (21:16):

Everybody has to play a part. It isn’t the end user’s job to understand what this technology does, how it works and how it operates. That’s the contractor’s job.

The contractor has to be able to explain to them, as the expert they’ve hired, that this is how the system works. This is the information it’s tracking. We can’t control this. Here’s what the end user license agreement says the app manufacturer is going to have access to. Are you comfortable with this? And make sure that they understand that and explain that to the end user. If the end user isn’t comfortable with it, then they need to make different decisions, and you need to help them make those decisions as the contractor or the consultant. Policy, of course, comes into play. The California Consumer Privacy Act (soon to be the CPRA in a couple of years, because we just updated California’s privacy laws in the last election) has specific requirements for users being able to contact companies and say, “Do you have information about me? What information about me do you have? I want you to get rid of certain pieces of information.”

I’ve been following a story right now, specifically about Clubhouse. Clubhouse blew up during the pandemic and was really popular. However, I already know Clubhouse has information on me, but I don’t have a Clubhouse account. The reason I know that they have information about me is because they are able to, when you sign up for an account, get access to your entire contacts list on your phone. Let’s say Justin, if you were to do it and they got access to your entire contacts list, that means that because you have my phone number they would now have my phone number. I didn’t consent to that. You consented to that for me.

So there’s that personal responsibility as well: the end user asking, “What am I comfortable with?” The company providing the technology needs to be able to explain the arrangement between the two, and the government plays a role in making sure that individuals have an enforced right of protection and privacy, and that people are advocating on their behalf.

Justin Kennington (23:25):

It gets vastly complicated very quickly, doesn’t it? Another thing I want to touch on, and I’m going to push it to the aftershow because we’re out of time right now, is how do we communicate these things effectively to that end user? To Josh who’s not a member of Clubhouse, to the person who’s just walking by a kiosk in the mall and possibly having their photo taken, how in the world do we communicate this to them in an effective way?

I’ll see you at the aftershow and talk about that.

Matt Dodd (26:53):

Talk to us about the next show. What’s it about?

Justin Kennington (26:56):

This one’s called Practice What You Preach. Our dear friend Tim Albright of AVNation fame is going to join us. We’re going to pull back the curtain a little bit and show you how SDVoE LIVE! is produced, talk to Tim about how his shows are produced and give you a little education on how to make the most of your presentation skills. We’ll talk about how to make sure that you’re addressing your clients, customers and colleagues with the biggest impact possible, especially in a world where travel is limited, where in-person contact is limited. We want to help you be impactful in your communications. That’s what the show is about.

Matt Dodd (27:37):

Something very close to my heart. We’ll show you what happens behind the scenes and some really simple tricks and techniques that you can use to really elevate your pitch and make it work.


Stay informed