Fireside Cyberchat 2025: Cybersecurity and AI
The second annual Fireside Cyberchat took place the afternoon of Thursday, Oct. 16. Helen Patton, cybersecurity executive at Cisco Systems, visited Miami for the second year in a row to share her expertise on the intersection of higher education and cybersecurity, and her insights were once again hugely appreciated.
Around 50 Miami community members attended the webinar; most were IT Services employees, but some students and other university staff joined as well.
Patton’s talk focused on the risks of generative artificial intelligence and what industry cybersecurity professionals are seeing in terms of its impacts on their work. Here is a brief recap.
The tenets of AI adoption: Compute, Power, and Trust
Patton explained that essentially three factors are shaping the adoption of artificial intelligence, especially in the higher education space.
Compute
AI requires more compute power in general. In fact, AI generates more consistently high loads than anything else we have worked with until now, which changes what our networks need to be able to accommodate. Not every network can handle that kind of load, so more money is being spent on infrastructure that can keep up with this increased demand for compute.
Power
This tenet refers to the grid infrastructure that AI systems rely on: how much electricity they require. As it turns out, those heavier compute loads also draw much more electricity. Data centers already account for a large share of global power usage, yet we still don't have enough generating capacity to support the pace at which AI tools are being adopted. Interestingly, a text-based ChatGPT query takes roughly ten times as much power to process as a Google search.
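To put that multiplier in perspective, here is a rough back-of-the-envelope sketch in Python. The tenfold multiplier is the figure from the talk; the 0.3 watt-hour baseline per conventional search and the daily query volume are illustrative assumptions, not numbers Patton cited.

```python
# Back-of-the-envelope comparison of per-query energy use.
# The ~10x multiplier is the figure cited in the talk; the baseline per
# conventional search and the daily query volume are illustrative assumptions.

SEARCH_WH = 0.3              # assumed energy per conventional web search (watt-hours)
AI_MULTIPLIER = 10           # multiplier cited in the talk
QUERIES_PER_DAY = 1_000_000  # hypothetical daily query volume

ai_query_wh = SEARCH_WH * AI_MULTIPLIER
daily_kwh_search = SEARCH_WH * QUERIES_PER_DAY / 1000
daily_kwh_ai = ai_query_wh * QUERIES_PER_DAY / 1000

print(f"Per query: search ~{SEARCH_WH} Wh, AI query ~{ai_query_wh} Wh")
print(f"At {QUERIES_PER_DAY:,} queries/day: "
      f"~{daily_kwh_search:,.0f} kWh vs ~{daily_kwh_ai:,.0f} kWh")
```

Whatever the exact baseline, a tenfold multiplier compounds quickly at the query volumes a university or large service handles every day.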
Trust
Security is all about trust, and the question becomes: what can we do without regulation, at either the organizational or the government level?
Patton's point is that AI has specific needs not just in terms of technological power, but also in terms of trust, both within an organization's culture and within the cybersecurity industry itself.
Cyber Risks of AI Adoption
As with anything else, there are levels of risk involved in the rapid adoption of generative AI within any organization, and any business that wants to have a relationship with these tools needs to assess that risk before jumping in feet-first. A governance group needs to determine whether the organization falls on the lower- or higher-risk end of the spectrum.
For instance, when we give AI a desired outcome and tell it to help us get there more efficiently, the technology may very well bypass the organizational processes humans have put in place for various reasons. That creates efficiencies, yes, but it also raises questions: Is that process there for a security reason? By bypassing it, are we creating a security flaw? (Patton referenced a blog post by Ethan Mollick on this very topic.)
From a security perspective, this may add some risk because we have given AI access to all of our systems in order to circumvent inefficient processes.
A few recommendations…
So, what can be done to mitigate all this risk? After all, the purpose of adopting effective cybersecurity practices is to decrease the likelihood of falling victim to scams and other attacks, and the point of the cybersecurity industry as a whole is to protect unsuspecting computer users and organizations from nefarious activity.
Patton’s recommendations are broad, but manageable:
- Maintain basic cyber hygiene: take backups, use a password manager, and enable multi-factor authentication on all internet accounts
- Give AI smaller projects—not the whole shebang
- Take inventory of the agents in use in your organization: who is the owner, who is using it, and for what? (A minimal sketch of such an inventory follows this list.)
- Clean up your data
- Do incident response exercises (like tabletop activities where you can practice what could happen in a cyber event)
- Keep learning! How is AI impacting your job and your industry?
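To illustrate the inventory recommendation above, here is a minimal sketch of what an agent inventory could look like, assuming a simple Python record per agent; the field names and the example entry are hypothetical, not a format Patton prescribed.

```python
# Minimal sketch of an AI-agent inventory. The fields mirror the questions
# from the talk: who owns the agent, who is using it, and for what?
# All names and the example entry below are hypothetical.

from dataclasses import dataclass, field

@dataclass
class AgentRecord:
    name: str                                                   # the agent or tool
    owner: str                                                  # accountable person or team
    users: list[str] = field(default_factory=list)              # who is using it
    purpose: str = ""                                           # what it is used for
    systems_accessed: list[str] = field(default_factory=list)   # what it can touch

inventory = [
    AgentRecord(
        name="Ticket Triage Bot",               # hypothetical example
        owner="IT Services",
        users=["help desk staff"],
        purpose="Drafts first responses to support tickets",
        systems_accessed=["ticketing system"],
    ),
]

# Quick audit: flag any agent with no named owner or undocumented access.
for agent in inventory:
    if not agent.owner or not agent.systems_accessed:
        print(f"Review needed: {agent.name}")
```

A spreadsheet with the same columns would work just as well; the point is knowing what is running, who owns it, and what it can touch.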
Watch the full video here:
Patton’s way of speaking is incredibly accessible, and we are grateful she decided to spend more of her time with us this NCSAM. The video is really worth a watch!