As artificial intelligence (AI) tools become ever more ingrained in our daily lives, it’s important to keep our fingers on the pulse of data security best practices. People are jumping at the chance to play with AI and see what it can do, but in their rush, many are ignoring some major red flags.
We recently spoke with the Veracross Information Security team to learn how schools can prioritize data security while embracing today’s wave of AI innovation. Keep reading for their advice on navigating these tools at school and keeping your community’s data safe.
The Importance of Data Security
Confidentiality. Integrity. Availability. Known as the CIA Triad, these three principles sit at the heart of information security and are used to establish data policies and assess software vendors. Data security is top of mind for IT professionals, not only because federal laws such as FERPA require it, but because it’s the right thing to do.
“Security is about building a community and educating and supporting our community as we look after our data and the data of others,” says Mike Martell, Chief Information Security Officer at Veracross. “We should care about our people and our communities; we should care about keeping them safe and informed and about protecting their rights and their privacy.”
Considerations When Using AI
From a security perspective, there are many unknowns about AI tools. Where is your data going? How is it being used, shared, or sold? Are you unknowingly part of a training model?
“AI tools are a bit more mysterious because we don’t know what happens to our data,” says Martell. “As data stewards, we need to help our community find safe tools to use… and make sure they’re not entering personal or health information into these tools.”
One tip for identifying safe AI tools is to start with vendors you already know and trust. Survey your community to find out how they wish to use AI, then see if your current vendors offer a product in that space. For example, if you use Zoom, try the Zoom AI Companion; if you’re a Microsoft school, use Bing Chat. This can ease some fear of the unknown, because your school is already familiar with the family of products and how they handle data.
Some schools are responding to the unknown by blocking AI tools outright. While this is a common practice, not everyone agrees it is the best solution. “Blocking the tools stifles creativity and innovation,” says Martell. “You want to teach people how to use the tool responsibly so that if they don’t have guardrails in place, they’ll still be safe.”
This is especially pertinent for adolescents as AI will play a major role in how they learn, work, and interact outside of your school. If they don’t learn how to safely and ethically use these tools within the walls of your school, they’ll learn it on their own — for better or for worse.
Safeguarding Your School
If you choose to welcome AI tools at your school, it’s important to educate and train your community about how to safely use them.
“Security awareness training is one of the best tools you can have in place to make sure information is being kept secure,” says Emily St. Clair, Veracross Information Security Analyst. “IT departments don’t have the ability to be everywhere all at once. We need to be able to equip everyone with the appropriate knowledge to keep information safe in their own day-to-day activities.”
There are many different ways to spread the word about safe AI usage at your school:
- Send a monthly newsletter with tips, current events, and recent trends
- Host a trivia game about AI safety and best practices
- Start a dedicated Teams or Slack channel to encourage questions and cross-department communication
- Film a short skit about common AI pitfalls and how to avoid them
Whatever you choose, St. Clair recommends mixing it up and making your training content engaging so that people pay attention. Pick a fun theme based on an upcoming holiday or pop culture reference to encourage participation.
It’s also important to speak directly with your students and understand how they feel about these tools. What do they already know? What do they want to know? How do they wish to use AI? Take this feedback and use it to inform future training sessions, AI policies, or decisions about safe tools.
Keep Your Finger on the Pulse
As AI tools continue to evolve, so must our data security practices. Take WormGPT, for example: a tool similar to ChatGPT, but built for cybercriminals. In the same way that ChatGPT helps teachers write lesson plans or difficult parent communications, WormGPT helps bad actors write malicious code and phishing emails.
“In the past, bad grammar and spelling mistakes were a red flag for phishing emails,” says St. Clair. “This new tech could change that as autogenerated emails become more grammatically correct… This changes how we’re detecting (and protecting against) these threats.”
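To see why that old red flag disappears, consider a toy heuristic, invented purely for this post and nothing like a real mail filter: flag an email when too many of its words fall outside a small dictionary. A clumsy, typo-ridden lure gets caught, while an AI-polished version of the exact same lure scores clean. (Every name here, including KNOWN_WORDS, misspelling_score, and the sample emails, is hypothetical.)

```python
# Toy illustration: a naive "bad spelling" phishing heuristic.
# Real mail filters are far more sophisticated; this sketch only shows
# why fluent, AI-generated text defeats spelling-based red flags.

# Hypothetical mini-dictionary of "expected" words.
KNOWN_WORDS = {
    "dear", "parent", "please", "verify", "your", "account",
    "to", "avoid", "suspension", "click", "the", "link", "below",
}

def misspelling_score(email: str) -> float:
    """Return the fraction of words not found in the dictionary."""
    words = [w.strip(".,!?").lower() for w in email.split()]
    unknown = [w for w in words if w and w not in KNOWN_WORDS]
    return len(unknown) / max(len(words), 1)

# A classic clumsy phishing attempt trips the heuristic...
clumsy = "Dear parant, plese verfy yuor acount to avoid suspenshun."
# ...but a fluent, AI-polished version of the same lure sails through.
fluent = "Dear parent, please verify your account to avoid suspension."

print(misspelling_score(clumsy))  # high score: flagged as suspicious
print(misspelling_score(fluent))  # near zero: the old red flag is gone
```

The takeaway for your community: polished grammar is no longer evidence that an email is legitimate, so training should emphasize other signals, like unexpected requests, mismatched sender addresses, and urgency.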
November marks ChatGPT’s first anniversary, and this is only the beginning. To learn more from Martell and St. Clair about protecting your school’s data in the age of artificial intelligence, watch our on-demand webinar.