AI and Social Engineering


Social engineering is widely considered the easiest way for malicious actors to gain access to an organization. A social engineering attack intentionally misleads members of an organization into disclosing information under false pretenses.

The breakneck pace of AI development brings new ways for organizations to fall victim to social engineering. ChatGPT, an AI chatbot capable of human-like text communication, has opened new doors for social engineering attacks. AI-generated messages sent over email and social media platforms can be leveraged to trick employees into sending personal information, clicking on links to malware, or exposing confidential data.

Universities must educate their students and employees on the social engineering schemes that could put both individuals and the entire organization at risk. Without proper education, members of an organization may fall victim to a social engineering attack, compromising data privacy and operational continuity.



About Author

I am a computer science major at Fordham University, working as an IT risk analyst assistant in the Fordham University IT department.
