Cybersecurity

6% of Employees Paste Sensitive Data into GenAI Tools Such as ChatGPT


June 15, 2023 · The Hacker News · Browser Security / Data Security

GenAI's revolutionary tools, such as ChatGPT, pose significant risks to an organization's sensitive data. But what do we really know about these risks? New research by Browser Security company LayerX highlights the scope and nature of this risk. The report, entitled "Revealing the True GenAI Data Exposure Risk," provides important insights for data protection stakeholders and empowers them to take proactive action.

The Numbers Behind the Risks of ChatGPT

By analyzing the use of ChatGPT and other generative AI applications among 10,000 employees, the report identifies key areas of concern. One alarming finding: 6% of employees have pasted sensitive data into GenAI tools, with 4% engaging in this risky behavior on a weekly basis. These repeated actions pose a serious data exfiltration threat to organizations.

The report addresses important risk assessment questions, including the actual extent of GenAI use across a company's workforce, the relative share of "paste" actions in that use, the number of employees pasting sensitive data into GenAI and how often they do so, the departments that use GenAI the most, and the types of sensitive data that are most likely to be exposed through pasting.

Increased Data Usage and Exposure

One of the striking findings was the 44% increase in GenAI usage over the last three months alone. Despite this growth, only an average of 19% of an organization’s employees are currently using GenAI tools. However, the risks associated with using GenAI remain significant, even at its current adoption rate.

The research also highlights the prevalence of sensitive data exposure. Of the employees who use GenAI, 15% have pasted data into it, with 4% doing so weekly and 0.7% multiple times a week. This repeated behavior underscores the urgent need for strong data protection measures to prevent data leakage.

Source code, internal business information, and Personally Identifiable Information (PII) are the main types of sensitive data pasted, mostly by users from the R&D, Sales & Marketing, and Finance departments.


How to Utilize the Report in Your Organization

Data protection stakeholders can leverage the insights provided by the report to build an effective GenAI data protection plan. In the GenAI era, it is critical to assess the visibility of GenAI usage patterns within an organization and ensure that existing products can provide the insights and protection needed. Otherwise, stakeholders should consider adopting a solution that offers continuous monitoring, risk analysis, and real-time governance of every event in a browsing session.
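To make the idea of real-time governance of paste events concrete, here is a minimal, illustrative sketch of the kind of check such a solution might run on text before it is submitted to a GenAI tool. The pattern names and function names are hypothetical, and the regexes are deliberately simplistic; a production DLP engine would use far more thorough detection than this.

```python
import re

# Illustrative detection patterns only; real DLP rulesets are much broader.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "api_key": re.compile(r"(?:api|secret)[_-]?key\s*[:=]\s*\S+", re.IGNORECASE),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify_paste(text: str) -> list[str]:
    """Return the categories of sensitive data detected in a pasted snippet."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(text)]

def should_block_paste(text: str) -> bool:
    """Flag the paste event if any sensitive category matches."""
    return bool(classify_paste(text))
```

In practice such logic would run in a browser extension intercepting the paste event on GenAI sites, feeding a risk-analysis and policy layer rather than a simple block/allow decision.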

Download the full research here.
