Whistleblower Edward Snowden used a simple, low-cost web crawler to collect the troves of government data he later leaked.

Intelligence officials investigating the incident, in which Snowden shared thousands of classified documents with press outlets such as The Guardian, determined that he used a widely available tool designed to search, index, and back up websites to scrape the data, according to The New York Times.

An unnamed official told the Times that the process was automated, allowing Snowden to carry on with his routine work while the software ran.

Investigators found that the "insider attack" was not sophisticated and should have been detected, especially since it followed similar incidents involving WikiLeaks roughly three years earlier.