Adobe Creates a Robots.txt-like Standard for Images to Prevent AI Training on Them
The robots.txt file is the standard, though not entirely reliable, means of blocking web crawlers on websites. Adobe intends to introduce an equivalent for image files, so that their authors…
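
For reference, a robots.txt file consists of simple user-agent and disallow directives. A minimal example blocking one AI crawler (GPTBot, OpenAI's crawler, chosen here purely as an illustration) from the entire site might look like this:

    User-agent: GPTBot
    Disallow: /

Compliance is voluntary, which is why the mechanism is described above as not entirely reliable: a crawler that ignores the file can still fetch the content.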