President Joe Biden’s administration is pushing the tech industry and financial institutions to shut down a growing market of abusive sexual images made with artificial intelligence technology.

New generative AI tools have made it easy to transform someone’s likeness into a sexually explicit AI deepfake and share those realistic images across chatrooms or social media. The victims — be they celebrities or children — have little recourse to stop it.

The White House is putting out a call Thursday for voluntary cooperation from companies in the absence of federal legislation.