High-tech criminals are using artificial intelligence to impersonate real estate pros and infiltrate transactions. Would your clients know the difference?

Sophisticated scammers can hijack real estate transactions using deepfake audio and video to impersonate agents and other parties involved in a home purchase, security experts warn. Fears are mounting after police in Hong Kong last month alerted the public that a financier had been duped into wiring $25 million to scammers after confirming account details on a live video call. The financier learned only after transferring the funds that deepfake technology had been used to impersonate his company’s chief financial officer and other staff members on the call. He said everyone on the call looked and sounded like the colleagues he worked with.

The incident illustrates the next frontier in real estate scams, in which bad actors use artificial intelligence tools to mimic the written and voice communications of real estate professionals, according to CertifID’s new report, the “2024 State of Wire Fraud.” CertifID CEO Tyler Adams says fraudsters’ use of AI makes it even harder to distinguish legitimate communications from fraudulent ones in a real estate transaction. “It used to be that we could tell everyone to just watch out for misspellings in an email address” as a red flag that communications could be fraudulent, Adams says. “Those days are gone. We’re no longer seeing misspellings. The communications look really good and legitimate, and it’s becoming more difficult to tell what’s fraudulent.”

Adams says the best defense is for real estate pros to step up awareness efforts with their clients and tell them to always verify information they receive. “Education early and often is the best strategy right now,” he says. “It needs to be constant. We need to be telling consumers at the beginning that there is a chance that fraud could be interjected at any point in a transaction. Creating more awareness early on in a transaction is so important so that consumers can keep their guard up.”

Cara Carlin, director of communications for the Better Business Bureau in Arkansas, has been sounding the alarm about the risks of AI in real estate. “What we’re seeing in the real estate world is a threat of AI-generated listings, AI and ‘spider’ seller personas, and generated conversations—which could be with the property owner—or even voice impersonations,” Carlin told CBS THV-11 News in Little Rock. “There’s not a lot of red flags, unfortunately. But some of the things that scammers do, you may be able to pick up on, like a reluctance from the owner or the agent to meet in person.”

Deepfake videos could even trick real estate agents into taking listings for properties that don’t exist or writing virtual sales contracts that don’t represent the home’s actual condition. AI also can be used to gain access to sensitive client information, says real estate safety expert Tracey Hawkins, host of REALTOR® Magazine’s podcast, “Drive With NAR: The Safety Series.” “To protect buyers and sellers from such risks, you must take the necessary steps to ensure that your clients’ identities and transactions are secure,” she says. “This includes verifying identities, implementing secure payment procedures and adopting strict security measures.”

Face-to-face contact may grow even more critical in a deepfake world. Security experts stress the importance of always verifying sensitive information by phone, using a number you know to be authentic (not one taken from a potentially fraudulent email). Hawkins also stresses the need to use secure communication channels, such as encrypted email and messaging apps, rather than free email accounts when communicating with clients or colleagues about sensitive real estate information.
