- by David Silverberg
- business reporter
9 hours ago
Image source, Getty Images
It sounds like fun, but would you want strangers doing this in your house while you’re away?
How would you feel if you rented out your house for a few nights, only to return and discover it had been used to host a noisy house party?
Shocked by the damage done to your property, you would likely be angry, upset, or a combination of the two.
Such scenarios have been widely reported around the world in recent years, especially during the coronavirus pandemic. After bars and nightclubs closed, young adults, especially, wanted to find somewhere to hang out, dance, and possibly drink too much alcohol.
This prompted a crackdown by short-term rental giant Airbnb, which announced a “global party ban” and vowed to do everything possible to stop such behavior. Measures included banning offenders from making new bookings, and blocking bookings from guests under the age of 25 who did not have a history of excellent reviews.
Airbnb recently said the ban resulted in a 55% drop in the number of parties reported between 2020 and last year. But the battle is not yet won, so the US firm has now introduced an artificial intelligence (AI) powered software system to help weed out potential troublemakers.
Now operating worldwide, the system kicks in when you try to make an Airbnb booking: the AI automatically checks things like how recently you created your account and, a big red flag, whether you are trying to rent a property in the same city where you live.
It also weighs the length of your stay (just one night is a potential concern), and whether the planned trip falls during a period of heavy revelry such as Halloween or New Year’s Eve.
Image source, Naba Banerjee
Naba Banerjee, Airbnb’s head of safety and trust, says the system has stopped hundreds of thousands of house parties
“If someone is booking a room for just one night during New Year’s Eve, and they’re from the same city as the host, it’s likely to be a party,” says Naba Banerjee, Airbnb’s head of safety and trust.
Ms Banerjee adds that if the AI judges the risk of a party booking to be too high, it will block the booking, or instead direct the person to the website of one of Airbnb’s partner hotel companies. It is an issue of trust, she says: people renting out their homes through Airbnb need to feel as reassured as possible.
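The screening logic described above can be sketched as a simple rule-based risk score. This is purely illustrative: the signals are the ones named in the article, but the weights, threshold and function names are invented here, and Airbnb's actual model is undoubtedly far more sophisticated.

```python
# Illustrative booking risk score, loosely modelled on the signals the
# article describes. All weights and thresholds are invented.
from datetime import date

HIGH_RISK_DATES = {(10, 31), (12, 31)}  # Halloween, New Year's Eve

def party_risk_score(account_age_days: int, guest_city: str,
                     listing_city: str, nights: int,
                     check_in: date) -> int:
    """Return a simple additive risk score for a booking."""
    score = 0
    if account_age_days < 30:          # recently created account
        score += 2
    if guest_city == listing_city:     # booking in the guest's own city
        score += 3
    if nights == 1:                    # single-night stay
        score += 2
    if (check_in.month, check_in.day) in HIGH_RISK_DATES:
        score += 3                     # peak-revelry date
    return score

def screen_booking(score: int, threshold: int = 6) -> str:
    # Above the threshold, the booking is blocked and the guest is
    # pointed at a partner hotel site, as the article describes.
    return "redirect_to_hotel" if score >= threshold else "allow"
```

For example, a new local account booking a single night on New Year's Eve trips every rule and gets redirected, while a long-standing out-of-town account booking a week in June sails through.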
One such person is Lucy Patterson. She rents out a one-bedroom annex next to her home in Worcestershire, and has had more than 150 bookings since she first listed the apartment.
“Part of my plan for becoming an Airbnb host is that I only have a one-bedroom space to minimize the possibility of parties,” she says. “Of course it hasn’t always been perfect, but I would say 99% of my guests have been fantastic.”
She says Airbnb’s new use of AI has given her “more reassurance”.
Going forward, Ms Banerjee says the AI will only get better and better, as it learns from the ever-growing amount of data it processes.
Image source, Lucy Patterson
Lucy Patterson says she is reassured by Airbnb’s use of AI to screen out potential partygoers
In the car sharing space, Turo, one of the largest online marketplaces, also uses AI systems to help protect people who let others rent their cars.
The software, a platform called DataRobot AI, can instantly detect theft risks. It also sets prices for cars, based on their size, power and speed, and on the time of day or week at which a person wants to start their rental period.
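A dynamic-pricing rule of that shape might look like the sketch below. The inputs mirror the factors the article mentions (vehicle size, power, and rental timing), but every coefficient and the function name are invented for illustration; they do not reflect Turo's actual pricing model.

```python
# Hypothetical dynamic-pricing sketch: price a daily rental from car
# attributes and timing. All coefficients are invented for illustration.
def daily_price(base_rate: float, horsepower: int, seats: int,
                weekend_start: bool) -> float:
    price = base_rate
    price += 0.10 * horsepower   # more powerful cars cost more
    price += 2.50 * seats        # larger vehicles cost more
    if weekend_start:            # demand surcharge for weekend pickups
        price *= 1.2
    return round(price, 2)
```

A mid-size 100 hp, five-seat car with a $30 base rate would price at $52.50 midweek and $63.00 with the weekend surcharge under these made-up coefficients.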
Separately, Turo also uses AI to allow some users to talk to its app, telling it what car they want and when. The AI then presents a personalized list of recommended vehicles matching the criteria, along with text on the screen. The service is currently available to customers of the popular consumer AI system ChatGPT-4, which has been incorporated into Turo’s system.
“We couldn’t make it easier to browse Turo, and that can help build trust between us and our customers,” says Albert Mangahas, the company’s chief data officer.
Using AI to filter out potentially problematic customers is a good idea, says Edward McFowland III, assistant professor of technology and operations management at Harvard Business School. “Having that layer of AI can help reduce friction on both the business and consumer sides.”
However, he points out that even a perfectly calibrated AI model can produce false positives, such as excluding a young person who wants to rent an apartment for New Year’s Eve but has no intention of hosting a party. “And that’s why it’s still so hard to get AI technology right all the time.”
Image source, Lara Bouzabelian
Lara Bouzabelian does not accept bookings from first-time Airbnb users
In Toronto, Canada, Lara Bouzabelian uses Airbnb to rent out her family’s 3,000-square-foot (280 square meter) cottage. She says that no matter how lucrative a booking is, or what the AI has determined, she follows her own rules.
“I don’t take customers who are first-time users. I need to know that someone has checked in on them at some point.”