I was paid in crypto to write fake Google reviews, and the people who hired me tried to scam me

I never set foot in Pompeii or walked through the lobby of the riverside DoubleTree, yet I gave them five-star Google Maps reviews. Over a few days I also posted glowing reviews for an Ibis budget hotel in east London, a central Travelodge, a Hyatt Place and dozens of hostels and B&Bs across Europe. For a brief period my "job" was to churn out fake hotel reviews in exchange for payments in USDC, a dollar-pegged stablecoin.

What looked like easy money ($5 per review) quickly revealed itself as part of a much larger and more sinister cyber-fraud machine built on crypto, messaging apps and fake corporate identities. The scheme's stated purpose may have been fake review generation, but the true target was recruits like me: low-risk, low-suspicion workers who can be used for money laundering and then pushed into bigger scams.

How the operation worked

- Recruitment: The campaign reached me via Telegram. A recruiter calling herself "Sharon Roberts" messaged first; after nine days of prodding and coaching I was handed off to a "receptionist" named Victoria Castillo. Both accounts almost certainly used false identities.
- Onboarding: Victoria walked me through opening a crypto wallet on a US exchange and accepting USDC payments. When I asked about legal reporting requirements, she shrugged them off: "You can ignore this one."
- Tasks and payments: Telegram channels, one impersonating Quad Marketing Agency and others using branding similar to HotelsCombined (part of Booking.com), posted work from 8am to 7pm UK time. They asked for up to 14 reviews a day at $5 each, interspersed with "business tasks" in which workers sent crypto and received slightly more back. I earned $30 for a few hours of work spread over weeks.
- Scale and impersonation: The channel posing as Quad had about 16,800 subscribers; a near-identical channel had 14,700. One channel had posted nearly 6,000 requests for fake reviews since mid-March. Quad, Accor, Travelodge, Hilton, Hyatt and Booking.com all told reporters they had no involvement and condemned the misuse of their names and logos.

Where crypto fits in

- Payments: Recruiters paid in USDC. Chainalysis, the blockchain forensics firm, found that the wallets linked to these payments followed a consistent pattern: they were topped up, then fanned out tens of thousands of small payments to recruits before moving larger sums on (a pattern sketched in the toy example below). Typical payout totals per wallet ranged from $300,000 to $600,000 in USDC before the onward transfers.
- Money-laundering risk: Although blockchains are public, criminals commonly "tumble" funds (splitting and recombining them) to obscure their origins. Experts told the reporter the operation likely used recruits to launder proceeds and then tried to extract money from the recruits themselves via "business tasks" and upgrade fees, a variant of employment scams and the "pig butchering" model, in which victims are primed with small payouts before being asked for larger sums.
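The fan-out pattern Chainalysis describes (a wallet topped up in bulk, then thousands of small outgoing payments before larger onward transfers) can be illustrated with a toy heuristic. The following is a minimal Python sketch, not Chainalysis's actual methodology: the Transfer record, field names and thresholds are assumptions made up for illustration.

```python
from dataclasses import dataclass

@dataclass
class Transfer:
    direction: str      # "in" (top-up) or "out" (payment sent from the wallet)
    amount_usdc: float

def looks_like_payout_hub(transfers,
                          small_payment_cap=20.0,     # what counts as a "small" payout
                          min_small_payments=10_000,  # "tens of thousands" of recruits
                          min_total_out=300_000.0):   # low end of the reported range
    """Flag a wallet that fans out very many small payments after being topped up."""
    small_out = [t.amount_usdc for t in transfers
                 if t.direction == "out" and t.amount_usdc <= small_payment_cap]
    return len(small_out) >= min_small_payments and sum(small_out) >= min_total_out

# Example: a wallet paying out tens of thousands of $5 review fees trips the flag.
history = [Transfer("in", 400_000.0)] + [Transfer("out", 5.0)] * 70_000
print(looks_like_payout_hub(history))   # True
```

In practice, blockchain investigators combine this kind of pattern matching with wallet clustering and exchange records; the toy check above only captures the shape of the flow described in the article.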
Red flags and escalation

- Fake identity signals: Victoria's profile images matched photos found on unrelated adult websites, and the recruiters' English was oddly phrased. The operation's "business tasks" included paying $50 to get $60 back, with an incremental, pyramid-style table promising ever-larger returns, peaking at $16,000 for an initial $10,000 outlay.
- Geopolitics and coercion: Investigators have found similar scams run from countries with weak enforcement, and in some cases the people working in them are themselves trafficked or coerced in prison-like scam centers.
- Automation and the human-bot mix: Fraud specialists say companies' automated detection is forcing scammers to rely on "human bots" (real people performing repetitive tasks) or on increasingly sophisticated AI agents. There is also concern about agentic AI that can act autonomously in the real world.

Industry, enforcement and platform responses

- Platforms: Google said it has stepped up its measures, removing more than 240 million fake reviews since 2024 and restricting 900,000 accounts for policy violations. Booking.com said only verified guests can post reviews, and that it uses automated systems and human teams to detect fraud. Travelodge and Accor said they do not create or commission fake reviews and would seek to prevent them.
- Regulation: New UK rules, in effect since last April, require platforms that host reviews to have clear policies to prevent and remove fake or incentivised reviews, to flag suspicious activity and to try to ensure reviews are genuine. The Competition and Markets Authority (CMA) found in 2023 that fake product reviews cost UK consumers between £50m and £312m a year, and that 11–15% of reviews in its sample were fake. The CMA has since opened investigations into five companies over misleading online reviews.
- Law enforcement and forensics: Chainalysis and other blockchain investigators are increasingly able to trace payment patterns and link wallets, which helps law enforcement identify the industrial scale of these schemes.

Expert perspective

- Serpil Hall, an anti-fraud consultant, warned that scams have increased sharply over the past six to seven years, aided now by generative and agentic AI that make impersonation easier. Fraudsters, she said, are becoming "very crafty".
- Jacqueline Burns Koven of Chainalysis described the pattern as akin to employment scams: small tasks and initial payouts build trust, then victims are asked to pay to "upgrade" or to free up funds, at which point the scammers vanish.

Takeaways for the crypto and travel sectors

- Crypto is a useful tool for fraud: stablecoins such as USDC make it easy to send micropayments to thousands of workers and can be combined with tumbling services to hide the flow of funds.
- Platforms must keep improving: the balance between automated filters and human review is key; Google and Booking.com say they have caught large volumes of fake reviews, but the problem persists.
- Consumers and would-be workers should be cautious: offers that promise easy money via messaging apps, ask for upfront "business task" payments or operate under borrowed corporate branding are major red flags.

In the end, I stopped when the operation began pushing me to pay in order to progress. When I revealed I was a journalist, contact ceased. I made $30 from several hours of work; the bigger story is the vast network that recruited me, the payment rails it used, and the mounting regulatory and forensic response trying to stop it. For the crypto industry, and for platforms that rely on public trust, the episode is another reminder that technology can be repurposed by bad actors, and that the integrity of online reviews and the transparency of crypto flows both matter more than ever.