Media regulator Ofcom said it had published its first codes of practice on tackling illegal harms such as child sexual abuse and assisting or encouraging suicide.
Sites and apps have until March 16, 2025, to assess the risks illegal content poses to children and adults on their platforms, Ofcom said.
After the deadline, they will have to start implementing measures to mitigate those risks, such as better moderation, easier reporting and built-in safety tests, Ofcom said.
Ofcom Chief Executive Melanie Dawes said the safety spotlight was now firmly on tech companies.
"We'll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year," she said.
The Online Safety Act, which became law last year, sets tougher standards for platforms such as Facebook, YouTube and TikTok, with an emphasis on child protection and the removal of illegal content.
Under the new codes, reporting and complaint functions will have to be easier to find and use. High-risk providers will be required to use automated tools called hash-matching and URL detection to detect child sexual abuse material, Ofcom said.
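Hash-matching, at its simplest, compares a fingerprint of uploaded content against a database of fingerprints of known illegal material. The following is a minimal sketch with an invented blocklist, using an exact SHA-256 digest for illustration; deployed systems typically rely on perceptual hashes (such as PhotoDNA) that also match resized or re-encoded copies of an image.

```python
import hashlib

# Hypothetical blocklist of hex digests of known illegal material.
# Exact cryptographic hashing only flags byte-identical copies;
# real moderation pipelines use perceptual hashing instead.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-prohibited-payload").hexdigest(),
}

def matches_blocklist(content: bytes) -> bool:
    """Return True if the content's SHA-256 digest is on the blocklist."""
    return hashlib.sha256(content).hexdigest() in KNOWN_BAD_HASHES

print(matches_blocklist(b"example-prohibited-payload"))  # True
print(matches_blocklist(b"harmless upload"))             # False
```

URL detection works analogously, checking links in user content against lists of known abuse URLs maintained by bodies such as the Internet Watch Foundation.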
The regulator will be able to issue fines of up to 18 million pounds ($22.3 million) or 10% of a company's annual global turnover if it fails to comply.
Britain's Technology Secretary Peter Kyle said the new codes were a "material step change in online safety".
"If platforms fail to step up, the regulator has my backing to use its full powers, including issuing fines and asking the courts to block access to sites," he said.