Roblox announced it is stepping up its age verification system for users who want to access chat features, in a move to protect younger players. The company will use facial age-estimation technology to group users into appropriate age categories, thereby limiting chat interactions between children and unknown adults. The new requirement is a direct response to mounting criticism and lawsuits over child safety on the popular gaming platform. The enhanced safety measures will begin rolling out in December in Australia, New Zealand, and the Netherlands, with a global rollout planned for early January. While users are not required to submit a scan to use the platform, the check will be mandatory for accessing chat features. Roblox said the initiative is part of its effort to set a higher standard for safety and civility online.