YouTube is introducing changes to two crucial browsing features as concerns grow over harmful content spreading through the platform’s recommendation algorithm.

Sortable topics and filtering capabilities are being added to YouTube’s homepage and “up next” recommended categories, according to a new blog post. The goal is to give users more control over what they see on the site. People will be able to click on topics they want to explore, like DIY crafts or music videos, from the top of the homepage. The same filtering applies to “up next,” letting users curate which types of videos, and which creators, appear as recommendations. Similar product changes were first rumored earlier this year.

“Our goal is to explain why these videos surface on your homepage.”

YouTube is also giving people more control over the types of videos they don’t want to see. This includes content irrelevant to their interests that the company’s algorithm might otherwise have recommended. A new option called “Don’t recommend channel” will soon let people hide content from a specific creator or brand. Those videos and creators remain discoverable on the platform, according to the blog post, but they won’t surface in the recommended tab or on the homepage.

New transparency tools being rolled out with the update will show people why certain videos or creators are being recommended, too.

“Sometimes, we recommend videos from channels you haven’t seen before based on what other viewers with similar interests have liked and watched in the past,” Essam El-Dardiry, product manager at YouTube, wrote in the blog post. “Our goal is to explain why these videos surface on your homepage in order to help you find videos from new channels you might like.”

It’s a clear effort from YouTube to address ongoing concerns from users, critics, and policymakers over borderline harmful content being recommended on the platform. Notable pieces from publications like The New York Times have demonstrated how YouTube’s recommendation algorithm can send people down a rabbit hole of harmful content that could lead to radicalization. Much of this content doesn’t technically break YouTube’s community guidelines or terms of service, so it is allowed to remain up — even though it’s sometimes considered harmful to society.

YouTube’s upcoming changes also won’t stop borderline content from appearing as recommended topics on people’s homepage or recommended feed. And while the changes don’t necessarily imply a change to the company’s algorithm, the data YouTube collects on what people choose to see and choose to hide could inform recommendation algorithm changes down the road.

Some of these product changes are scheduled to roll out in the coming days, while others will occur at later dates.