Social network giants and their users should share the responsibility of alerting authorities to online radicalism as soon as they spot it, according to a security expert.

"With social media, and with the internet today, basically terrorists can become their own mass media and, certainly, more does need to be done to try to take that ability away from them," Scott Stewart, vice president for tactical analysis at geopolitical intelligence firm Stratfor, told CNBC's "Squawk Box" on Monday.

His comments came days after a gunman killed 50 people at two mosques in Christchurch, New Zealand, and livestreamed the attack as it happened.

In the past, terrorists relied on mass media to propagate their messages, but the internet has made it easier for them to spread their agenda, said Stewart, previously a special agent for the U.S. State Department who was involved in numerous terrorism investigations.

"One of the things that does need to be done is for people to shout out when they see things developing, and when they see these attacks coming down the pipe. A lot of people scoff at saying 'see something, say something' but it really works," he said, adding that there should be tools made available for people to report instances of radicalism on social media, chat rooms or even in real life.

"I think everybody shares a little bit of the responsibility."

Last Friday, a gunman opened fire at two mosques in Christchurch and killed 50 people in what is reportedly the country's worst-ever peacetime mass killing. Dozens more were wounded, some of them critically. Prime Minister Jacinda Ardern called the attack an act of terrorism.

The attacker left behind a lengthy document in which he described himself as a white nationalist who hated immigrants and said he was set off by attacks in Europe perpetrated by Muslims, the Associated Press reported. He was charged with murder on Saturday.

The shooter livestreamed the attacks on Facebook for 17 minutes using an app designed for extreme sports enthusiasts, and copies of the video have reportedly circulated on various social media platforms, including Twitter, Alphabet's YouTube and Facebook-owned WhatsApp and Instagram.

That has led many to call on social media giants to do more to crack down on online extremism.

Facebook reported that, in the first 24 hours after the shootings, it had removed 1.5 million videos of the attack globally, 1.2 million of which were blocked at upload. In a series of tweets, the social network said it is also removing all edited versions of the video, even those that do not show graphic content.


The social network giant is reportedly building its own artificial intelligence chips to help it more quickly filter videos that violate its terms of service.

For her part, Ardern said she plans to hold discussions with Facebook over the issue of livestreaming.

— The Associated Press and Reuters contributed to this report.