Dallas' biggest live act last weekend wasn't even technically alive. Hatsune Miku is a 3D projection, an animated singer created in Japan who sings and dances next to a live band on stage. Her acolytes began to arrive at The Bomb Factory on Saturday afternoon, hours before the concert, and they came in droves.

The line to get into the sold-out venue for Miku Expo 2016 snaked around the block, and then another block. The Bomb Factory labeled the show “all ages,” and the crowd proved it. Confused fathers and mothers stood with their wide-eyed kids, bearded Dungeon Master types stood in packs of two and three, and androgynous emo men wearing flowing dresses mixed with the throng. Women and girls wore short plaid skirts and long leggings. Representatives from all ages, sexes and ethnicities wore the aqua-blue wigs associated with Miku, the global phenomenon.

Technically, Miku is not an artificial intelligence. She’s an “embodied agent”—a computer program with an avatar built to interact with the physical world. But she’s no mere animation. Because her “voice” is actually programmed like a synthetic instrument, processed and smoothed by algorithms to simulate speech, she is known as a “vocaloid.”

“She started as software, you know,” said 24-year-old Jessica Tuenje, attending the show with her similarly blue-wigged 14-year-old sister. “I’ve been into Miku since 2007. I’d have to look around online just to find the Japanese versions. Now this has blown up worldwide and there are a lot more English songs. It’s come a long way.”

When the Fort Worth resident says “it,” she doesn’t mean the avatar; she means Miku’s devoted fan base. The real show here is not the 3D anime vixen twirling on stage, but the solidarity of the fan base, which knows every beat, outfit, virtual sidekick and glow stick color change. A 3D animation brought these disparate people together to listen to music with lyrics in a language most can’t understand.

Enthusiasm, mass appeal, razor-sharp marketing and crowd manipulation — this Hatsune Miku is too good for pop entertainment. She should run for president.

Like any epic talent, Miku came from humble beginnings. Her origin story is rooted in music industry software, not talent recruitment.

Yamaha developed the vocaloid concept in the early 2000s. The idea was simple: Instead of paying vocal artists to sing, what if researchers could make a synthesizer that could approximate a human voice? This “singer in a box” could open up new creative avenues, especially in the world of synth-pop.

Here’s how it works, more or less. A voice actor provides samples of sounds for a digital library. Users type in the lyrics and melody, and the voice follows along. The reason this doesn’t sound like the computer from WarGames is that the software includes a Synthesis Engine that converts pitch, manipulates timbre and adjusts timing. The software also adds stress to pronunciations and vibrato, but it can’t approximate a shout. (Grunge is safe from vocaloids. For now.)
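For the curious, the basic pipeline described above can be sketched in a few lines of code. This is a toy illustration, not Yamaha's actual engine: it substitutes simple sine tones for the recorded voice-bank samples, and the phoneme labels, pitches and durations are all made up for the example. What it does capture is the core idea of following a user-entered melody note by note while modulating pitch with vibrato.

```python
import math

SAMPLE_RATE = 22050  # samples per second (an assumed, arbitrary rate)

def synthesize_note(freq_hz, duration_s, vibrato_hz=5.0, vibrato_depth=0.01):
    """Render one melody note as a sine tone with gentle vibrato.

    A real vocaloid engine would start from a recorded phoneme sample and
    pitch-shift it; here a sine wave stands in for that sample.
    """
    samples = []
    n = int(SAMPLE_RATE * duration_s)
    phase = 0.0
    for i in range(n):
        t = i / SAMPLE_RATE
        # Vibrato: wobble the frequency a small fraction around the target.
        f = freq_hz * (1.0 + vibrato_depth * math.sin(2 * math.pi * vibrato_hz * t))
        phase += 2 * math.pi * f / SAMPLE_RATE
        samples.append(math.sin(phase))
    return samples

def render_melody(notes):
    """notes: list of (phoneme, frequency_hz, duration_s) tuples.

    The phoneme label marks which voice-bank sample the real engine
    would fetch; this sketch only uses the pitch and duration.
    """
    audio = []
    for _phoneme, freq, dur in notes:
        audio.extend(synthesize_note(freq, dur))
    return audio

# "ha-tsu-ne" sung over three hypothetical notes.
melody = [("ha", 440.0, 0.25), ("tsu", 494.0, 0.25), ("ne", 523.3, 0.5)]
audio = render_melody(melody)
```

Feeding the resulting sample list to any audio library yields a wobbly, robotic three-note phrase; the commercial software's timbre shaping and timing adjustments are what close the gap between that and something resembling a singer.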

