Community Spotlight: Dean Shev
Most people contain multitudes. Dean Shev contains entire ecosystems.
https://www.youtube.com/watch?v=NPsoTP3hr6k
By Kris Krüg
There’s the healthcare data analyst at Fraser Health Authority who sees the cracks in Canada’s medical system… not from theory but from daily immersion in the wreckage. There’s the 14-year-old dreamer who wanted to be a musician but grew up in a family of engineers where passion didn’t pay bills. And there’s the AI ethics student who came to our Vancouver AI Community meetups to learn, absorbed a values system you can’t get from Coursera or Y Combinator, then went out and built something that actually matters.

Most profiles would pick one Dean to write about. That’s the mistake. Dean’s story isn’t about “following your dreams” or “AI disruption” or any of that tired bullshit. It’s about what happens when someone refuses to choose between technical skills, creative soul, and ethical compass. It’s about building healthcare AI that doesn’t suck because he actually understands all three domains and figured out how to make them reinforce each other instead of compete.
When Dean played that song about our community at meetup #20, “Bringing People in a Circle,” I watched him stand on stage having synthesized data into art, having turned survey responses into music that made people cry. That’s when I realized we’re not just watching someone win hackathons or launch startups. We’re watching someone invent a new form of expression. And maybe, accidentally, demonstrate what ethical AI development actually looks like when you do it right.
From Russian Engineers to Healthcare Reality
Dean learned early that passion doesn’t pay bills. Growing up in a family of engineers where everything was math and logic, music was a dream. Engineering was survival.
He wanted to be a musician at 14. His family valued practical, logical paths. So he became a healthcare data analyst—stable, respected, useful. The kind of job that makes parents proud and keeps lights on. The music dream didn’t die. It just went underground, waiting.

This isn’t a “sold out to the man” story. Dean made smart choices. He built real expertise. He developed deep healthcare domain knowledge that 99% of AI founders don’t have. While tech bros were reading blog posts about healthcare “opportunities,” Dean was inside Fraser Health Authority, living the reality.
He sees the data on wait times. The walk-in clinic chaos. The family doctor shortage that’s not getting better. He understands that continuity of care isn’t a nice-to-have feature; it’s the difference between catching cancer early and finding it too late. He knows where the system breaks because he’s in the fucking wreckage every day, watching people fall through gaps that shouldn’t exist.
By the time AI exploded onto the scene, Dean had: (1) Deep healthcare analytics expertise, (2) Suppressed creative dreams, (3) Understanding of system-level healthcare problems, and (4) No idea these three things were about to collide in the most interesting way possible.
Learning Ethics in Community, Not Classrooms
Dean stumbled into Vancouver AI Community meetups because he wanted to understand what AI could do for healthcare. What he got was a masterclass in what AI should do, period.
He didn’t just find technical skills. He found ethical frameworks. Indigenous perspectives on data sovereignty. Conversations about responsibility, not just capability. A community that valued “how should we build this?” as much as “can we build this?”
The specific lessons absorbed: Carol Ann Hilton teaching about indigenous economic frameworks and data sovereignty. Discussions about AI bias, transparency, accountability. The crucial difference between healthcare advice and medical advice: an ethical boundary that most healthcare AI startups blur or ignore. Why proprietary health data should serve patients, not companies. The importance of preventive care over reactive treatment.
Dean didn’t just learn AI techniques. He absorbed a values system. He internalized an ethics-first approach that changes everything about how you build.
Most AI healthcare founders go: idea → build → ethics (maybe).
Dean went: ethics → idea → build.

That sequence matters more than anything else. When he says “What if we teach AI to be ethical?” that’s not marketing copy. That’s the foundational question that informed every design decision in VHT. When he focuses on healthcare advice instead of medical advice, that’s not legal caution—it’s ethical clarity learned here, in community, through hundreds of hours of conversation about what technology should and shouldn’t try to do.
The proof is in what he built. VHT doesn’t try to be your doctor. It tries to be your health data memory: a subtle but crucial distinction he learned from us, integrated into his thinking, and made foundational to his company. Ethics weren’t bolted on later. They were there from the start.
When Data Became Music
Round 3 of the Rival Technologies Data Storytelling Hackathon. $2,500 prize. Survey data from 1,000 British Columbians about their AI hopes and fears. Twelve teams competing.
Everyone else built dashboards, visualizations, interfaces, analysis tools. You know, the expected stuff.

Dean dropped a 17-song album where each song represents an actual community member.
The technical sophistication was wild. He combined Rival’s survey data with our meetup transcripts, identified key community voices and their contributions, generated individual musical personalities for each person, created bc-ai.death as the project domain (brilliant choice), built an interactive dashboard to explore songs and data, made everything downloadable and shareable.
But the technical sophistication isn’t what broke my brain. What matters is that Dean didn’t just make songs about people. He tried to capture what each person brings to the community.
Professor P’s song captured his perspective. Kevin Friel’s captured his contribution. Carol Ann Hilton’s captured her essence. My song captured the circle-making, bridge-building work. Fifteen different community members, each with a track that somehow got at something true about who they are and what they do.
“Every song is representation of a person,” Dean explained. “When I was trying to design a song, I was trying to feel and think what is this person brings into this community and an awareness that this person is trying to give us.”
Then he casually mentioned: “There’s 17 songs and there’s like five songs in drafting and each song probably took hours and hours, but I’m mostly insomniac, so it’s fine.”
Translation: Dean worked himself into the ground on this because it mattered. This wasn’t a hackathon hustle to win prize money. This was an artistic statement about what AI creativity could mean.
Here’s the thing about generative audio: it usually lacks soul. You can make a catchy tune, but it feels empty, artificial, like muzak generated by algorithms that don’t understand why humans make music in the first place.
Dean cracked the code. “Generative audio in general lacks soul,” he said, “but when you mix it up with what we got going on here, like the stories people…” He took the transcripts from the stage, mixed them with the survey data set, and tried to tell the stories of British Columbians around AI in songs and music.
AI-generated art has soul when it’s grounded in real human stories, real relationships, real emotional stakes. Dean’s songs work because they’re about people we know, expressing ideas we’ve wrestled with, in a community we’ve built together. The music has meaning because it’s rooted in connection.

He didn’t just win the hackathon. He proved that AI doesn’t have to make us less human. Used right, it can make us more ourselves. Dean became a musician not despite the engineering path, but because of what he learned along the way: the technical skills, the data literacy, the understanding of how to bridge quantitative analysis and emotional truth.
Data doesn’t have to be cold. Statistics can sing. Healthcare can be art. And sometimes the best way to honor a community is to turn their voices into music.
Building VHT: The Healthcare AI That Actually Makes Sense
Dean saw the specific way Canadian healthcare is broken. Not enough family doctors. Lost continuity of care. Walk-in clinics where every visit is the first visit because the doctor’s never seen you before. Medical histories scattered across disconnected systems that don’t talk to each other.
“The odds of that doctor being able to know your full history and give you correct assessment is not going to be likely to happen,” Dean said, and he’s got the data to back it up.
So he built Virtual Healthcare Technologies: VHT. A platform that stores your complete health history: allergies, blood test results, surgery history, medications, any health data you want to track. The more you put in, the better it gets at understanding you specifically.

The AI model is called ECNA. It analyzes your complete profile and provides preventive care recommendations. Not guessing based on population averages. Not treating you like a generic 45-year-old male or whatever. Actually knowing YOUR history, YOUR patterns, YOUR risk factors.
Here’s the crucial distinction that separates VHT from the flood of healthcare AI vaporware: healthcare advice, not medical advice.
This isn’t trying to replace doctors. It’s trying to be the memory system most Canadians don’t have access to anymore. That family doctor who saw you for 20 years and knew your entire health story? Most people don’t have that. What they have is fragmented records, ten-minute appointments with strangers, and a system optimized for acute care instead of prevention.
“You’re getting instantly a machine that does millions upon millions in comparison of your profile and evaluates what possibly it could be and gives you the best preventive care advice,” Dean explained in his community spotlight.
The focus on preventive care is where this gets interesting. Most healthcare is reactive. You get sick, you seek treatment. But what if your AI health assistant noticed patterns in your data that suggested you should talk to your doctor about cardiovascular risk, or mentioned that based on your family history and recent labs, you might want to get screened for X, Y, or Z?
That’s the promise of VHT. Not replacing physicians. Not diagnosing diseases. But providing the kind of continuous, data-informed health guidance that most Canadians simply don’t have access to anymore.
And it’s built on an ethical framework Dean learned in community:
Privacy: Your data serves YOU. It’s not being harvested to train models that benefit everyone except you.
Transparency: Clear about what the AI can and can’t do. No mystification, no magic black boxes.
Boundaries: Healthcare guidance, not medical diagnosis. Know the difference. Respect it.
Regulatory awareness: Dean’s filed medical device registrations. He knows FDA and Health Canada processes, understands why proprietary models matter for regulatory approval, and knows why code ownership and control matter for long-term viability.
Preventive focus: Keep people healthy versus just treating disease once it’s already wreaking havoc.
Dean’s not just building on top of ChatGPT and hoping it works. He’s exploring partnerships with UK developers who’ve created proprietary models that reportedly outperform GPT-4 specifically on medical diagnostics tasks. He’s doing technical due diligence most healthtech founders skip. He understands that credibility and proof of concept matter more than hype.
He’s PROSCI/ADKAR certified in change management: understanding that healthcare isn’t just about building better tools. It’s about helping organizations actually adopt them. This combination of technical skill plus organizational change expertise is what makes healthcare AI actually work instead of becoming shelfware.
The Incubator Vision
When Dean and I talked about partnership possibilities in October, what emerged wasn’t just “let’s build a company together.” It was bigger: an infrastructure play.
Dean’s thinking about building an AI healthcare incubator. Not chasing individual projects, but building capacity. Creating space for multiple healthcare AI experiments to run simultaneously. Bringing together developers, clinicians, regulators, and community members. Using patent jam strategies to generate IP through community hackathons. Building an advisory board that spans physicians, AI experts, and ethicists.
The strategic sophistication that came through on that call:

- Understanding that user data is potentially more valuable than the model itself: models get better, but high-quality longitudinal health data from engaged users is incredibly rare.
- Recognizing the need for a phased approach: start with self-help, move to clinical guidance, eventually navigate regulated medical use. Don’t try to boil the ocean on day one.
- Knowing how to access Canadian federal development funding pathways that can support R&D before you need to raise VC.
- Learning from adjacent spaces: Dean’s watching DeepWell DTx get FDA approval for digital therapeutics through video games, studying their regulatory pathway, understanding how digital health solutions get approved and how insurance reimbursement codes get established.
When Dean said “If we can increase from 30 percent to 70 percent [analytics vs. physician gut feeling] and reverse it back and say look AI can provide stronger analytical decision support to physician, thus increasing the chance for better diagnostics… That’s huge,” he’s not pitching. He’s synthesizing years of healthcare analytics experience with new AI capabilities and thinking through what actually matters.
He’s not a naive founder thinking AI will “disrupt healthcare” by next quarter. He’s a healthcare insider who understands the system’s complexity, the regulatory reality, the organizational inertia, and the fact that good ideas die without change management expertise. He’s building for the long game.
Teaching the Next Generation
The community flywheel completes: came to learn, built ethical solutions, now teaching others.
Dean co-teaches AI Upgrade for Healthcare Professionals with me and Peter Bittner. He brings healthcare domain expertise, real-world AI implementation experience, PROSCI/ADKAR change management frameworks, the ECHNA ethical AI framework he helped develop, and Fraser Health Authority insider perspective.
The value isn’t just what he teaches. It’s who he is when he’s teaching. He’s not an outside consultant. He’s a healthcare professional teaching other healthcare professionals. He understands their concerns because they’re his concerns. He speaks their language because it’s his language. He knows their constraints because he works within them.
We’re not just upskilling individuals. We’re building a movement of healthcare professionals who understand AI, approach it ethically, know how to evaluate vendor claims versus reality, and can drive adoption from inside their organizations instead of waiting for top-down mandates that never come.
The Supporter Who Shows Up
Details matter. Dean bought a 10-pack of BC+AI memberships right when we launched the nonprofit. Not just attending: investing. Not waiting to see if it works: betting on it early.
He’s at office hours. He contributes to infrastructure conversations. He’s been identified as a potential regional AI organizer. When we thanked supporters at meetup #20, Dean was on that list because he doesn’t just extract value: he puts in. Shows up. Supports. Invests.
This is what makes community actually work. People who understand they’re building something bigger than their individual success. People who know that the ecosystem they strengthen today is the one that’ll support their next project tomorrow.
What Dean Represents
Dean sits at a rare intersection point:
- Technical AI capability
- Healthcare domain expertise
- Creative expression
- Ethical frameworks
- Community values
- Regulatory understanding
- Organizational change management
Most AI healthcare founders have one, maybe two of these. Dean has all of them. That’s not luck: it’s the result of specific choices about how to learn, where to show up, what values to center.
This is what AI development should look like:
Deep domain expertise that comes from years inside the system you’re trying to improve. Ethics learned in community and integrated from the start, not bolted on later when investors ask about it. Creative vision that honors humanity instead of treating people as data points. Regulatory sophistication that comes from actually filing paperwork and understanding approval processes.

Focus on actual problems that keep people up at night, not imagined “disruption opportunities” from startup pitch decks. Building in public with community accountability instead of in stealth mode where only the cap table matters.
Dean couldn’t have built this in Silicon Valley. The ethics-first community learning, the indigenous perspectives on data sovereignty, the focus on public good alongside profit: this is specific to what we’re building in BC. Our ecosystem advantages are cultural, not just technical.
Dean’s success is proof of concept for our community model. We’re demonstrating that the best AI development comes from diverse, values-driven, community-centered environments where people learn together, challenge each other, and hold each other accountable to building technology that serves human flourishing.
Integration as the Actual Future
From 14-year-old music dreamer to healthcare analyst to AI ethics student to hackathon winner to startup founder to educator to community supporter: Dean never abandoned any part of himself. He figured out how to synthesize it all.
The music, the data, the ethics, the healthcare knowledge: they’re not separate activities. They’re one coherent vision of what AI development should be.

He’s building VHT as an immediate solution to Canadian healthcare gaps. He’s modeling ethical healthcare AI development for others to learn from. He’s creating educational pathways so healthcare professionals can understand and adopt AI thoughtfully. He’s investing in community infrastructure so the next generation of builders has better resources than he did. He’s proving that creative dreams and technical expertise can reinforce rather than conflict.
Most healthcare data analysts track numbers. Dean Shev tracks lives, composes songs, builds ethical AI, teaches communities, and refuses to pretend these are separate activities.
That’s not multitasking. That’s integration.
The future of AI isn’t about choosing between art and science, between creativity and analysis, between passion and pragmatism. It’s about refusing to choose. It’s about building systems that honor all of what makes us human: the part that needs healthcare and the part that needs music and the part that demands our technology reflect our values.
Dean’s building that future, one song, one patient record, one community lesson at a time.
And he’s showing the rest of us how it’s done.
Dean Shev is Founder & CEO of Virtual Healthcare Technologies, healthcare data analyst at Fraser Health Authority, co-instructor for AI Upgrade for Healthcare Professionals, Round 3 winner of the Rival Technologies Data Storytelling Hackathon, and an active member of the Vancouver AI Community. His 17-song AI album celebrating community voices is available at bc-ai.death.
Become a BC+AI member at bc-ai.ca/membership.