Equipping creators with tools to Understand Audiences and Inspire Content
An audacious plan to become an essential part of the content creator workflow.
A page from the web app that was being developed by BENLabs with the “Action Panel” open on the right. The action panel is a container that houses all the tools and services that a content creator needs to gain insights into their audiences and generate new content.
Background
BENLabs had an ambitious plan: Become the go-to platform where content creators can research, plan, and execute all of their work by leveraging BENLabs’ immense data and AI services. I was the design lead on a team tasked with researching, planning, and designing the foundational tool set that content creators would use on the platform.
Problem
I feel like my memory is being affected... like, after browsing through videos on TikTok or YouTube Shorts... I look back and I'm like I don't remember what 90% of the videos that I watched. Sometimes, I get this feeling where I've looked through so much content, I've looked at so many creators, that my brain's just been on overload.
- Research participant
Content creators can spend 5-10 hours or more a week researching their niche. That adds up to 260-520 hours a year on research alone. With that much time spent, it can quickly become overwhelming trying to organize, process, and pull insights from everything they've gone over. And this doesn't include planning, scripting, filming, or editing, which are additional hours of work.
Initial Research - Conducting user interviews to develop a hypothesis
We believed if we could halve the time it takes content creators to do research, we could give them back time for what they truly loved: making content. But what does "halving the time" it takes them to do research really mean? To answer that, I dove into a series of user interviews with content creators. I had lots of questions. What did a content creator's research look like? How much does a content creator's size matter when it comes to the type of research that they conduct? What type of information are they looking for? Through countless hours of user interviews, I started to develop an idea.
I started to observe that research for a content creator meant understanding their audience, and not just their own personal audience, but the audience of the larger niche they operated in, and even audiences adjacent to that niche. They wanted to understand viewing habits, general and specific trends, popular topics, other content creators that were rising, and much more. I took these observations and began to form a hypothesis.

I hypothesized that if a content creator was able to better understand their audience, they'd be able to generate better ideas and produce higher-engagement videos.
Goal
So that was it. The goal was to equip creators with the tools necessary to better understand their audiences and to inspire content. I had my mission, and off I went.
Internal Research - What kind of AI/ML stuff are we capable of?
One of the first things I wanted to do was understand what we were capable of. This meant meeting with our data scientists and data engineers, something I really enjoyed doing. They were smart, capable people who walked me through their work and gave me a good rundown of what we had.
These in-depth conversations revealed that we had a lot of cool capabilities. Although some of them were still in early, experimental stages, it was valuable information: it gave our team constraints to work within and narrowed down the field of tools we thought we could offer.
Team Workshop - Sharing information and forming a plan
Example: Some of the ideas and notes generated by the team workshop. We gathered these together and voted on which ones we should try out first based on utility (is it valuable?), feasibility (can we build it?), and usability (will it be easy to use?).
Armed with this knowledge, I was ready to turn our capabilities into concrete ideas. I gathered the team together for a workshop with three goals: share what we had learned from our data folks, revisit the findings from our earlier user research, and have everyone on the team generate ideas for combining the two in ways that would help content creators reduce the time it takes to do research.
Signal Gathering - Taking the best ideas and testing them with users
After the workshop, the next step was to build a very simple prototype and conduct some tests with users. I worked closely with my product owner and lead engineer to formulate a research plan. We wanted to make sure we got a clear signal on which ideas would work and which wouldn't.
Example: The resulting list generated from the usability tests, showing how each idea was categorized as "helpful" or "not helpful" by users. Out of about 20 ideas, 4 were labeled "helpful" by 80% of our participants.
During testing, users experimented with a prototype that mimicked how our team believed the features could work. At the end of each usability test, I asked users to categorize the ideas presented to them as helpful or not helpful. I ran two rounds of tests in about two weeks, and we ended up with a nice little chart showing how well our ideas ranked.
Design & Release - MVP and preparing iterative updates
Once we got our basic architecture down, we were ready to design, dev, and release. I took our highest-ranking concepts and designed the UX for how users would interact with each of our AI/ML services. This work was divided into the MVP and subsequent releases.
Example: A breakdown of how we divided our work into ‘milestones’ or sprints.
Release & Outcome
My team and I were ecstatic about the release. We thought we had created something pretty useful and were eager to see how users would interact with our work. We had metrics in place to gauge how the work was faring and a slew of follow-up items we wanted to pursue. However! Dun, dun, dun! BENLabs called a surprise all-hands meeting with an even bigger surprise: we had run out of funding much faster than expected. We believed we had a few years of funding to run on, but it turned out we had zero! Whoa! So, sadly, this story ends here. If you're interested in more details, I'd be happy to share them with you. The team I worked with on this project was a great one, and I think this really could have worked given a little more time. Ah, bittersweet ending, no?