Use AI To Build What Your Customers Actually Want

Aalap Davjekar
14 min read · 2 days ago

So many ideas. How do you find the right one?

This article discusses designing a feature set for a software product with the help of an AI large language model. If you’re interested in using AI for software development, please read this instead.

Remember the last time you tried to build something people actually wanted? If you’re like most of us, it probably involved a lot of guesswork, late nights, and maybe too much caffeine. But there has to be a better way, right? Thankfully, it’s 2024, and the answer is a resounding YES.

Gone are the days of throwing features at the wall and seeing what sticks. Breakthroughs in large language models let us start development with a smarter, more targeted approach.

Today, we’re going to learn how to brainstorm features for a product using AI. I prefer Anthropic’s Claude, but you can use any LLM you like: ChatGPT, Gemini, or Perplexity. It won’t make much of a difference.

For the purposes of this tutorial, we’re going to build a feature set for a web scraper that people will actually want to use. Why a web scraper? Well, it’s a tool that’s useful across industries, from e-commerce to academic research; it’s complex enough to show off AI’s muscles, but not so complicated that we’ll get lost in the weeds; it’s also something I’ve been building over the past couple of months and, so far, it’s been a very interesting project!

For now, here’s what we’re going to cover:

  1. Sketching out our product idea
  2. Figuring out who might use it and digging into their problems
  3. Choosing the right features

By the end of this, you’ll have a roadmap for using AI to design products that people actually need.

Step 1: Painting the Big Picture

Before progressing any further, it’s important to get a bird’s-eye view of our project. Think of this as sketching on a napkin, but instead of a pen, we’re using AI.

First, let’s ask Claude to give us a general overview of our future product. Here’s what we might say:

Can you give me a quick rundown of what a web scraper does?

Claude might respond with something like:

A web scraper is a program that automatically extracts data from websites.

Here’s a quick overview of what it does:
- Sends requests to web pages
- Downloads the HTML content
- Parses the HTML to extract specific data
- Stores the extracted data in a structured format

Web scrapers are used for various purposes like price monitoring, lead generation, or research. They can collect large amounts of data much faster than manual methods.
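To make those four steps concrete, here’s a minimal sketch using only Python’s standard library. The HTML snippet stands in for a page a real scraper would first download over HTTP, and the `item`/`price` class names are hypothetical:

```python
from html.parser import HTMLParser

# A tiny HTML snippet standing in for a downloaded page.
PAGE = """
<ul>
  <li class="item">Widget A <span class="price">9.99</span></li>
  <li class="item">Widget B <span class="price">4.50</span></li>
</ul>
"""

class PriceScraper(HTMLParser):
    """Parses the HTML and extracts (name, price) rows."""

    def __init__(self):
        super().__init__()
        self.rows = []           # extracted data, stored in a structured format
        self._in_item = False
        self._in_price = False
        self._name = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "li" and attrs.get("class") == "item":
            self._in_item, self._name = True, ""
        elif tag == "span" and attrs.get("class") == "price":
            self._in_price = True

    def handle_data(self, data):
        if self._in_price:
            self.rows.append({"name": self._name.strip(), "price": float(data)})
        elif self._in_item:
            self._name += data

    def handle_endtag(self, tag):
        if tag == "span":
            self._in_price = False
        elif tag == "li":
            self._in_item = False

scraper = PriceScraper()
scraper.feed(PAGE)   # in a real scraper, PAGE would come from an HTTP request
print(scraper.rows)
```

A production scraper would add an HTTP client, error handling, and polite rate limiting, but the request, download, parse, store pipeline has exactly this shape.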

Now that we’ve got the basics, let’s use AI to brainstorm some possibilities. We could ask:

What are some unique ways a web scraper could be used across different industries?

Claude might respond with:

Web scrapers can be applied creatively across various industries. Here are some unique use cases:

Real Estate:
— Tracking property price trends and market dynamics
— Analyzing neighborhood amenities and crime rates

Healthcare:
— Monitoring drug pricing across pharmacies
— Aggregating clinical trial data from multiple sources

Finance:
— Collecting financial statements for competitor analysis
— Tracking sentiment on social media for stock predictions

E-commerce:
— Dynamic pricing based on competitors’ rates
— Identifying emerging product trends

The list goes on…

We’re quickly building a picture of what our product can do. Let’s push it further:

How could a web scraper incorporate recent tech trends like AI or blockchain?

This is where things get interesting:

Integrating web scraping with emerging technologies like AI and blockchain can lead to powerful and innovative applications. Here’s how these technologies could be combined:

AI-Enhanced Web Scraping:

— Natural Language Processing (NLP): Improve data extraction accuracy by understanding context; Automatically categorize and summarize scraped content

— Computer Vision: Extract data from images and infographics; Navigate CAPTCHAs and other visual challenges

— Machine Learning: Adapt to changes in website structures automatically; Predict optimal scraping times to avoid detection

Blockchain and Web Scraping:

— Data Verification: Use blockchain to create tamper-proof records of scraped data; Establish data provenance and authenticity

— Decentralized Scraping Networks: Distribute scraping tasks across a network of nodes; Improve resilience and bypass IP-based blocking

— Tokenization of Data: Create data marketplaces where scraped information can be traded securely

Combining AI, Blockchain, and Web Scraping:

— Self-Improving Data Collection: Use AI to continuously optimize scraping strategies; Store performance metrics and learned patterns on blockchain

— Ethical Data Harvesting: Implement AI-driven consent management; Use blockchain to track data usage and ensure transparency

— Predictive Analytics: Combine historical scraped data (stored on blockchain) with AI models for forecasting

By now, we’ve gone from a vague idea of a web scraper to a world of possibilities. However, at this stage, we’re not committing to anything. We’re just exploring possibilities, letting our imagination run wild.

It’s tempting to jump straight into specifics, but this broader view will pay dividends later. It gives you a wider pool of ideas to draw from and helps you spot connections you might have missed otherwise.

Step 2: Who’s Going to Use This Thing?

Now that we’ve got a rough sketch of possibilities, it’s time to figure out which direction we want to take so we can narrow down and focus on a very specific solution.

A few years ago, you might have had to resort to market research tools like Ibisworld, Semrush’s Market Explorer, or Think with Google for this kind of research. Thanks to LLMs, you can now brainstorm ideas first and then validate them with those tools. Keep in mind that the vast majority of market analysis tools are NOT free.

Let’s start by asking Claude to help us create some detailed customer profiles. We might say:

What type of demographics would find this tool most useful? 

Create a list of pain points for each potential demographic and why they might use a web scraper.

Claude might come back with something like this:

Small Business Owners
Pain Points:
— Limited resources for market research
— Difficulty keeping up with competitors’ pricing
— Struggle to identify new business opportunities

Use Cases:
— Monitoring competitor prices and product offerings
— Gathering customer reviews and feedback
— Identifying trending products in their niche

Financial Analysts
Pain Points:
— Need for real-time market data
— Difficulty in aggregating financial information from multiple sources
— Identifying market trends quickly

Use Cases:
— Scraping stock prices and financial statements
— Monitoring economic indicators across various websites
— Collecting analyst reports and predictions

Now, let’s dig deeper.

We want to get inside people’s heads. So we might ask:

Evaluate the intensity of these pain points and how urgently people are seeking solutions. 

Tabulate this data.

Add a column of average income levels and spending habits of each demographic.

Add a column of the average typical budget allocations for this solution.

This is where AI shines.

It can draw on vast amounts of data to paint a vivid picture of each user type.

We’re not pulling these profiles out of thin air; Claude bases them on real-world data and trends from its training. But it’s up to us to use this information wisely and to make sure we cross-verify it with multiple data sources.

We need to look for common threads, unexpected insights, and areas where our web scraper could really make a difference.

Here’s Claude’s response (formatted as an ASCII table, since Medium doesn’t support HTML tables):

+-----------------------+------------+---------+----------------+-----------------+-----------------+
| Demographic           | Pain Point | Urgency | Avg. Income    | Spending Habits | Solution Budget |
|                       | Intensity  | (1-10)  | (USD/year)     |                 | (USD/year)      |
+-----------------------+------------+---------+----------------+-----------------+-----------------+
| Small Business Owners | High       | 8       | 70,000-120,000 | Cost-conscious  | 1,000-5,000     |
+-----------------------+------------+---------+----------------+-----------------+-----------------+
| Academic Researchers  | Medium     | 6       | 50,000-90,000  | Grant-dependent | 500-2,000       |
+-----------------------+------------+---------+----------------+-----------------+-----------------+
| Journalists           | High       | 9       | 45,000-75,000  | Budget-limited  | 1,000-3,000     |
+-----------------------+------------+---------+----------------+-----------------+-----------------+
| Financial Analysts    | Very High  | 10      | 80,000-150,000 | Tech-investing  | 5,000-20,000    |
+-----------------------+------------+---------+----------------+-----------------+-----------------+

Claude even offered some insights into the data it put together…

Here are some key observations from this analysis:

Financial analysts show the highest urgency and are willing to allocate larger budgets for web scraping solutions, likely due to the direct impact on their work and potential for high ROI.

Small business owners have high urgency but more limited budgets, reflecting their need for cost-effective solutions.

Academic researchers have lower urgency and smaller budgets, often constrained by grant funding.

Step 3: Features That Hit the Spot

So, we’ve done our homework. We know our users, their problems, and their wallets. This lets us deduce which type of user we want to build our product for.

Since financial analysts have the highest pain-point intensity, solution urgency, and solution budget, they’re the most promising group to cater to first.

Now comes the fun part: deciding what our web scraper will actually do. It’s time to build our feature set, and AI is going to help us make it sing.

Let’s start by asking Claude to generate some feature ideas for financial analysts:

Based on the problems we've identified for financial analysts, what features should our web scraper have? 

Prioritize ones that are relatively easy to build but offer high value.

For each feature, rate its ease of implementation on a scale of 1-10.

Rate its potential value to users on a scale of 1-10.

Tabulate the data.

Claude might come back with a list like this:

+--------------------------------------+---------------------+------------+
| Feature                              | Ease of Development | User Value |
|                                      | (1-10)              | (1-10)     |
+--------------------------------------+---------------------+------------+
| Real-time stock price scraping       | 7                   | 10         |
+--------------------------------------+---------------------+------------+
| Financial statement data extraction  | 6                   | 9          |
+--------------------------------------+---------------------+------------+
| News sentiment analysis              | 5                   | 7          |
+--------------------------------------+---------------------+------------+
| Customizable data export (CSV, JSON) | 8                   | 7          |
+--------------------------------------+---------------------+------------+
| Automated scraping scheduling        | 7                   | 8          |
+--------------------------------------+---------------------+------------+
| Basic data visualization             | 6                   | 7          |
+--------------------------------------+---------------------+------------+
| Multiple source aggregation          | 5                   | 9          |
+--------------------------------------+---------------------+------------+
| API integration                      | 6                   | 8          |
+--------------------------------------+---------------------+------------+
| Alert system for significant changes | 7                   | 8          |
+--------------------------------------+---------------------+------------+
| Historical data archiving            | 8                   | 7          |
+--------------------------------------+---------------------+------------+
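To get a feel for the easier end of that list, here’s what the customizable data export feature might look like as a sketch, assuming scraped results are already held as a list of dicts. The `export_rows` helper and the sample tickers are hypothetical:

```python
import csv
import io
import json

def export_rows(rows, fmt="csv"):
    """Serialize scraped rows to CSV or JSON."""
    if fmt == "json":
        return json.dumps(rows, indent=2)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

rows = [
    {"ticker": "AAPL", "price": 227.5},
    {"ticker": "MSFT", "price": 414.2},
]
print(export_rows(rows, "json"))
print(export_rows(rows, "csv"))
```

Starting with stdlib formats keeps the feature cheap to build, which is exactly why it scored an 8 on ease of development.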

Let’s see if we can narrow down even further and find a single feature to focus on.

Given our target demographic (financial analyst) and their budgets, which single feature should we prioritize for our MVP?

Claude’s response:

Given this information and our feature analysis, the single feature we should prioritize for our MVP is Real-time stock price scraping.

Here’s why:

— Highest user value (10/10): Real-time stock prices are crucial for financial analysts making timely investment decisions.

— Relatively easy to implement (7/10): While not the easiest feature, it’s still quite feasible to implement effectively.

— Aligns with budget and urgency: Financial analysts are willing to pay for tools that provide immediate, actionable data. Real-time stock prices directly address their high urgency need.

— Foundational feature: This can serve as a solid base for future features. Once we have real-time stock price scraping, we can build other valuable features on top of it (e.g., alerts, visualization, historical archiving).

— Clear value proposition: It’s easy for potential customers to understand the benefit, which can help with initial adoption and sales.

— Differentiation: While stock prices are available on many platforms, a dedicated scraping tool could offer more flexibility, customization, and integration possibilities.

This helps us focus on the feature that will give us the biggest bang for our buck right out of the gate.
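As a sketch of what that MVP core might look like, here’s a polling loop with the actual fetching stubbed out. In a real version, `fetch` would issue an HTTP request and parse a price out of the response; here it’s just a dictionary lookup so the sketch runs standalone:

```python
import time

def poll_prices(fetch, tickers, interval=5.0, rounds=3):
    """Repeatedly pull the latest quotes for a list of tickers."""
    for _ in range(rounds):
        yield {t: fetch(t) for t in tickers}
        time.sleep(interval)   # in production: respect the site's rate limits

# Stubbed price source so the example runs without touching a live site.
quotes = {"AAPL": 227.5, "MSFT": 414.2}

snapshots = list(poll_prices(quotes.get, ["AAPL", "MSFT"], interval=0, rounds=2))
print(snapshots[0])
```

Injecting `fetch` as a parameter keeps the scraping logic testable and makes it easy to swap data sources later, which matters once we layer alerts or archiving on top.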

Remember, we’re not just blindly following AI suggestions. We’re using AI to generate ideas and provide analysis, but we’re applying our human judgment to make the final calls. We’re not trying to boil the ocean here. We’re aiming for a feature set that will solve real problems for our users, set us apart from the competition, and give us room to grow.

Step 4: Taking It Even Further

We’ve done the legwork. We’ve crunched the numbers. We’ve picked our features. Now it’s time for the grand finale: creating a feature set that speaks to multiple demographics and spotting those unexpected synergies. Let’s see how AI can help us bring it all home.

First, let’s ask Claude to summarize our work:

Can you create a table that shows our top features, which demographics they appeal to, and their ease/value ratings?

+--------------------------------------+---------------------------------------------+--------------------------+------------------------+------------+
| Feature                              | Primary Demographics                        | Secondary Demographics   | Ease of Implementation | User Value |
|                                      |                                             |                          | (1-10)                 | (1-10)     |
+--------------------------------------+---------------------------------------------+--------------------------+------------------------+------------+
| Real-time stock price scraping       | Financial Analysts                          | E-commerce Entrepreneurs | 7                      | 10         |
+--------------------------------------+---------------------------------------------+--------------------------+------------------------+------------+
| Financial statement data extraction  | Financial Analysts, Small Business Owners   | Journalists              | 6                      | 9          |
+--------------------------------------+---------------------------------------------+--------------------------+------------------------+------------+
| News sentiment analysis              | Journalists, Financial Analysts             | Digital Marketers        | 5                      | 8          |
+--------------------------------------+---------------------------------------------+--------------------------+------------------------+------------+
| Customizable data export (CSV, JSON) | Academic Researchers, Data Scientists       | All demographics         | 8                      | 7          |
+--------------------------------------+---------------------------------------------+--------------------------+------------------------+------------+
| Automated scraping scheduling        | Digital Marketers, E-commerce Entrepreneurs | All demographics         | 7                      | 8          |
+--------------------------------------+---------------------------------------------+--------------------------+------------------------+------------+
| Basic data visualization             | Journalists, Small Business Owners          | Academic Researchers     | 6                      | 7          |
+--------------------------------------+---------------------------------------------+--------------------------+------------------------+------------+
| Multiple source aggregation          | Journalists, Academic Researchers           | Financial Analysts       | 5                      | 9          |
+--------------------------------------+---------------------------------------------+--------------------------+------------------------+------------+

Now, let’s dig for gold. We could ask:

Based on this table, can you identify any unexpected synergies or ways these features could work together to provide extra value?

Claude’s response:

+------------------------------------------------------------+----------------------------------------------------------------------------------+-------------------------------------------------+-----------------+
| Feature Combination                                        | Synergy Description                                                              | Target Demographics                             | Potential Value |
|                                                            |                                                                                  |                                                 | (1-10)          |
+------------------------------------------------------------+----------------------------------------------------------------------------------+-------------------------------------------------+-----------------+
| Real-time stock prices + News sentiment analysis           | Correlate price movements with news sentiment for predictive insights            | Financial Analysts, Traders                     | 9               |
+------------------------------------------------------------+----------------------------------------------------------------------------------+-------------------------------------------------+-----------------+
| Financial statement extraction + Historical data archiving | Track company financial health over time for trend analysis                      | Financial Analysts, Academic Researchers        | 8               |
+------------------------------------------------------------+----------------------------------------------------------------------------------+-------------------------------------------------+-----------------+
| Product pricing monitoring + Alert system                  | Notify e-commerce businesses of competitor price changes in real-time            | E-commerce Entrepreneurs, Small Business Owners | 9               |
+------------------------------------------------------------+----------------------------------------------------------------------------------+-------------------------------------------------+-----------------+
| API integration + Automated scraping scheduling            | Allow seamless integration of scraped data into existing workflows and tools     | All demographics, especially tech-savvy users   | 9               |
+------------------------------------------------------------+----------------------------------------------------------------------------------+-------------------------------------------------+-----------------+
| Real-time stock prices + Financial statement extraction    | Offer a comprehensive view of a company’s financial health and market perception | Financial Analysts, Investors                   | 10              |
+------------------------------------------------------------+----------------------------------------------------------------------------------+-------------------------------------------------+-----------------+

You can play around with this as much as you like. There are many, many possibilities to explore, and with enough thought put into your project, you might develop something no one has come up with before!

Just keep in mind that LLMs hallucinate regularly, so always take these responses with a grain of salt and do your own research.

Prompts to Keep You Going…

User Experience and Interface Design

1. How can we simplify our product’s onboarding process to reduce the learning curve for new users?
2. What accessibility features could we implement to make our product more inclusive?
3. How might we incorporate voice commands or natural language processing to enhance user interaction?
4. In what ways can we use data visualization to make complex information more digestible for users?
5. How can we design our interface to be both powerful for advanced users and intuitive for beginners?

Feature Development and Innovation

1. What emerging technologies could we integrate into our product to give it a competitive edge?
2. How can we make our product more scalable to handle increasing amounts of data or users?
3. What features could we add to make our product more collaborative and team-friendly?
4. How might we use machine learning to enhance our product’s capabilities over time?
5. What cross-platform capabilities should we prioritize to increase our product’s versatility?

Marketing and Branding

1. How can we leverage user-generated content to enhance our marketing efforts?
2. What kind of strategic partnerships could amplify our product’s reach and credibility?
3. How might we use storytelling in our marketing to better connect with our target demographics?
4. What unique selling proposition should we focus on to differentiate ourselves in the market?
5. How can we use social proof and case studies to build trust with potential customers?

Monetization and Pricing Strategy

1. What alternative revenue models could we explore beyond traditional subscriptions?
2. How can we structure our pricing tiers to encourage upgrades and reduce churn?
3. What kind of loyalty program could we implement to increase customer lifetime value?
4. How might we use dynamic pricing to optimize our revenue while remaining fair to users?
5. What complementary products or services could we offer to increase our average revenue per user?

Long-term Strategy and Growth

1. How can we future-proof our product against potential changes in technology or regulations?
2. What adjacent markets or use cases could we expand into with our current product as a foundation?
3. How might we build a developer ecosystem around our product to encourage third-party innovations?
4. What strategies could we employ to enter international markets effectively?
5. How can we use data analytics to predict and adapt to evolving user needs?

Customer Support and Community Building

1. How can we empower our users to help each other, reducing the load on our support team?
2. What kind of educational content could we create to help users get the most out of our product?
3. How might we gamify the process of providing feedback to encourage more user input?
4. What strategies could we use to turn our most engaged users into brand ambassadors?
5. How can we use community events or challenges to increase user engagement and loyalty?

Product Integration and Ecosystem

1. What APIs or integration points should we prioritize to make our product more valuable to users?
2. How can we create a marketplace for user-created add-ons or extensions to our product?
3. What strategic acquisitions or partnerships could enhance our product’s capabilities?
4. How might we use open-source components to accelerate development and build goodwill?
5. What IoT (Internet of Things) integrations could make our product more relevant in a connected world?

Ethical Considerations and Social Responsibility

1. How can we ensure our product is being used ethically and responsibly by our customers?
2. What steps can we take to minimize our product’s environmental impact?
3. How might we use our product or company resources to contribute positively to society?
4. What safeguards should we implement to protect user privacy and data security?
5. How can we make our product development process more inclusive and diverse?

Remember, we’re not just building a list of features — we’re crafting a cohesive product. Each piece should fit together and contribute to the overall value.

Take a moment to review what we’ve created. Does it feel like a product you’d be excited to use? Can you see how it solves real problems for our target users? If so, congratulations! You’ve just used AI to guide you through the product development process.

Conclusion

Let’s take a quick lap around what we’ve accomplished:

  1. We sketched out a big-picture view of our product
  2. We profiled our target demographics and dug into their problems
  3. We dreamed up features for our demographics and prioritized them
  4. And finally, it’s time to develop our product!

Thanks for reading!


Aalap Davjekar

Technical writer and web developer based in Goa, India. Passionate about working at the intersection of art and technology.