In this quickstart, you’ll use Intuned Agent to build a scraper that extracts job postings from Apple’s career page. By the end, you’ll have a working scraper that handles pagination and extracts structured data—all generated from a simple prompt.

What is Intuned Agent?

Intuned Agent is an AI agent that builds, edits, and maintains browser automation Projects. It runs inside the Intuned platform with a real browser and full access to the platform via the Intuned CLI, and it works in the background for as long as needed. Learn more about Intuned Agent’s full capabilities in the overview.

Prerequisites

  • An active Intuned account (sign up here). No credit card required—Intuned has a free plan.

What you’ll build

You’ll instruct Intuned Agent to build a scraper for Apple’s career page. The scraper will:
  • Extract job postings with a defined JSON schema
  • Paginate through the full job list
  • Capture detailed information for each posting

Create your first scraper

Enter your prompt

  1. Go to app.intuned.io/agent.
  2. Paste the prompt below into the input and send it.
Prompt
I want to scrape job postings from [Apple career page](https://jobs.apple.com/en-us/search?location=united-states-USA)

FILTER: No need to apply any filters

For each job, I need:

job_title: string
product_and_service: string
post_date: string (iso format)
description: string
summary: string
apply_url: string
role_number: string
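To make the schema concrete, here is what one extracted record could look like. The field values below are illustrative placeholders, not real Apple data, and the truncated URL is a stand-in:

```python
from datetime import datetime

# Illustrative record matching the requested schema (all values are made up).
job = {
    "job_title": "Software Engineer, Cloud Services",
    "product_and_service": "Apple Services",
    "post_date": "2024-05-01T00:00:00Z",
    "description": "Design and build backend services for a large-scale platform.",
    "summary": "Backend engineering role on the Services team.",
    "apply_url": "https://jobs.apple.com/en-us/details/200123456",
    "role_number": "200123456",
}

# Every field is a string, and post_date parses as ISO 8601.
assert all(isinstance(v, str) for v in job.values())
datetime.fromisoformat(job["post_date"].replace("Z", "+00:00"))
```

Spelling the schema out this precisely in the prompt (names, types, date format) is what lets the agent validate its own output during the build.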
[Screenshot: Intuned Agent session starting]
Expected result: The agent starts a session and begins exploring the target website.

Answer the agent's questions

As the agent explores the site, it asks clarifying questions to shape the scraper—such as which language to use, how to handle pagination, and what to name the project. Answer each question as it appears.
[Screenshot: Agent asking which language to use]
When asked about pagination, Configurable max pages is the recommended option—it gives you control at runtime without rebuilding the scraper.

Approve the plan

Once the agent has finished exploring, it presents a full plan: the start URL, entity structure, navigation instructions, and schema. Review it and select Approve Plan to proceed, or Suggest Changes to adjust.
[Screenshot: Agent plan ready for approval]
Expected result: The agent begins building your scraper.

Wait for the agent to build

The agent writes the code, runs end-to-end tests, and validates the output. This typically takes 30–60 minutes.
[Screenshot: Agent building and running E2E tests]

Review results and deploy

When the agent finishes, it shows a work summary with a description of what was built and the E2E test results. Select View Code to inspect the generated code, or Merge Branch to deploy.
[Screenshot: Agent work summary]
Select View Code and open the Artifacts tab to inspect the actual scraped data from the agent’s test run before deploying.
[Screenshot: Artifacts panel showing scraped results]
When you’re satisfied, select Merge Branch. In the dialog, choose Merge & Deploy to create an Intuned project and deploy it immediately, or Merge only to merge the code without deploying.
[Screenshot: Merge and deploy dialog]
Expected result: Your scraper is deployed and ready to run via API, scheduled Jobs, or direct triggers.
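As a rough sketch of what an API trigger might involve: the endpoint URL, header names, and payload shape below are assumptions for illustration only, not the documented Intuned API — consult the platform’s API reference for the real request format. The snippet only constructs the request; the actual HTTP call is left commented out:

```python
import json

# Hypothetical values -- substitute your real workspace, project, and API key.
WORKSPACE = "my-workspace"
PROJECT = "apple-jobs-scraper"
API_KEY = "YOUR_API_KEY"

# Assumed endpoint shape; the actual Intuned API may differ.
url = f"https://app.intuned.io/api/v1/workspace/{WORKSPACE}/projects/{PROJECT}/run"
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}
# If you chose "Configurable max pages" during the build, a parameter
# like this is where that runtime control would live (name is assumed).
payload = {"parameters": {"maxPages": 5}}

print(json.dumps(payload))
# Send with your HTTP client of choice, e.g.:
# requests.post(url, headers=headers, json=payload)
```

Scheduled Jobs cover the recurring case; direct API triggers like this are useful for on-demand runs from your own systems.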

(Optional) Iterate on your scraper

Start a new conversation and select Pick project to load your scraper. You can then ask the agent to add filters, change the schema, fix selectors, or adjust the scraping logic.
Example prompt:
Add a parameter to filter by product and service. Examples:
Apple Ads
AirPods
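Until the agent ships that parameter, you can also apply the same filter yourself on scraped output. A minimal sketch over made-up records, matching the `product_and_service` field from the schema:

```python
# Made-up records shaped like the scraper's output (only relevant fields shown).
jobs = [
    {"job_title": "AirPods Firmware Engineer", "product_and_service": "AirPods"},
    {"job_title": "Ads Data Scientist", "product_and_service": "Apple Ads"},
    {"job_title": "Maps QA Engineer", "product_and_service": "Maps"},
]

# Keep only the products and services you care about.
wanted = {"AirPods", "Apple Ads"}
filtered = [j for j in jobs if j["product_and_service"] in wanted]
assert len(filtered) == 2
```

Filtering in the scraper itself is still preferable for large job boards, since it avoids paginating through postings you will discard anyway.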

What’s next?

  • Intuned Agent — Learn more about Intuned Agent’s capabilities, including editing existing projects and fixing failed runs.
  • Jobs — Jobs are the common way to run scrapers. Configure a schedule (daily, hourly, or custom) and define a sink to send your scraper results to a webhook, S3 bucket, or other destination.
  • Authentication — For scrapers that require login, Intuned provides built-in authentication support. You define how to log in and how to verify a session, and Intuned handles the rest—validating sessions before runs, reusing them when possible, and recreating them when expired.
  • Monitoring and traces — Every run generates detailed logs, browser traces, and session recordings. Use these tools to debug failures, verify your scraper is working correctly, and understand what happened during execution.
  • Online IDE — Learn more about the Intuned IDE, which you can use to manually edit the scraper you just created.