Test Generation

Complete reference for the 10 MCP tools that generate test cases from code changes, tickets, designs, specs, videos, and natural language requirements.


Who is this for? SDETs, developers, and DevOps engineers integrating ContextQA with AI coding assistants (Claude, Cursor) or CI/CD pipelines.

These 10 tools are the fastest path from any source artifact to a runnable ContextQA test case. Each accepts a different input format and returns fully structured test cases ready to execute.


generate_tests_from_code_change

Generates targeted test cases by analyzing a git diff or pull request description. The tool identifies which user-facing flows are affected by the code change and creates regression tests for them.

Category: Test Generation
Authentication required: Yes

Parameters

  • diff_text (string): The raw git diff output or a PR description

  • app_url (string): Base URL of the application being changed

  • name_prefix (string): Prefix to add to generated test case names (e.g., PR-1234_)

Returns

JSON object with:

  • test_cases_created — number of test cases generated

  • changed_files — list of files identified in the diff

  • test_cases — array of created test case details (IDs, names, step counts)

Workflow
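
A minimal sketch of the workflow, assuming the diff is captured in CI and the resulting `arguments` dict is passed through whatever MCP client your assistant or pipeline provides. The diff content and staging URL are illustrative; the parameter names come from this reference.

```python
# Assemble arguments for generate_tests_from_code_change.
# In CI you would capture the diff with `git diff origin/main...HEAD`;
# here it is inlined for illustration.
diff_text = """\
diff --git a/src/checkout.js b/src/checkout.js
--- a/src/checkout.js
+++ b/src/checkout.js
@@ -10,7 +10,7 @@
-  const tax = subtotal * 0.08;
+  const tax = subtotal * taxRateFor(region);
"""

arguments = {
    "diff_text": diff_text,
    "app_url": "https://staging.example.com",  # base URL of the changed app (illustrative)
    "name_prefix": "PR-1234_",                 # groups generated tests by PR
}
```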

Tips

  • Works best with focused diffs (one feature area per run).

  • For large PRs with 50+ changed files, split into smaller diff segments for more targeted tests.

Related: analyze_test_impact, create_test_case, generate_edge_cases


generate_tests_from_jira_ticket

Reads a Jira or Azure DevOps ticket — including its description, acceptance criteria, and comments — and generates corresponding test cases.

Category: Test Generation
Authentication required: Yes

Parameters

  • ticket_id (string): Ticket identifier (e.g., APP-1234, CQA-567)

  • include_acceptance_criteria (boolean): Whether to parse acceptance criteria into separate test scenarios (default: true)

Returns

JSON object with generated test scenarios including IDs and step previews.

Notes

  • The integration must be configured in ContextQA Settings → Integrations → Product Management before this tool can read ticket content.

  • Each acceptance criterion becomes its own test case.
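
A sketch of the arguments this tool expects, using the ticket ID from the parameter example above; the dict is handed to whatever MCP client your assistant or pipeline uses.

```python
# Arguments for generate_tests_from_jira_ticket; one test case is
# created per acceptance criterion because the flag is left true.
arguments = {
    "ticket_id": "APP-1234",
    "include_acceptance_criteria": True,  # default: true
}
```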

Related: generate_tests_from_linear_ticket, create_defect_ticket


generate_tests_from_linear_ticket

Creates test cases from a Linear issue. Accepts ticket fields directly — fetch the issue from the Linear MCP first, then pass the data here.

Category: Test Generation
Authentication required: Yes

Parameters

  • ticket_id (string): Linear issue identifier (e.g., ENG-789)

  • title (string): Issue title

  • description (string): Full issue description

  • app_url (string): URL of the application to test

  • steps_to_reproduce (string): Steps to reproduce (for bug tickets)

  • expected_behavior (string): Expected outcome

  • actual_behavior (string): Actual outcome (what's wrong)

Returns

JSON object with created test case details.

Workflow with Linear MCP
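
A sketch of the hand-off, assuming the issue was already fetched through the Linear MCP. The `linear_issue` dict is a stand-in for that response, and its field values and the app URL are invented for illustration.

```python
# Map fields from a fetched Linear issue onto this tool's parameters.
linear_issue = {  # stand-in for a Linear MCP response (illustrative values)
    "identifier": "ENG-789",
    "title": "Checkout total ignores regional tax",
    "description": "Tax is hard-coded to 8% regardless of region.",
}

arguments = {
    "ticket_id": linear_issue["identifier"],
    "title": linear_issue["title"],
    "description": linear_issue["description"],
    "app_url": "https://staging.example.com",  # illustrative
}
```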

Related: generate_tests_from_jira_ticket, reproduce_from_ticket


generate_tests_from_figma

Analyzes a Figma design file to extract UI flows and generate corresponding test cases. The AI examines screen designs, interactive components, and flow connections.

Category: Test Generation
Authentication required: Yes

Parameters

  • figma_url (string): Figma file or frame URL (must be publicly accessible or shared via link)

Returns

JSON object with generated test scenarios derived from the design.

Tips

  • Share the specific frame or flow you want tested, not the entire file, to get the most focused results.

  • Works best with annotated designs that include interaction notes.

  • Generated tests reflect the intended design — run them against staging to verify the implementation matches the design.

Related: generate_tests_from_requirements, create_test_case


generate_tests_from_requirements

Converts a block of plain-text requirements into automated test scenarios. Suitable for PRDs, feature specs, or user story documents.

Category: Test Generation
Authentication required: Yes

Parameters

  • requirements_text (string): Raw requirements text (paste the document content directly)

Returns

JSON object with generated test scenarios mapped to requirement sections.

Example
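
An illustrative input, assuming a short password-reset spec; the wording below is an invented sample, not taken from a real PRD.

```python
# Sample requirements_text for generate_tests_from_requirements.
requirements_text = """\
Forgot password:
1. The login page shows a "Forgot password?" link.
2. Submitting a registered email sends a password reset email.
3. The reset link expires after a fixed period.
4. New passwords must satisfy the complexity rules.
"""

arguments = {"requirements_text": requirements_text}
```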

For a password-reset requirement like this, the tool generates separate test cases for the forgot-password link, reset-email delivery, link expiry behavior, and password complexity validation.

Related: generate_tests_from_excel, generate_tests_from_figma


generate_tests_from_excel

Parses an Excel or CSV file containing manual test cases and converts them into automated ContextQA tests. Useful for migrating existing manual test libraries.

Category: Test Generation
Authentication required: Yes

Parameters

  • file_path (string): Absolute local path to the .xlsx or .csv file

  • sheet_name (string): Specific sheet to parse (default: first sheet)

Returns

JSON object with generated test cases matched to spreadsheet rows.

Expected spreadsheet format

The tool recognizes common test case template formats. For best results, include columns:

  • Test Case Name or Title

  • Steps or Test Steps

  • Expected Result

  • URL (optional)
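
A minimal CSV in that format can be produced like this; the file name and row content are illustrative.

```python
import csv

# Write one manual test case using the recognized column names.
columns = ["Test Case Name", "Steps", "Expected Result", "URL"]
rows = [{
    "Test Case Name": "Login with valid credentials",
    "Steps": "1. Open login page 2. Enter email and password 3. Click Sign in",
    "Expected Result": "Dashboard is displayed",
    "URL": "https://staging.example.com/login",  # optional column
}]

with open("manual_tests.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=columns)
    writer.writeheader()
    writer.writerows(rows)
```

Pass the absolute path of the resulting file as file_path.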

Related: generate_tests_from_requirements, migrate_repo_to_contextqa


generate_tests_from_swagger

Ingests an OpenAPI/Swagger specification and generates comprehensive API contract and coverage tests.

Category: Test Generation
Authentication required: Yes

Parameters

  • file_path_or_url (string): Local file path or URL to the OpenAPI spec (JSON or YAML)

Returns

JSON object with generated API test cases covering endpoints, methods, and response schemas.

Coverage

For each endpoint discovered, the tool generates:

  • Happy path — valid request with expected 2xx response

  • Authentication failure — missing or invalid token → 401

  • Validation errors — missing required fields → 400/422

  • Not found — requests with non-existent resource IDs → 404

Example
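
Applying the coverage rules above to a small spec fragment; the endpoint and naming scheme are illustrative, and this mirrors the documented mapping rather than the tool's internal logic.

```python
# Expected coverage for one endpoint exposing two methods.
spec_paths = {"/users/{id}": ["get", "delete"]}

scenarios = [
    ("happy path", "2xx"),
    ("auth failure", "401"),
    ("validation error", "400/422"),
    ("not found", "404"),
]

expected_tests = [
    f"{method.upper()} {path}: {name} -> {status}"
    for path, methods in spec_paths.items()
    for method in methods
    for name, status in scenarios
]
# 2 methods x 4 scenarios = 8 test cases for this fragment
```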

Related: generate_tests_from_requirements, execute_test_suite


generate_tests_from_video

Analyzes a screen recording (.mp4, .webm) of a user performing actions in the application, and converts the observed user journey into an automated test.

Category: Test Generation
Authentication required: Yes

Parameters

  • video_file_path (string): Absolute local path to the video file

  • extract_transcripts (boolean): Whether to use audio transcription to extract additional context (default: false)

Returns

JSON object with generated test cases derived from the video analysis.

Tips

  • Record at a standard browser resolution (1280×800 or 1920×1080) for best OCR accuracy.

  • Keep recordings under 10 minutes; longer recordings may produce overly broad test cases.

  • Enable extract_transcripts: true if your recording includes narration describing test intent.
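
A sketch of the arguments for a narrated recording; the file path is illustrative.

```python
# Arguments for generate_tests_from_video with transcription enabled.
arguments = {
    "video_file_path": "/recordings/checkout-flow.mp4",  # absolute local path (illustrative)
    "extract_transcripts": True,  # recording includes spoken test intent
}
```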

Related: generate_tests_from_requirements, create_test_case


generate_tests_from_analytics_gap

Converts a high-traffic, untested user flow identified by analyze_coverage_gaps into an automated test case.

Category: Test Generation / Analytics & Coverage
Authentication required: Yes

Parameters

  • flow_event_sequence (array): Ordered list of analytics event names representing the user flow (from analyze_coverage_gaps output)

Returns

JSON object with the generated test case that covers the identified gap.

Workflow
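
A sketch of the chain, assuming analyze_coverage_gaps returned a list shaped like `coverage_gaps` below; the field names, event names, and traffic figures are invented for illustration.

```python
# Pick the highest-traffic untested flow and feed it to this tool.
coverage_gaps = [  # stand-in for analyze_coverage_gaps output (illustrative)
    {"flow_event_sequence": ["view_pricing", "start_trial", "enter_card"],
     "monthly_sessions": 12000},
    {"flow_event_sequence": ["open_docs", "search_docs"],
     "monthly_sessions": 4000},
]

top_gap = max(coverage_gaps, key=lambda g: g["monthly_sessions"])
arguments = {"flow_event_sequence": top_gap["flow_event_sequence"]}
```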

Related: analyze_coverage_gaps, create_test_case


generate_edge_cases

Generates boundary and negative test scenarios for a given feature or component using AI inference. Produces test cases that typical happy-path test generation misses.

Category: Test Generation
Authentication required: Yes

Parameters

  • context_query (string): Description of the feature or component to generate edge cases for

Returns

JSON object with edge case scenarios including:

  • Boundary value tests (min/max inputs)

  • Invalid data formats

  • Concurrent access scenarios

  • Session timeout handling

  • Error recovery paths

Example
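
An illustrative context_query of the kind that yields output like the sample described below; the field list is invented to match that sample.

```python
# A context_query describing a registration form with email,
# password, and phone fields.
arguments = {
    "context_query": (
        "User registration form with email, password "
        "(minimum 8 characters), and phone number fields"
    )
}
```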

Generated edge cases include: duplicate email registration, password exactly at 8 characters, password at 7 characters (should reject), phone number with country code, special characters in email local part, etc.

Related: generate_tests_from_requirements, create_test_case


generate_contextqa_tests_from_n8n

Generates ContextQA test cases from an n8n workflow. Tests the happy path through the workflow, triggering it and validating each node's execution result.

Category: Test Generation
Authentication required: Yes

Parameters

  • file_path_or_url (string): Local path to an n8n workflow JSON export, a direct JSON URL, or an n8n Cloud workflow page URL (requires N8N_API_KEY environment variable)

  • app_url (string): Base URL of the application the workflow interacts with

Returns

JSON object with status and an array of created test cases, one per workflow path.

n8n API key configuration

For n8n Cloud URLs, set the N8N_API_KEY environment variable on the MCP server before starting:
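
For example, in the shell that launches the server (the placeholder value is yours to supply):

```shell
# Make the key visible to the MCP server process
export N8N_API_KEY="<your-n8n-api-key>"
```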

Related: create_test_case, execute_test_suite
