Introducing Deep Research in Azure AI Foundry Agent Service


Announcing the public preview of Deep Research in Azure AI Foundry: an API and SDK-based offering of OpenAI's advanced agentic research capability.

Unlock enterprise-scale web research automation

Today we're excited to announce the public preview of Deep Research in Azure AI Foundry: an API and software development kit (SDK)-based offering of OpenAI's advanced agentic research capability, fully integrated with Azure's enterprise-grade agent platform.

With Deep Research, developers can build agents that deeply plan, analyze, and synthesize information from across the web. They can automate complex research tasks, generate transparent, auditable outputs, and seamlessly compose multi-step workflows with other tools and agents in Azure AI Foundry.

AI agents and knowledge work: Meeting the next frontier of research automation

Generative AI and large language models have made research and analysis faster than ever, powering solutions like ChatGPT Deep Research and Researcher in Microsoft 365 Copilot for individuals and teams. These tools are transforming everyday productivity and document workflows for millions of users.

As organizations look to take the next step of integrating deep research directly into their business apps, automating multi-step processes, and governing knowledge at enterprise scale, the need for programmable, composable, and auditable research automation becomes clear.

This is where Azure AI Foundry and Deep Research come in: offering the flexibility to embed, extend, and orchestrate world-class research as a service across your entire enterprise ecosystem, and to connect it with your data and your systems.

Deep Research capabilities in Azure AI Foundry Agent Service

Deep Research in Foundry Agent Service is built for developers who want to move beyond the chat window. By offering Deep Research as a composable agent tool via API and SDK, Azure AI Foundry enables customers to:

  • Automate web-scale research using a best-in-class research model grounded with Bing Search, with every insight traceable and source-backed.
  • Programmatically build agents that can be invoked by apps, workflows, or other agents, turning deep research into a reusable, production-ready service.
  • Orchestrate complex workflows: compose Deep Research agents with Logic Apps, Azure Functions, and other Foundry Agent Service connectors to automate reporting, notifications, and more.
  • Ensure enterprise governance: with Azure AI Foundry's security, compliance, and observability, customers get full control and transparency over how research is run and used.
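The "reusable, production-ready service" idea is easiest to see in code. The sketch below is illustrative only: the class and method names are assumptions standing in for the real Azure AI Foundry SDK, and a stub agent returns canned data so the calling pattern is runnable without credentials. The point is the shape of the integration, where any app, workflow, or other agent calls research through one entry point.

```python
from dataclasses import dataclass


@dataclass
class ResearchResult:
    """Source-backed output: every insight carries its citations."""
    summary: str
    citations: list[str]


class StubDeepResearchAgent:
    """Placeholder for a Foundry-hosted Deep Research agent.

    In production this would be an agent created through the Azure AI
    Foundry SDK and backed by o3-deep-research; here it returns canned
    data so the calling pattern can run locally.
    """

    def run(self, query: str) -> ResearchResult:
        return ResearchResult(
            summary=f"Findings for: {query}",
            citations=["https://example.com/report"],
        )


def research_service(agent, query: str) -> ResearchResult:
    # Single entry point: apps, Logic Apps workflows, or other agents
    # can all invoke deep research through this one call.
    return agent.run(query)


result = research_service(StubDeepResearchAgent(), "competitor pricing trends")
print(result.summary)
```

Because the agent is just another callable service, swapping the stub for a real Foundry agent changes the wiring, not the callers.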

Unlike packaged chat assistants, Deep Research in Foundry Agent Service can evolve with your needs: it is ready for automation, extensibility, and integration with future internal data sources as we expand support.

How it works: Architecture and agent flow

Deep Research in Foundry Agent Service is architected for flexibility, transparency, and composability, so you can automate research that is as robust as your business demands.

At its core, the Deep Research model, o3-deep-research, orchestrates a multi-step research pipeline that is tightly integrated with Grounding with Bing Search and leverages the latest OpenAI models:

  1. Clarifying intent and scoping the task:
    When a user or downstream app submits a research query, the agent uses GPT-series models, including GPT-4o and GPT-4.1, to clarify the question, gather additional context if needed, and precisely scope the research task. This ensures the agent's output is both relevant and actionable, and that each search is optimized for your business scenario.
  2. Web grounding with Bing Search:
    Once the task is scoped, the agent securely invokes the Grounding with Bing Search tool to gather a curated set of high-quality, recent web data. This ensures the research model is working from a foundation of authoritative, up-to-date sources, rather than hallucinating from stale or irrelevant content.
  3. Deep Research task execution:
    The o3-deep-research model starts the research task execution. This involves thinking, analyzing, and synthesizing information across all discovered sources. Unlike simple summarization, it reasons step by step, pivots as it encounters new insights, and composes a comprehensive answer that is sensitive to nuance, ambiguity, and emerging patterns in the data.
  4. Transparency, safety, and compliance:
    The final output is a structured report that documents not only the answer, but also the model's reasoning path, source citations, and any clarifications requested during the session. This makes every answer fully auditable, a must-have for regulated industries and high-stakes use cases.
  5. Programmatic integration and composition:
    By exposing Deep Research as an API, Azure AI Foundry empowers you to invoke research from anywhere: custom business apps, internal portals, workflow automation tools, or as part of a larger agent ecosystem. For example, you can trigger a research agent as part of a multi-agent chain: one agent performs deep web analysis, another generates a slide deck with Azure Functions, while a third emails the result to decision makers with Azure Logic Apps. This composability is the real game-changer: research is no longer a manual, one-off task, but a building block for digital transformation and continuous intelligence.

This flexible architecture means Deep Research can be seamlessly embedded into a wide range of enterprise workflows and applications. Organizations across industries are already evaluating how these programmable research agents can streamline high-value scenarios, from market research and competitive intelligence to large-scale analytics and regulatory reporting.

Pricing for Deep Research (model: o3-deep-research) is as follows:

  • Input: $10.00 per 1M tokens.
  • Cached Input: $2.50 per 1M tokens.
  • Output: $40.00 per 1M tokens.

Search context tokens are charged at the input token price for the model being used. You will separately incur charges for Grounding with Bing Search and for the base GPT model used for clarifying questions.
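A quick way to reason about these rates is a token-cost estimate. The helper below applies only the o3-deep-research per-token prices listed above; the example token counts are made up, and Bing grounding plus the clarifying GPT model are billed separately and not included.

```python
# Published o3-deep-research rates, in USD per 1M tokens.
RATES_PER_1M = {"input": 10.00, "cached_input": 2.50, "output": 40.00}


def estimate_cost(input_tokens: int, output_tokens: int,
                  cached_input_tokens: int = 0) -> float:
    """Estimate o3-deep-research token charges in USD.

    Search context tokens are billed at the input rate, so they can
    simply be counted in input_tokens.
    """
    cost = (
        input_tokens * RATES_PER_1M["input"]
        + cached_input_tokens * RATES_PER_1M["cached_input"]
        + output_tokens * RATES_PER_1M["output"]
    ) / 1_000_000
    return round(cost, 4)


# A hypothetical run: 250K fresh input tokens, 50K cached, 40K output.
print(estimate_cost(250_000, 40_000, cached_input_tokens=50_000))  # → 4.225
```

Note how the output rate dominates: at $40 per 1M tokens, a long synthesized report costs four times as much per token as the fresh input that grounded it.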

Get started with Deep Research

Deep Research is available now in limited public preview for Azure AI Foundry Agent Service customers.

We can't wait to see the innovative solutions you'll build. Stay tuned for customer stories, new features, and future enhancements that will continue to unlock the next generation of enterprise AI agents.
