Browse AI is an AI-powered web scraping and monitoring platform. It lets you extract structured data from websites and monitor them for changes without writing code, using configurable scraping robots and tasks.
🔗 Connect and Use Browse AI
1. 🔑 Connect your Browse AI account
2. ✅ Select an action
3. 🚀 Go live with the agent
What do you want to do?

API actions for Browse AI for AI assistants and agents


Scrape Website

Extracts data from a specified website URL using Browse AI's scraping capabilities.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.SCRAPE_WEBSITE])
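The snippet above only fetches the tool definition. As a rough end-to-end sketch (one possible wiring, not the only one), the tools list can be passed straight to an OpenAI chat completion and the resulting tool calls executed with the toolset. The model name, prompt, and the handle_tool_calls helper used here are assumptions for illustration, not taken from the snippet above.

# Minimal sketch: wire the Scrape Website tool into an OpenAI chat completion.
# Assumes COMPOSIO_API_KEY and OPENAI_API_KEY are set in the environment,
# and that the action name matches the snippet above.
from composio_openai import ComposioToolSet, Action
from openai import OpenAI

client = OpenAI()
tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.SCRAPE_WEBSITE])

response = client.chat.completions.create(
    model="gpt-4o",
    tools=tools,
    messages=[{"role": "user", "content": "Scrape the pricing table from https://example.com"}],
)

# Assumed helper: executes any tool calls the model requested and returns their results.
result = tool_set.handle_tool_calls(response)
print(result)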

Create Robot

Creates a new web scraping robot with specified parameters and scraping instructions.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.CREATE_ROBOT])

Update Robot

Modifies an existing web scraping robot's configuration or scraping instructions.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.UPDATE_ROBOT])

Delete Robot

Removes a specified web scraping robot from the Browse AI system.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.DELETE_ROBOT])

Run Robot

Executes a specific web scraping robot to collect data from its target website.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.RUN_ROBOT])

Get Robot Results

Retrieves the latest scraping results from a specified robot's most recent run.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.GET_ROBOT_RESULTS])
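Run Robot and Get Robot Results usually appear together in an agent workflow: the agent first triggers a run, then fetches its output. A minimal sketch, reusing the action names from the snippets above, is to request both tools in a single get_tools call so the model can chain them itself:

from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()

# Expose both actions so the agent can start a run and then read its results.
tools = tool_set.get_tools(actions=[
    Action.RUN_ROBOT,
    Action.GET_ROBOT_RESULTS,
])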

List Robots

Retrieves a list of all web scraping robots associated with the user's account.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.LIST_ROBOTS])

Get Robot Details

Fetches detailed information about a specific web scraping robot.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.GET_ROBOT_DETAILS])

Create Task

Sets up a new scraping task with specific parameters and scheduling options.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.CREATE_TASK])

Cancel Task

Stops and cancels an ongoing or scheduled scraping task.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.CANCEL_TASK])

Get Task Status

Checks the current status of a specific scraping task.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.GET_TASK_STATUS])

List Tasks

Retrieves a list of all scraping tasks associated with the user's account.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.LIST_TASKS])

Export Data

Exports scraped data in a specified format (e.g., CSV, JSON) for download or further processing.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.EXPORT_DATA])
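Once data has been exported as JSON, it can be post-processed locally without further API calls. A hypothetical sketch, assuming the export yields a JSON array of flat objects (the sample rows below are made up for illustration):

import csv
import json

# Made-up sample payload standing in for an Export Data response.
exported = '[{"title": "Item A", "price": "9.99"}, {"title": "Item B", "price": "19.99"}]'
rows = json.loads(exported)

# Write the rows to a local CSV file for further processing.
with open("export.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=sorted(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)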

Set Proxy

Configures a proxy server for use with web scraping robots to avoid IP blocks.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.SET_PROXY])

Create API Key

Generates a new API key for programmatic access to Browse AI services.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.CREATE_API_KEY])

Revoke API Key

Invalidates an existing API key for security purposes.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.REVOKE_API_KEY])

Set Rate Limit

Configures the rate limiting settings for a specific robot or all robots.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.SET_RATE_LIMIT])

Add Custom Header

Adds a custom HTTP header to be used in scraping requests for a specific robot.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.ADD_CUSTOM_HEADER])

Set User Agent

Specifies a custom User-Agent string for a robot's HTTP requests.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.SET_USER_AGENT])

Enable JavaScript

Activates JavaScript execution for a specific robot to scrape dynamic content.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.ENABLE_JAVASCRIPT])

New Data Available

Fires when new data is scraped and available from a specified robot.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.NEW_DATA_AVAILABLE])

Scraping Error Occurred

Triggers when a scraping operation encounters an error or fails to complete.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.SCRAPING_ERROR])

Robot Created

Activates when a new web scraping robot is successfully created in the system.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.ROBOT_CREATED])

Robot Updated

Fires when an existing web scraping robot's configuration is modified.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.ROBOT_UPDATED])

Robot Deleted

Triggers when a web scraping robot is removed from the system.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.ROBOT_DELETED])

Task Completed

Activates when a scraping task finishes its execution successfully.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.TASK_COMPLETED])

Rate Limit Reached

Fires when a robot or account reaches its configured rate limit for scraping.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.RATE_LIMIT_REACHED])

API Key Created

Triggers when a new API key is generated for the account.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.API_KEY_CREATED])

API Key Revoked

Activates when an existing API key is invalidated or revoked.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.API_KEY_REVOKED])

Quota Exceeded

Fires when the account's scraping quota or limit is exceeded.
from composio_openai import ComposioToolSet, Action

tool_set = ComposioToolSet()
tools = tool_set.get_tools(actions=[Action.QUOTA_EXCEEDED])

Frequently asked questions

What is Composio.dev?

Composio.dev is a cutting-edge framework for building AI applications, designed to make the process of developing AI solutions super easy and fun! It's a collection of powerful tools and libraries that simplify the process of creating AI applications, allowing you to focus on the creative aspects of your project without getting bogged down by the technical details.

How does Composio.dev support Browse AI?

Composio.dev seamlessly integrates with Browse AI, making it a breeze to leverage its capabilities within the Composio.dev platform. You can use Browse AI to call functions on various platforms like Google, GitHub, and others, allowing you to incorporate different services into your AI applications with ease. It also supports user login via OAuth2 and can work with other popular frameworks such as LangChain and CrewAI, giving you the flexibility to build truly innovative AI solutions.

What models can I use with Browse AI and openAI_python?

When using Browse AI and openAI_python, you have access to a wide range of state-of-the-art language models, including GPT-4o (OpenAI), GPT-3.5 (OpenAI), GPT-4 (OpenAI), Claude (Anthropic), PaLM (Google), LLaMA and LLaMA 2 (Meta), Gemini, and many others. This flexibility allows you to choose the model that best suits your specific use case, whether you're building a chatbot, a content creation tool, or any other AI-powered application. You can experiment with different models and find the one that delivers the best performance for your project.

How can I integrate Browse AI with openAI_python?

Integrating Browse AI with openAI_python is super easy with Composio.dev! You can use the Composio.dev API to call functions from both Browse AI and openAI_python, allowing you to tap into their capabilities with just a few lines of code. The SDK is available in Python, JavaScript, and TypeScript, so you can work with the language you're most comfortable with and integrate these powerful tools into your projects seamlessly.

What is the pricing for Browse AI and openAI_python?

Both Browse AI and openAI_python offer a generous free tier that allows up to 1000 requests per month. This makes them accessible for developers and organizations of all sizes, whether you're a student working on a personal project or a startup building the next big thing. You can get started with these tools without worrying about breaking the bank.

What kind of authentication is supported for Browse AI and openAI_python?

Browse AI and openAI_python support OAuth2 authentication, ensuring secure and authorized access to their functionalities. You can use the Composio.dev API to handle authentication and call functions from both Browse AI and openAI_python seamlessly. The SDK is available in Python, JavaScript, and TypeScript for your convenience, making it easy to integrate authentication into your projects and keep your users' data safe and secure.
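On the Composio side, the toolset itself authenticates with an API key, typically read from the COMPOSIO_API_KEY environment variable, while the connected Browse AI account is authorized once through Composio's connection flow. A minimal sketch follows; the explicit api_key argument is an assumption based on common SDK conventions rather than something shown elsewhere on this page.

import os

from composio_openai import ComposioToolSet

# Assumption: an explicit api_key can be passed; by default the SDK reads
# COMPOSIO_API_KEY from the environment.
tool_set = ComposioToolSet(api_key=os.environ["COMPOSIO_API_KEY"])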

Can I add Browse AI to my project?

Absolutely! You can easily incorporate Browse AI into your project by utilizing the Composio.dev API. This API allows you to call functions from both Browse AI and openAI_python, enabling you to leverage their capabilities within your application. The SDK is available in Python, JavaScript, and TypeScript to facilitate integration, so you can work with the language you're most comfortable with and add these powerful tools to your project with ease.

What is the accuracy of Browse AI and openAI_python?

Browse AI and openAI_python are designed to provide highly accurate and reliable results, ensuring that your AI applications perform at their best. The integration with Composio.dev ensures precise function calls, enabling you to build robust and powerful AI applications with confidence. The comprehensive framework and the ability to leverage state-of-the-art models ensure reliable and accurate outcomes for your AI development needs, whether you're working on a chatbot, a content creation tool, or any other AI-powered project.

What are some common use cases for Browse AI and openAI_python?

Browse AI and openAI_python can be used for a wide range of AI applications, making them versatile tools for developers and creators alike. Some common use cases include natural language processing, text generation, question answering, sentiment analysis, and more. They're particularly useful for building chatbots, virtual assistants, content creation tools, and other AI-powered applications that can help you automate tasks, engage with users, and create compelling content. Whether you're working on a personal project or building a product for your startup, these tools can help you bring your ideas to life.

How does Browse AI handle data privacy and security?

Data privacy and security are crucial considerations when working with AI systems, and Browse AI takes these issues seriously. It follows industry best practices and adheres to strict data protection regulations, ensuring that your data is kept safe and secure. Browse AI provides robust security measures, such as encryption and access controls, to ensure the confidentiality and integrity of your data. You can rest assured that your sensitive information is protected when using Browse AI for your AI development needs.

Can I customize Browse AI and openAI_python for my specific needs?

Absolutely! Browse AI and openAI_python are highly customizable and extensible, allowing you to tailor their functionality, models, and configurations to meet your specific requirements. Whether you're building a chatbot, a content creation tool, or any other AI-powered application, you can customize these tools to fit your unique needs. Additionally, Composio.dev provides a flexible platform for integrating and orchestrating various AI tools and services, enabling you to create custom AI solutions that are tailored to your project.

What kind of support and documentation is available for Browse AI and openAI_python?

Browse AI and openAI_python have comprehensive documentation and a supportive community, making it easy for you to get started and find answers to your questions. Composio.dev also provides extensive resources, including tutorials, guides, and a dedicated support team to assist you throughout your AI development journey. Whether you're a beginner or an experienced developer, you'll have access to the resources you need to make the most of these powerful tools.