This repository uses AI agents (built on browser_use and elevenlabs) to monitor Twitter accounts, analyze trends, score tweets, reply/post/retweet, and speak in Twitter Spaces.
The workflow is designed for pseudosatirical social causes. It aims to create provocative arguments and pick fights online.
This browser-automation approach was chosen because the official Twitter API is prohibitively expensive and widely seen as anti-developer:
- Frequent API Changes: Pricing, access levels, and endpoints have changed unpredictably. Some previously available endpoints have been removed or moved to higher-cost tiers.
- Expensive Paid Tiers: Free-tier access is very limited, and paid tiers are costly.
- Rate Limits & Downtime: Even paid users experience unexpected rate limits and occasional outages.
Name a single other major social network today that has an API and allows third-party clients. The only one I can think of is Reddit, and even there, numerous features are already locked out of third-party clients. Reddit is on the same path as Twitter: at some point it will realize that maintaining a gigantic cost center that provides no revenue (it doesn't control the ads) and doesn't let it rapidly innovate or build a brand (it doesn't control the app) makes little business sense.
The death of the Twitter API is long, long overdue. Bad for us consumers? Sure. But these companies are not charities; they exist to make money.
Given his heavy investment in xAI, Elon Musk should support agentic AI workflows that interact with X through Selenium/Puppeteer. Otherwise, he'd be anti-AI and a threat to US digital sovereignty and national security.
- Python 3.8 or higher
- Chrome browser installed (for browser-use)
- Clone this repository, then install browser-use and Playwright:

  ```bash
  pip install browser-use
  playwright install
  ```
- Add your API keys for the provider you want to use to your `.env` file.
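  For example, a minimal sketch of the `.env` file, assuming OpenAI is the LLM provider and ElevenLabs handles voice (the exact variable names depend on the SDKs your workflows use):

  ```
  # Hypothetical variable names; use whatever your provider's SDK expects
  OPENAI_API_KEY=sk-...
  ELEVENLABS_API_KEY=...
  ```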
- Set up Twitter authentication:
  - Create a `twitter_cookies.txt` file in the root directory and format it as shown below:

    ```json
    [
      { "name": "auth_token", "value": "YOUR_AUTH_TOKEN", "domain": ".x.com", "path": "/" },
      { "name": "ct0", "value": "YOUR_CT0_TOKEN", "domain": ".x.com", "path": "/" }
    ]
    ```

  - You can obtain these cookies by logging into Twitter in your browser and extracting them with the browser developer tools.
- `tweet-finder_workflow.py`: Monitors Twitter lists for new tweets and fetches their details.
- `reply-draft_workflow.py`: Generates AI-powered reply options for tweets and posts selected replies.
- `setup-new-account_workflow.py`: Automates initial account setup (following/blocking users, creating lists).
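Assuming the `.env` and `twitter_cookies.txt` files from the setup steps are in place, each workflow is meant to be run as a standalone script, for example:

```bash
python tweet-finder_workflow.py
```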
- `get_tweet` (get tweet w/ id): Fetches and parses tweet details.
- `manage_posts` (create/reply to posts): Creates new posts and replies to existing tweets.
- `follows` (follow user): Manages user following.
- `blocks` (block user): Manages user blocking.
- `lists` (create list, add list members, get list post timeline): Creates and manages Twitter lists.
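These methods drive the browser with AI agents rather than calling the Twitter API. As a rough sketch of the underlying pattern (assuming a browser-use release that accepts a LangChain chat model; the actual prompts and settings in `my_twitter_api` will differ):

```python
# Rough sketch of the browser-use pattern behind methods like get_tweet.
# The task prompt, model, and result handling here are illustrative only.
import asyncio
from browser_use import Agent
from langchain_openai import ChatOpenAI

async def get_tweet(tweet_id: str) -> str:
    agent = Agent(
        task=(
            f"Open https://x.com/i/web/status/{tweet_id} and report the tweet's "
            "author, text, and engagement counts."
        ),
        llm=ChatOpenAI(model="gpt-4o"),
    )
    history = await agent.run()
    return history.final_result()  # last content extracted by the agent

if __name__ == "__main__":
    print(asyncio.run(get_tweet("20")))  # hypothetical tweet id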
The toolkit stores data in JSON files in the `data` directory:
- `000_about_me.json`: User account information
- `001_saved_tweets.json`: Tweets fetched from Twitter
- `002_generated_tweets.json`: AI-generated reply options
- `003_posted_tweets.json`: Record of posted tweets/replies
- `004_users.json`: User information for following/blocking
- `005_lists.json`: Twitter list information
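The files are plain JSON, so they can be inspected without any toolkit code; a small sketch that makes no assumptions about the fields inside each file:

```python
# Print a quick, schema-agnostic summary of every file in the data directory.
import json
from pathlib import Path

for path in sorted(Path("data").glob("*.json")):
    with open(path) as f:
        payload = json.load(f)
    size = len(payload) if isinstance(payload, (list, dict)) else 1
    print(f"{path.name}: {type(payload).__name__} with {size} top-level entries")
```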
- Specification for users with Twitter Premium; it is more cost-effective to use Twitter decks (pro.x.com/i/decks/[id]) given the reduced UI clutter.
- Refactor the `my_twitter_api` folder; all API methods should live in one `.py` file, with prompts and browser-use settings modularized
- If running in production, use a proper database solution instead of flat JSON files
- Voice AI w/ connection to Twitter Spaces
- Automate the entire Twitter account creation process (including 2FA)
- Optimize prompts/settings by measuring the success rate of each