Full-Stack Developer — Legislative Tracking Platform (MVP)

Full-Stack Developer (Python + React) — Legisly™ AI Legislative Tracking Platform (MVP)

Project Type: Fixed-Price / Milestone-Based (preferred) · Hourly with weekly cap also open
Budget: $55,000 – $60,000 total
Timeline: ~7 months (now through August 31, 2025)
Hours: 30–40 hrs/week
Location: Remote · US timezone preferred

What We're Building

Legisly™ is an AI-powered legislative tracking platform for California small businesses and nonprofits. The existing tools in this space cost thousands a month and require a team of people to run. Legisly replaces all of that with AI — automated bill discovery, plain-English summaries, smart alerts, and a chat interface that handles the work for you. Think of it as the consumer version of what corporate lobbying firms do internally, priced at $49–$199/month.

Launch date: August 31, 2025 (MVP, California only).

MVP Scope — What You're Building

Core Features (all required):

1. AI-Powered Bill Discovery: The user signs up and tells the AI about their business (industry, location, concerns). The system automatically surfaces relevant California bills, categorizes them, and explains why each one matters. No manual searching.
2. AI Bill Summaries: Every bill gets a plain-English summary (~100 words), an impact analysis, a support/oppose/monitor recommendation, and an "explain like I'm 5" version. Powered by the Anthropic Claude API.
3. Smart Alert System: Not just "bill updated" — prioritized, context-aware alerts. Example: "AB 1234 moved to floor vote — 72 hours to act." Email-based for the MVP.
4. AI Chat Interface: Users interact with the platform in natural language. "What should I focus on this week?" "Find all labor bills affecting restaurants." "Draft a letter opposing AB 1234." The AI handles it.
5. Personalized Dashboard: AI-generated, not generic. Shows the user's top priorities, newly relevant bills, and upcoming deadlines. Updates based on real legislative activity.
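To make feature 2 concrete, here is a minimal sketch of generating a bill summary with the Anthropic Python SDK. The model id, prompt wording, and function names are assumptions for illustration, not part of the spec:

```python
def build_summary_prompt(bill_id: str, bill_text: str) -> str:
    """Assemble the instruction sent to Claude for one bill (wording is illustrative)."""
    return (
        f"Summarize California bill {bill_id} in plain English (~100 words), "
        "then give an impact analysis, a support/oppose/monitor recommendation, "
        "and an 'explain like I'm 5' version.\n\n"
        f"Bill text:\n{bill_text}"
    )

def summarize_bill(bill_id: str, bill_text: str) -> str:
    """Call the Anthropic Messages API. Requires `pip install anthropic`
    and an ANTHROPIC_API_KEY in the environment."""
    import anthropic  # imported lazily so the prompt builder stays dependency-free

    client = anthropic.Anthropic()
    response = client.messages.create(
        model="claude-sonnet-4-20250514",  # assumed model id; use whatever is current
        max_tokens=1024,
        messages=[{"role": "user", "content": build_summary_prompt(bill_id, bill_text)}],
    )
    return response.content[0].text
```

In practice you would persist the generated summary alongside the bill row so each bill is summarized once per version, not on every page load.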
Data Source

California's legislature provides an official bulk data download at https://downloads.leginfo.legislature.ca.gov/. This is the data backbone of the platform — no scraping required.

Here's how it works: the state publishes the legislative database as zip files containing SQL that loads directly into a relational database called capublic. They provide the table creation scripts (capublic.sql), load scripts, and everything you need. There are two types of files you'll work with:

- Initial load: pubinfo_2025.zip (~649 MB) — the full current-session snapshot, downloaded once to seed your database.
- Daily updates: the small day-of-week files (pubinfo_Mon.zip, pubinfo_Tue.zip, etc.) contain only what changed that day. These are tiny (typically under 100 KB) and are what your background job pulls each morning to keep the database current.

The capublic database is MySQL by default, but conversion to PostgreSQL is well documented — the open-source community has already mapped this out. The state's schema file shows you exactly what tables and columns exist before you write a single line of application code.

Your job is to build the pipeline in Python: download the daily delta → parse and load it into PostgreSQL → make it queryable by the rest of the app. Python is the right tool for this work, and it's also where the AI integration lives. See the stack below.

Out of Scope (MVP)

No multi-state support, no mobile app, no social/networking features, no API access, no white-label. California only. Web only. Ship the 5 features above and ship them well.

Tech Stack

Python is the core of this project. The backend API, the AI layer, and the data pipeline are all Python. The frontend is React. This is a deliberate choice — Python is the strongest ecosystem for AI integration (Anthropic's SDK is Python-first) and for the kind of data processing this platform needs.
Here's the full picture:

- Backend API: Python 3.12 + FastAPI
- AI Integration: Python — Anthropic Claude SDK
- Data Pipeline: Python — SQLAlchemy, psycopg2
- Task Scheduling: Python — Celery or APScheduler
- Frontend: React + TypeScript
- Database: PostgreSQL
- Hosting: AWS (or equivalent)
- Auth: Auth0 or Clerk
- Email: SendGrid or Resend

If you have a strong reason to deviate on any of these, make the case. But a Python backend is non-negotiable — that's where the value of this platform lives.

Milestone Breakdown & Payment

This is the structure we're proposing. You're welcome to suggest adjustments during the scoping call, but milestone-based payment is non-negotiable — it's how we manage risk on both sides.

1. Infrastructure & Setup · due Feb 28 · $8,000
   Python/FastAPI backend running, PostgreSQL schema set up, React skeleton deployed, auth working
2. Data Pipeline · due Mar 31 · $12,000
   Python ingestion job pulling daily CA legislature feeds, parsing SQL dumps, populating PostgreSQL; basic bill search and display working
3. AI Integration · due Apr 30 · $13,000
   Python AI layer live: bill summaries generating via the Claude API, discovery engine working, chat interface functional
4. Alerts + Dashboard · due May 31 · $10,000
   Smart alert system sending emails, personalized AI-generated dashboard rendering correctly
5. Beta & Polish · due Jun 30 · $8,000
   10 beta users onboarded, bugs fixed, onboarding flow smooth
6. Launch Prep · due Jul 31 · $5,000
   Final polish, performance optimization, docs written, staging environment ready
7. Launch + Support · due Aug 31 · $4,000
   Public launch, 2 weeks of post-launch bug fixes included

Total: $60,000

Payment is released per milestone upon sign-off that the deliverable meets spec. We'll define acceptance criteria for each milestone before you start.
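The Data Pipeline milestone above can be sketched roughly like this. The weekday filename pattern follows the pubinfo_Mon.zip convention described under Data Source; the PostgreSQL load step is deliberately stubbed, and the function names are illustrative assumptions, not a spec:

```python
import datetime
import urllib.request
import zipfile
from pathlib import Path

BASE_URL = "https://downloads.leginfo.legislature.ca.gov"

def daily_filename(day: datetime.date) -> str:
    """Map a calendar date to the state's day-of-week delta file, e.g. pubinfo_Mon.zip."""
    return f"pubinfo_{day.strftime('%a')}.zip"

def fetch_daily_delta(day: datetime.date, dest: Path) -> Path:
    """Download and unpack that day's delta archive into dest; returns the extraction dir."""
    name = daily_filename(day)
    archive = dest / name
    urllib.request.urlretrieve(f"{BASE_URL}/{name}", archive)
    extracted = dest / archive.stem
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(extracted)
    return extracted

def load_into_postgres(extracted: Path) -> None:
    """Stub: parse the unpacked files and upsert into the capublic-derived schema.
    The real job would use psycopg2/SQLAlchemy here; omitted in this sketch."""
    for path in sorted(extracted.iterdir()):
        print(f"would load {path.name}")
```

A Celery beat schedule or APScheduler cron job would run `fetch_daily_delta` each morning and hand the result to the loader.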
What We Expect From You

Technical:
- Build the Python backend from the ground up — FastAPI, the data pipeline, the AI integration layer
- Build the React frontend that connects to your Python API
- Design and maintain the PostgreSQL database
- Set up production infrastructure (CI/CD, monitoring basics)
- Write documentation as you go — not as an afterthought

Communication (this matters as much as the code):
- Daily async update in Slack by end of day — what you did, what's next, any blockers
- Monday: 30-min planning call
- Friday: 30-min demo of what shipped that week
- If something is going to be late or scope needs to change, flag it immediately — not at the deadline

Mindset:
- You're working with a non-technical founder. Explain trade-offs in plain English. If you can't explain why you made a technical decision in 2 sentences a non-developer understands, rethink the explanation.
- We use Claude AI internally to help with code generation, documentation, and technical review. You'll be working alongside it, not instead of it.

What We're Looking For

Must-haves:
- 5+ years development experience with Python as your primary language
- Strong Python web framework experience (FastAPI preferred, Flask acceptable)
- Solid React + TypeScript for the frontend layer
- PostgreSQL — schema design, queries, migrations
- Experience building data pipelines or batch-processing jobs in Python
- Experience with AI/ML APIs (Anthropic Claude or OpenAI) — Python SDK specifically
- Portfolio with at least one shipped SaaS or data-driven product
- Experience working with a non-technical founder or client
- References from past clients (2 minimum)

Nice-to-haves:
- Experience with async Python (asyncio, async FastAPI)
- Any civic tech or government-adjacent project in your history
- Startup experience (you understand "ship fast, learn fast")
- Familiarity with Celery or task queue systems

How to Apply

Don't just send a cover letter. We need to know you can actually do this.
Include:

1. Your portfolio / GitHub — link to at least one shipped product built with Python. If it's private, send a video walkthrough.
2. Answers to these questions (be specific — generic answers get skipped):
   a) Walk us through a Python-based project you've shipped. What was the stack, what did you build, and what's the URL or a demo?
   b) We ingest legislative data from an official state database dump (SQL files, daily delta updates) and need a Python pipeline to process and load it. Describe a similar data ingestion or ETL project you've built. What were the tricky parts?
   c) Rate yourself honestly 1–10: Python, FastAPI, React, TypeScript, PostgreSQL, AI API integration (Anthropic or OpenAI).
   d) Can you commit 30–40 hours/week, consistently, through August 31?
   e) What's your fixed-price quote for this scope? (Break it down however makes sense to you.)
   f) How do you communicate with non-technical clients when something goes sideways? Give a real example.
3. Two client references — name, email, and what the project was.

What Happens After You Apply

Top applicants get invited to:

1. A 30-minute video call (no prep needed — just be ready to talk through your experience and ask us questions about the project)
2. A paid test project ($500, one week) — a small, scoped Python build that shows us how you work
3. If the test project is strong, we sign the contract and you start

We're making a hiring decision fast. The sooner you apply, the sooner we move.

Why This Project

You're building something that actually matters — small businesses and nonprofits in California spend millions a year on lobbying they shouldn't have to. This levels the playing field.

The AI piece is real, not cosmetic. The whole product hinges on it working well — and Python is the language that makes it work best.

There's a clear path beyond MVP. If this works (and the market says it will), this becomes a multi-state platform. That's ongoing work for the right person.
You'll be working directly with the founder, not buried in a corporate chain of command.

Apply: https://www.upwork.com/jobs/~022018739885424274294