Pivotor AI
Platform Engineer Intern (Backend)
Internship · On-site
Location
Fremont, CA
Salary
Not listed
Experience
Not specified
Posted
5 days ago
Skills
Python, FastAPI, Flask, REST APIs, SQLAlchemy, SQL, NoSQL databases, PostgreSQL, MongoDB, Data pipelines, Web scraping, ETL workflows, NLP pipelines, LLM APIs, RAG architectures, AWS, MCP, GitHub, Systems thinking, Prototyping
Job Description
Summary: Pivotor AI is a conversational career discovery platform. They are seeking a Platform Engineer Intern (Backend) to build and maintain FastAPI REST APIs, implement secure authentication, and develop various backend and AI systems.
Responsibilities:
- Build and maintain FastAPI REST APIs using SQLAlchemy ORM and MVC architecture for internal services and partner integrations
- Implement secure authentication using Firebase Auth (OAuth, secure cookies)
- Develop and productionize proof-of-concept features across backend, data, and AI systems
- Design ETL pipelines with Apache NiFi for scraping, validation, and storage across MongoDB, S3, and LLM endpoints
- Build high-throughput scraping pipelines using BeautifulSoup, Selenium, multiprocessing, and rate-limiting strategies
- Implement NLP pipelines for deduplication, entity matching, and enrichment using spaCy, regex, and LLMs
- Work across AWS infrastructure (EC2, RDS/PostgreSQL, Lambda, S3, IAM, VPC) to deploy and scale services
- Build and maintain CI/CD pipelines with GitHub Actions and production services using NGINX and PM2
- Contribute to LLM + RAG systems, MCP server integrations, and LangGraph agent workflows
- Build internal tools and dashboards using Next.js and Tailwind
- Whiteboard. Iterate. Launch.
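To give a concrete sense of one responsibility above, the deduplication step of an NLP pipeline can be sketched in a few lines of standard-library Python. This is an illustrative sketch only; `normalize_title` and `dedupe` are hypothetical names, not Pivotor AI's actual code, and the real pipeline described in the posting also uses spaCy and LLMs.

```python
import re


def normalize_title(title: str) -> str:
    """Normalize a job title for matching: lowercase, strip punctuation
    noise, and collapse runs of whitespace."""
    t = title.strip().lower()
    # Keep word characters, spaces, and symbols common in titles (C++, C#, .NET).
    t = re.sub(r"[^\w\s+#./-]", "", t)
    return re.sub(r"\s+", " ", t)


def dedupe(titles: list[str]) -> list[str]:
    """Keep the first occurrence of each title, comparing normalized forms."""
    seen: set[str] = set()
    out: list[str] = []
    for title in titles:
        key = normalize_title(title)
        if key not in seen:
            seen.add(key)
            out.append(title)
    return out
```

In practice a key like this would feed fuzzy or embedding-based entity matching; the point is that normalization happens before comparison, so cosmetic variants of the same record collapse to one entry.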
Required Qualifications:
- 0.5 to 1 year of backend engineering experience (projects, internships, or coursework count)
- Strong Python backend development experience (FastAPI, Flask, or similar frameworks)
- Experience building REST APIs and working with ORMs like SQLAlchemy
- Familiarity with SQL and NoSQL databases (PostgreSQL, MongoDB; graph databases a plus)
- Experience building data pipelines, scraping systems, or ETL workflows
- Exposure to NLP pipelines, LLM APIs, or RAG architectures
- Familiarity with cloud infrastructure (AWS) and deployment workflows
- Hands-on experience with MCP, Cursor, or Claude for AI-assisted development (mandatory)
- A GitHub portfolio that demonstrates how you think and build systems
- Available immediately and on-site in Fremont
- Must accept terms of an unpaid internship with potential for conversion
Required Skills: Python, FastAPI, Flask, REST APIs, SQLAlchemy, SQL, NoSQL databases, PostgreSQL, MongoDB, Data pipelines, Web scraping, ETL workflows, NLP pipelines, LLM APIs, RAG architectures, AWS, MCP, GitHub, Systems thinking, Prototyping
Benefits: Gym, Snack bar