Why This Role Matters
LIUV's investment analysis is only as good as the data behind it. This role ensures that every data point entering the AI pipeline is accurate, timely, and properly structured.
Key Responsibilities
- Build and maintain real-time market data ingestion pipelines
- Integrate SEC EDGAR filing data, earnings call transcripts, and financial statements
- Build international data feeds (EU and Brazilian exchanges)
- Design the data warehouse and ensure data quality and freshness
- Implement monitoring and alerting for data pipeline health
- Optimize data access patterns for AI agent queries
Ideal Candidate Profile
- 4+ years data engineering experience
- Experience with financial market data (Bloomberg, Refinitiv, or similar)
- Strong SQL and Python, plus experience with modern data tooling (Airflow, dbt, Spark)
- Experience with real-time streaming data systems
- Understanding of financial data structures (XBRL, SEC filings)
What LIUV Offers
- Equity participation — details discussed individually with each candidate
- Competitive compensation — discussed during the interview process
- Remote-first, async culture with flexible schedules
- Ground-floor opportunity at an AI-native venture studio
- Direct impact on a product used by investors worldwide
- Work alongside a mission-driven team democratizing investment intelligence