News Aggregator Backend
Technologies Used
Laravel (PHP), Redis, relational database
Project Date
October 2025
The News Aggregator Backend is a service for collecting, normalizing, and serving news content from diverse sources. It supports scheduled ingestion from third‑party APIs, applies de‑duplication to avoid repeated articles, and categorizes content using rules and metadata extraction. A clean REST API lets mobile and web clients browse, search, and filter content efficiently.
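A minimal sketch of how the scheduled ingestion could be wired up in Laravel's console kernel; the FetchSourceArticles job, the Source model, and its is_active flag are illustrative assumptions rather than the project's actual classes.

```php
<?php

// app/Console/Kernel.php (sketch): queue one ingestion job per active source.

namespace App\Console;

use App\Jobs\FetchSourceArticles;
use App\Models\Source;
use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule): void
    {
        $schedule->call(function () {
            // One queued job per active source, every 15 minutes.
            Source::where('is_active', true)
                ->each(fn (Source $source) => FetchSourceArticles::dispatch($source));
        })->everyFifteenMinutes();
    }
}
```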
Core Capabilities
- Source ingestion via JSON APIs
- Normalization and sanitization of article payloads
- Content de‑duplication (URL hash, canonical link, title+source heuristics; see the sketch after this list)
- Tagging and categorization pipeline
- REST API for listings, detail, search, filters, and pagination
- Background jobs for crawling and enrichment
- Metrics and health endpoints for operations
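A minimal sketch of the de‑duplication check referenced in the list above, combining the URL hash, canonical link, and title+source heuristics; the service class, the Article model, and the column names are assumptions for illustration.

```php
<?php

// A duplicate check over three heuristics: URL hash, canonical link,
// and a title+source fingerprint. Column names are illustrative.

namespace App\Services;

use App\Models\Article;
use Illuminate\Support\Str;

class ArticleDeduplicator
{
    /** Returns true if an equivalent article is already stored. */
    public function isDuplicate(array $payload): bool
    {
        $urlHash   = hash('sha256', $this->normalizeUrl($payload['url']));
        $canonical = $payload['canonical_url'] ?? null;
        $titleKey  = Str::slug($payload['title']).'|'.$payload['source_id'];

        return Article::query()
            ->where('url_hash', $urlHash)
            ->when($canonical, fn ($q) => $q->orWhere('canonical_url', $canonical))
            ->orWhere('title_source_key', $titleKey)
            ->exists();
    }

    /** Lower-cases the host and strips common tracking parameters before hashing. */
    private function normalizeUrl(string $url): string
    {
        $parts = parse_url($url);
        parse_str($parts['query'] ?? '', $query);
        unset($query['utm_source'], $query['utm_medium'], $query['utm_campaign']);

        return strtolower($parts['host'] ?? '')
            .($parts['path'] ?? '/')
            .($query ? '?'.http_build_query($query) : '');
    }
}
```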
Technical Highlights
- Framework: Laravel (modern PHP practices and service layer design)
- Data: Relational database for articles, sources, and tags
- Caching/Queues: Redis for caching, rate limiting, and job queues
- Ingestion: Queue‑driven workers with retry and backoff policies (see the job sketch after this list)
- API: Versioned REST endpoints with pagination and filtering
- Testing: Feature and unit tests for core services (see the test sketch after this list)
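A minimal sketch of a queue‑driven ingestion job with retry and backoff, as referenced in the highlights above; the class name, the attempt limit, and the backoff intervals are illustrative assumptions.

```php
<?php

// A queued ingestion job with bounded retries and increasing backoff.

namespace App\Jobs;

use App\Models\Source;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class FetchSourceArticles implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /** Give up after three attempts. */
    public int $tries = 3;

    public function __construct(public Source $source)
    {
    }

    /** Wait 30s, then 120s, then 600s between retries. */
    public function backoff(): array
    {
        return [30, 120, 600];
    }

    public function handle(): void
    {
        // Fetch the source feed, normalize each payload, and hand it to the
        // de-duplication and persistence pipeline (project-specific).
    }
}
```

Workers for a job like this would typically run via `php artisan queue:work redis`, matching the Redis‑backed queues noted above.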
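And for the testing item above, a minimal sketch of a feature test against the listings endpoint; the model factory and the paginated JSON shape are assumptions.

```php
<?php

// A feature test asserting the listings endpoint returns paginated JSON.

namespace Tests\Feature;

use App\Models\Article;
use Illuminate\Foundation\Testing\RefreshDatabase;
use Tests\TestCase;

class ArticleListingTest extends TestCase
{
    use RefreshDatabase;

    public function test_articles_index_returns_paginated_results(): void
    {
        Article::factory()->count(3)->create();

        $this->getJson('/api/v1/articles?page=1')
            ->assertOk()
            ->assertJsonStructure([
                'data' => ['*' => ['id', 'title', 'url', 'published_at']],
                'links',
                'meta',
            ]);
    }
}
```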
Example Endpoints
- GET /api/v1/articles?q=market%20weakened&category_ids[]=1&category_ids[]=2&page=1
- GET /api/v1/articles/{id}
- GET /api/v1/sources
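A minimal sketch of how the endpoints above could be registered in routes/api.php, plus a health route for the operational checks mentioned earlier; the controller names are illustrative assumptions.

```php
<?php

// routes/api.php (sketch): versioned endpoints matching the examples above.

use App\Http\Controllers\Api\V1\ArticleController;
use App\Http\Controllers\Api\V1\SourceController;
use Illuminate\Support\Facades\Route;

Route::prefix('v1')->group(function () {
    // GET /api/v1/articles?q=...&category_ids[]=...&page=...
    Route::get('articles', [ArticleController::class, 'index']);

    // GET /api/v1/articles/{id}
    Route::get('articles/{article}', [ArticleController::class, 'show']);

    // GET /api/v1/sources
    Route::get('sources', [SourceController::class, 'index']);

    // Simple health check for operations.
    Route::get('health', fn () => response()->json(['status' => 'ok']));
});
```

A matching index action could apply keyword search, category filtering, and pagination along these lines; the resource class and column names are again assumptions.

```php
<?php

// app/Http/Controllers/Api/V1/ArticleController.php (index action sketch).

namespace App\Http\Controllers\Api\V1;

use App\Http\Controllers\Controller;
use App\Http\Resources\ArticleResource;
use App\Models\Article;
use Illuminate\Http\Request;

class ArticleController extends Controller
{
    public function index(Request $request)
    {
        $articles = Article::query()
            // Keyword search on the title; a production build might use full-text search.
            ->when($request->query('q'), fn ($q, $term) => $q->where('title', 'like', "%{$term}%"))
            // Filter by one or more category IDs.
            ->when($request->query('category_ids'), fn ($q, $ids) => $q->whereIn('category_id', (array) $ids))
            ->latest('published_at')
            ->paginate(20);

        return ArticleResource::collection($articles);
    }
}
```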
Project Timeline
Duration: October 2025
This project provides a solid foundation for any client application that needs timely, de‑duplicated news content with a predictable and well‑documented API.