Mappers - Real-Time Disaster Information Aggregation

📌 Project Overview

This project is a real-time disaster information aggregation system that automatically collects, categorizes, and displays disaster-related data from social media, news portals, and open data sources. The system applies filtering algorithms to remove noise and irrelevant information, then presents the processed data on a dashboard for disaster response agencies, improving situational awareness and decision-making.

🚀 Features

  • Real-time data aggregation from multiple sources (social media, news, open data sources).
  • Advanced filtering algorithms to remove noise and irrelevant information.
  • Categorization of disaster data based on type, severity, and location (see the sketch after this list).
  • Interactive dashboard for visualization and monitoring.
  • Automated alerts for early disaster warnings.
  • Scalable and efficient architecture for handling large datasets.
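
The filtering and categorization steps can be illustrated with a small sketch. Everything below (the keyword lists, the DisasterReport dataclass, and the categorize function) is an illustrative assumption, not the project's actual code; the real system relies on NLP/ML models for this stage.

```python
# Minimal keyword-based categorization sketch. The keyword lists and the
# DisasterReport structure are illustrative assumptions; the real pipeline
# uses NLP/ML models for classification.
from dataclasses import dataclass

DISASTER_KEYWORDS = {
    "flood": ["flood", "flooding", "inundation"],
    "earthquake": ["earthquake", "quake", "tremor"],
    "wildfire": ["wildfire", "forest fire", "bushfire"],
}

SEVERITY_KEYWORDS = {
    "high": ["evacuate", "casualties", "emergency declared"],
    "medium": ["damage", "injuries", "warning"],
}

@dataclass
class DisasterReport:
    text: str
    location: str
    disaster_type: str = "unknown"
    severity: str = "low"

def categorize(report: DisasterReport) -> DisasterReport:
    """Assign a disaster type and severity from simple keyword matches."""
    lowered = report.text.lower()
    for dtype, words in DISASTER_KEYWORDS.items():
        if any(word in lowered for word in words):
            report.disaster_type = dtype
            break
    for level, words in SEVERITY_KEYWORDS.items():
        if any(word in lowered for word in words):
            report.severity = level
            break
    return report

if __name__ == "__main__":
    sample = DisasterReport("Flooding reported, residents told to evacuate", "Chennai")
    print(categorize(sample))  # disaster_type='flood', severity='high'
```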

🛠️ Tech Stack

  • Frontend: React.js (or your preferred framework)
  • Backend: Node.js/Express (or another backend technology of choice)
  • Database: MongoDB
  • Data Processing: Python (NLP, ML models)
  • APIs & Integrations: Twitter API, News APIs, Web Scraping (BeautifulSoup, Scrapy; see the sketch below)
  • Hosting: AWS/GCP
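
As a rough illustration of the web-scraping integration, the snippet below fetches a news page with requests and extracts headline text with BeautifulSoup. The URL and the choice of h2 tags are placeholders, not the project's actual sources or selectors.

```python
# Illustrative scraper using requests + BeautifulSoup. The URL and the
# <h2> selector are placeholders; real sources and parsing rules will differ.
import requests
from bs4 import BeautifulSoup

def fetch_headlines(url: str) -> list[str]:
    """Download a page and return the text of its <h2> headline tags."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    return [h.get_text(strip=True) for h in soup.find_all("h2")]

if __name__ == "__main__":
    for headline in fetch_headlines("https://example.com/disaster-news"):
        print(headline)
```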

🏗️ Installation & Setup

Prerequisites

Ensure you have the following installed:

  • Node.js & npm
  • Python & pip
  • MongoDB/PostgreSQL

Steps

  1. Clone the repository:
    git clone https://github.com/SayamMahajan/Mappers.git
    cd Mappers
  2. Install dependencies:
    npm install  # For frontend and backend dependencies
    pip install -r requirements.txt  # For Python-based data processing
  3. Set up environment variables: Create a .env file and configure API keys and database URLs.
    DATABASE_URL=your_database_url
    TWITTER_API_KEY=your_twitter_api_key
    
  4. Run the application:
    npm start  # For frontend
    npm run server  # For backend
    python data_processor.py  # For data aggregation scripts
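
For orientation, here is a minimal sketch of what a data-aggregation entry point such as data_processor.py might do: load credentials from .env and write a categorized report into MongoDB. The pymongo and python-dotenv dependencies, the mappers/reports database and collection names, and the hard-coded record are assumptions for illustration only.

```python
# Sketch of a data-aggregation entry point. Assumes pymongo and python-dotenv
# are installed; database and collection names are placeholders.
import os
from datetime import datetime, timezone

from dotenv import load_dotenv
from pymongo import MongoClient

def main() -> None:
    load_dotenv()  # reads DATABASE_URL and API keys from a local .env file
    client = MongoClient(os.environ["DATABASE_URL"])
    reports = client["mappers"]["reports"]  # placeholder database/collection names

    # In the real pipeline this document would come from the aggregation and
    # categorization steps; here it is a hard-coded example record.
    reports.insert_one({
        "text": "Flooding reported, residents told to evacuate",
        "type": "flood",
        "severity": "high",
        "location": "unknown",
        "collected_at": datetime.now(timezone.utc),
    })

if __name__ == "__main__":
    main()
```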

📊 Usage

  • Open the dashboard at http://localhost:3000.
  • Configure your sources and filters.
  • View real-time disaster information and alerts.

🏗️ Contributing

We welcome contributions! Follow these steps to contribute:

  1. Fork the repository.
  2. Create a new branch.
  3. Make your changes and commit them.
  4. Push to your fork and submit a Pull Request.

📜 License

This project is licensed under the MIT License.

📞 Contact

For any queries or suggestions, feel free to reach out:


Let's build a smarter disaster response system together! 🚀
