
92hackers/big-data-practise

A Node.js app that mocks millions of rows of data and saves them into a MySQL database, then optimizes both the app code and the database for better performance.


A big data scenario for practising how to manage and optimize millions of rows of data

Big data scenarios are not the preserve of big companies: we can design a scenario ourselves, mock millions or billions of rows of data, and store them in a database.

Features

  1. Super-fast data generation: concurrently generate billions of rows using Node.js cluster mode (see the sketch after this list).
  2. A full problem list to track the detailed problems you will face in a big data scenario.
  3. Data models mocked from a live website, https://zhihu.com, a Quora-like question-and-answer product.
  4. Friendly to developers of all levels: easy to set up, with full tutorials to help.
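
The first feature relies on Node's built-in cluster module to fan out data generation across CPU cores. Below is a minimal sketch of the idea, assuming the mysql2 package and a hypothetical `user` table; the actual schema, table names, and connection settings used in this repository may differ.

```js
// Sketch: concurrent mock-data generation in Node.js cluster mode (Node 16+).
// The `user` table and connection settings below are assumptions for illustration.
const cluster = require('cluster')
const os = require('os')
const mysql = require('mysql2/promise')

const ROWS_PER_WORKER = 1_000_000
const BATCH_SIZE = 5000

if (cluster.isPrimary) {
  // Fork one worker per CPU core; each worker generates its own share of rows.
  for (let i = 0; i < os.cpus().length; i++) cluster.fork()
} else {
  (async () => {
    const pool = mysql.createPool({ host: 'localhost', user: 'root', database: 'big_data' })
    for (let done = 0; done < ROWS_PER_WORKER; done += BATCH_SIZE) {
      // Build one batch of mocked rows and insert them with a single bulk statement.
      const rows = Array.from({ length: BATCH_SIZE }, (_, i) => [
        `user_${process.pid}_${done + i}`,             // name
        `user_${process.pid}_${done + i}@example.com`  // email
      ])
      await pool.query('INSERT INTO user (name, email) VALUES ?', [rows])
    }
    await pool.end()
    process.exit(0)
  })()
}
```

Batching the inserts matters as much as the concurrency: one multi-row INSERT per few thousand rows keeps the MySQL round trips and transaction overhead low.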

Hardware requirements

You should own a reasonably high-performance computer, which will speed up your practice. My PC is given below as an example:

  • CPU: 4 x Core i5-7500 @ 3.4GHz
  • Memory: 16 GB
  • Disk: 256 GB SSD

Bigger data sets require more resources, especially disk space, once your practice involves backups, partitioning, replicas, and sharding. For example, 100 million rows at roughly 200 bytes each is about 20 GB before indexes, binlogs, and replica copies.

Tutorial

A full tutorial to begin your practice

Requirements
