Configuring MongoDB for Processing Big Data


We recently moved from MySQL to MongoDB for our big data workload.

Overview:

We have 275 GB (550 million documents) of data per collection; once inserted into MongoDB, it occupies 140 GB of storage, including indexes.

We have 25 such collections in a single database. Each collection has 30 fields, of which 16 are indexed.
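A quick back-of-the-envelope calculation from the figures above (decimal GB assumed) gives a sense of the scale involved:

```python
# Sizing estimate from the stated figures (decimal GB: 1 GB = 1e9 bytes).
RAW_GB_PER_COLLECTION = 275
DOCS_PER_COLLECTION = 550_000_000
STORED_GB_PER_COLLECTION = 140   # on-disk size including indexes
NUM_COLLECTIONS = 25

avg_doc_bytes = RAW_GB_PER_COLLECTION * 1e9 / DOCS_PER_COLLECTION
total_raw_tb = RAW_GB_PER_COLLECTION * NUM_COLLECTIONS / 1000
total_stored_tb = STORED_GB_PER_COLLECTION * NUM_COLLECTIONS / 1000
compression_ratio = RAW_GB_PER_COLLECTION / STORED_GB_PER_COLLECTION

print(f"avg document size: {avg_doc_bytes:.0f} bytes")  # 500 bytes
print(f"total raw data:    {total_raw_tb:.3f} TB")      # 6.875 TB
print(f"total on disk:     {total_stored_tb:.1f} TB")   # 3.5 TB
```

So the database holds roughly 3.5 TB on disk across all 25 collections, with WiredTiger compressing the raw data by about 2x even after index overhead; these totals drive the RAM and disk sizing discussed below.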

Every day we import up to 20 million documents (10 GB) into each collection. From this data we need to fetch 100 million documents across all collections, and we regularly run aggregations (group operations) over the 550 million records to generate live reports.
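A minimal sketch of how such a workload is commonly driven from application code: the daily import is chunked into fixed-size batches (as one would feed to pymongo's `insert_many` with `ordered=False`), and the grouped report is expressed as an aggregation pipeline. Field names here (`event_date`, `region`, `amount`) are hypothetical placeholders, not taken from the posting:

```python
from itertools import islice

def batches(docs, size=10_000):
    """Yield documents in fixed-size batches for insert_many-style bulk loads."""
    it = iter(docs)
    while batch := list(islice(it, size)):
        yield batch

# Hypothetical grouped-report pipeline (placeholder field names); with pymongo
# this would be passed to collection.aggregate(pipeline, allowDiskUse=True)
# so a group over hundreds of millions of documents can spill to disk.
pipeline = [
    {"$match": {"event_date": {"$gte": "2018-11-01"}}},  # filter on an indexed field first
    {"$group": {"_id": "$region", "total": {"$sum": "$amount"}}},
    {"$sort": {"total": -1}},
]
```

Putting `$match` first lets the pipeline use one of the 16 indexes before grouping, which is usually the single biggest lever for getting reports under the one-minute target.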


1. Suitable server configuration and tuning of MongoDB (cores, RAM, and version).

2. Report generation in under 1 minute.
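As a starting point for requirement 1, a mongod.conf along these lines is a common shape for this kind of ingest-plus-reporting workload. The values are assumptions to be tuned against the actual working set (particularly the 16 indexes per collection, which should fit in RAM as far as possible), not measurements:

```yaml
# Sketch of a mongod.conf starting point; all values are assumptions.
storage:
  dbPath: /var/lib/mongodb
  wiredTiger:
    engineConfig:
      cacheSizeGB: 60          # e.g. roughly half of RAM on a 128 GB machine
replication:
  replSetName: rs0             # replica set, so reporting aggregations can
                               # run on a secondary while ingest hits the primary
net:
  port: 27017
```

Separating the heavy daily inserts from the report aggregations (replica set with secondary reads, or a dedicated analytics node) is usually what makes the sub-minute report target realistic.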

Skills: Database Administration, Hadoop, Linux, NoSQL Couch & Mongo


About the Employer:
(0 reviews) Chennai, India

Project ID: #18099043

2 freelancers are bidding an average of ₹24549 for this job


Document database expert

₹25000 INR in 1 day
(12 Reviews)

Hi. Interesting use case you have. Let me first start by saying that if you are going to continue to import 10 GB worth of data to each collection on a daily basis, then there is no way that you can improve your perf…

₹24098 INR in 10 days
(0 Reviews)