Topic: How To Increase Memory In Memo And TableGrid
How can I tune Memo and TableGrid for speed so they support big data with more than 6,250,000 records?
Help me please!
And please keep it simple, I am a beginner student.
Hello prahousefamily,
Did you really write six million two hundred and fifty thousand rows? Gosh...
Technically, SQLite and MySQL can handle such a quantity of data, their physical limits being much, much higher.
But practically, I would not rely on them for that. SQLite will take ages to load that many rows into a tablegrid (I'm not even sure that Bergsoft NextGrid 5, the VCL component used by MVD, can load so many rows), even on a very fast SSD.
Big data is often synonymous with data warehousing and, as such, with denormalised systems: very few tables with as few joins as possible (no indexes into other tables if possible) to prevent long query times, and a lot of historical data.
At work, extracting 200,000 rows with a query joining 5 or 6 tables can take several minutes on a dedicated Oracle server. I'll let you imagine what that becomes with as much data as you describe.
There are mainly two types of databases: those with many small interconnected tables (indexes and foreign keys) and those with few very large tables. The average user does not care about OLAP versus OLTP because both will do the job and the performance differences will go unnoticed. But for big data, you will definitely need to plan your database structure ahead and fine-tune your database engine.
If you are planning on doing big data, you will probably collect data from various heterogeneous sources. You will need an ETL (extract, transform, load) system to rearrange all of that into a form compatible with your database.
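To make the ETL idea concrete, here is a minimal sketch in Python: extract rows from a CSV source, transform (clean) each value, and load them into a denormalised SQLite table in large batches. The table name `fact_visits` and the column names are hypothetical examples, not part of MVD; a real pipeline would target whatever warehouse engine you pick.

```python
import csv
import sqlite3

def etl(csv_path: str, db_path: str, batch_size: int = 10_000) -> None:
    """Extract rows from a CSV, transform them, load into SQLite in batches."""
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS fact_visits (
                        patient_id TEXT, visit_date TEXT, cost REAL)""")
    with open(csv_path, newline="") as f:
        reader = csv.DictReader(f)
        batch = []
        for row in reader:
            # Transform step: clean and type-convert each value before loading
            batch.append((row["patient_id"].strip(),
                          row["visit_date"],
                          float(row["cost"] or 0)))
            if len(batch) >= batch_size:
                # Load step: batched inserts are far faster than row-by-row
                conn.executemany(
                    "INSERT INTO fact_visits VALUES (?, ?, ?)", batch)
                batch.clear()
        if batch:
            conn.executemany(
                "INSERT INTO fact_visits VALUES (?, ?, ?)", batch)
    conn.commit()
    conn.close()
```

Batching the inserts (and letting `executemany` run them inside one transaction) is what keeps the load step fast when the row counts get large.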
You'll find a lot of articles on the subject; just Google "data warehouse" and you'll find what you need, along with good advice on database engines, hypercubes and data marts. But forget SQLite for that: not fast enough, not reliable enough in my opinion.
Cheers
Math
A good introduction, but no technical details: https://www.tutorialspoint.com/dwh/
prahousefamily
Usually it's not a problem to store a few million records in a database, but do you really need to show that much data in a TableGrid?
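The usual way to avoid showing millions of rows at once is to page the query, fetching only one screenful at a time with `LIMIT`/`OFFSET`. A minimal sketch against SQLite, using a hypothetical `patients` table (MVD itself stores its data in SQLite, but the table and column names here are invented for illustration):

```python
import sqlite3

def fetch_page(conn: sqlite3.Connection, page: int, page_size: int = 100):
    """Return only one page of rows instead of the whole table."""
    offset = page * page_size
    return conn.execute(
        "SELECT id, name FROM patients ORDER BY id LIMIT ? OFFSET ?",
        (page_size, offset)).fetchall()
```

Note that for very deep pages, `OFFSET` still has to skip all the preceding rows; keyset pagination (`WHERE id > last_seen_id ORDER BY id LIMIT ?`) stays fast no matter how far in you scroll.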
Thank you mathmathou, for the answer and the tip on how to think about large data by analysing it into a data warehouse.
My MVD project is now a health information system with many rows and many table relations, joining 5-20 tables per query, and I have a speed problem: result sets of more than 20,000 rows are slow. After mathmathou's tip I will try a data warehouse. Thank you again!
Um... Dmitry, I have a little question.
What is the maximum number of rows and columns in a TableGrid? And the maximum number of lines, or maximum size, supported by a Memo?
I have CSV dump files that I load into a TableGrid and a Memo before importing them into the database. If my file is larger than what is supported, I can split the file before loading it into the component.
Help me please!
Thank you, Dmitry.
prahousefamily
There is no limit on the maximum number of rows; large files just take more time and more of your PC's memory.
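Since memory is the practical limit, splitting the dump file before loading it (as suggested above) is a sensible precaution. A minimal sketch of a CSV splitter that repeats the header row in every chunk; the `rows_per_file` value and the output naming scheme are just example choices:

```python
import csv

def split_csv(src_path: str, rows_per_file: int = 500_000) -> list:
    """Split a large CSV into smaller files, repeating the header in each."""
    out_paths = []
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)          # keep the header to copy into each part
        writer, out, count, part = None, None, 0, 0
        for row in reader:
            if writer is None or count >= rows_per_file:
                if out:
                    out.close()        # finish the previous chunk
                part += 1
                path = f"{src_path}.part{part}.csv"
                out = open(path, "w", newline="")
                writer = csv.writer(out)
                writer.writerow(header)
                out_paths.append(path)
                count = 0
            writer.writerow(row)
            count += 1
        if out:
            out.close()
    return out_paths
```

Because it streams row by row, this never holds more than one record in memory at a time, so it can split files far larger than the PC's RAM.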