How big can an SQL query be, relative to a huge database with tables numbering in the 50s?
Optimization is key to every part of a program. Can an SQL query really run across 50 tables, and in what time? Would such a query affect overall system performance?
2 Answers
Yes, but it depends on many things. I am not an expert in this field, but from what I have learned: most SQL databases store data (and indexes) in B+ trees, a data structure suited to fast lookups. A few things to check first:
1: Which data types you use for your table columns. For example, don't use TEXT where VARCHAR would do, and use an unsigned integer if your values are always positive. Queries run faster when the types are chosen correctly.
2: How you write the query. Many different queries can produce the same result, yet differ widely in performance. Write the query in an optimized way.
3: Name the columns explicitly instead of using the wildcard (*), even when you need to fetch all columns.
4: Only fetch the data that matters; if the result set is large, fetch it in chunks.
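Tips 3 and 4 above can be sketched with Python's built-in sqlite3 module; the `users` table and its columns are made up purely for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, bio TEXT)")
cur.executemany("INSERT INTO users (name, bio) VALUES (?, ?)",
                [(f"user{i}", "x" * 100) for i in range(1000)])
conn.commit()

# Tip 3: name the columns you need instead of SELECT *
# (here we skip the wide bio column entirely)
cur.execute("SELECT id, name FROM users")

# Tip 4: fetch in chunks instead of loading everything at once
total = 0
while True:
    chunk = cur.fetchmany(100)   # 100 rows per batch
    if not chunk:
        break
    total += len(chunk)
print(total)  # 1000
```

The same idea applies with any database driver: select only the columns you need, and page through large results rather than materializing them all in memory.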
Moreover, if your workload is mostly search, use a dedicated search server that is built to be fast at searching, such as Elasticsearch.
Hope it helps.
It depends mostly on system resources. You will hit hardware limits before logic/design limits, but it also depends on which SQL server you are using (SQLite vs. Oracle, etc.).
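One concrete design limit, assuming SQLite: a single query may join at most 64 tables, so the 50 tables from the question are within range there. A throwaway sketch with tiny one-row tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
for t in range(65):
    cur.execute(f"CREATE TABLE t{t} (id INTEGER PRIMARY KEY)")
    cur.execute(f"INSERT INTO t{t} VALUES (1)")

def join_count(n):
    """Row count of an n-way join over the tiny tables t0..t(n-1)."""
    joins = " JOIN ".join(f"t{t} USING (id)" for t in range(1, n))
    return cur.execute(f"SELECT count(*) FROM t0 JOIN {joins}").fetchone()[0]

print(join_count(50))            # a 50-table join runs fine
try:
    join_count(65)               # 65 tables exceeds SQLite's join limit
    limit_hit = False
except sqlite3.OperationalError:
    limit_hit = True
print(limit_hit)                 # True
```

Other engines have their own (usually configurable) limits, so check the documentation of the server you actually run.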
Joining 50 tables holding 5 KB of data versus querying 2 tables with 500,000 full account records: which will be slower?
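You can run a rough version of that experiment yourself. The sketch below uses Python's built-in sqlite3 module at a much smaller scale (the schemas are made up): a 10-way join over tiny tables versus a full scan of one large table.

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Ten tiny tables of ten rows each (a stand-in for "50 tables with 5 KB").
for t in range(10):
    cur.execute(f"CREATE TABLE t{t} (id INTEGER PRIMARY KEY, v INTEGER)")
    cur.executemany(f"INSERT INTO t{t} VALUES (?, ?)", [(i, i) for i in range(10)])

# One wide table (a stand-in for "500,000 full account records").
cur.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, data TEXT)")
cur.executemany("INSERT INTO accounts VALUES (?, ?)",
                [(i, "x" * 50) for i in range(500_000)])

joins = " JOIN ".join(f"t{t} USING (id)" for t in range(1, 10))

start = time.perf_counter()
n_join = cur.execute(f"SELECT count(*) FROM t0 JOIN {joins}").fetchone()[0]
t_join = time.perf_counter() - start

start = time.perf_counter()
# length(data) forces the engine to actually read every row
n_scan = cur.execute(
    "SELECT count(*) FROM accounts WHERE length(data) = 50").fetchone()[0]
t_scan = time.perf_counter() - start

print(n_join, n_scan)   # 10 500000
print(f"10-way join: {t_join:.6f}s, full scan: {t_scan:.6f}s")
```

On most machines the scan over the large table dominates, which suggests that data volume tends to matter more than table count, though indexes and the query planner can change the picture.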