+ 1

React App Performance Opinion

Alright, so I'm building a React app that will be hosted on a server with a WebSocket-based Node backend. The app relies heavily on calculations over a moderately sized MySQL database, and I'm trying to make it as smooth and fast as possible for the users who will be connecting. It will be crunching thousands of log entries and performing live stats and updates. A thought crossed my mind, but I want to make sure it's the most efficient way to go about it: rather than each socket request firing off a SQL query, why not run the SQL query once per day and store the response in an object on my Node backend? The socket requests would then be almost instant because they'd be pulling data from the already retrieved object. My question is: would storing 5000 - 15000 objects (approx. 10 keys each) in an array server-side cause any sort of performance issues?
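Roughly what I have in mind (just a sketch, assuming the mysql2 and socket.io packages; the table, query, and event names are placeholders, not my actual code):

    // Rough sketch of the once-a-day cache idea, not the real app code.
    const mysql = require('mysql2/promise');
    const { Server } = require('socket.io');

    const io = new Server(3000);
    const pool = mysql.createPool({ host: 'localhost', user: 'app', database: 'stats' });

    let cache = []; // 5000 - 15000 rows of ~10 keys each is only a few MB of RAM

    async function refreshCache() {
      const [rows] = await pool.query('SELECT * FROM logs'); // placeholder query
      cache = rows;
    }

    refreshCache();                                 // warm the cache at startup
    setInterval(refreshCache, 24 * 60 * 60 * 1000); // then refresh once per day

    io.on('connection', (socket) => {
      // Each request is answered from memory, so no SQL round trip.
      socket.on('getStats', (ack) => ack(cache));
    });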

1st May 2019, 8:52 PM
Brandon
7 Answers
+ 2
Then why not use MongoDB instead?
21st May 2019, 11:00 AM
Gordon
+ 2
How many objects you can store in server memory depends entirely on your server's hardware configuration. You need to test your server's throughput to find the maximum number of simultaneous connections it can handle.
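For example, a quick smoke test (a sketch using the socket.io-client package; the URL and event name are assumptions to match the setup above):

    // Open many simultaneous connections and time a round trip on each,
    // to see where latency starts to degrade on your hardware.
    const { io } = require('socket.io-client');

    const CLIENTS = 100; // raise this until response times suffer
    for (let i = 0; i < CLIENTS; i++) {
      const socket = io('http://localhost:3000');
      socket.on('connect', () => {
        const start = Date.now();
        socket.emit('getStats', () => {
          console.log(`client ${i}: ${Date.now() - start} ms round trip`);
        });
      });
    }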
21st May 2019, 12:52 PM
Calviղ
+ 1
Calviղ, what's your opinion?
21st May 2019, 11:44 AM
Gordon
+ 1
Sounds good, I really appreciate the help and advice
21st May 2019, 2:15 PM
Brandon
0
Certainly an option. All of my projects that needed a database called for a strict schema, so I haven't tussled with MongoDB yet. I may step out of my comfort zone and give it a shot. Thank you.
21st May 2019, 11:21 AM
Brandon
0
Frequently accessed data should be cached in the server's fast-access memory (RAM) for quicker reads. Did you know that most of Facebook's graph search data is never read back from the database? It is kept ready around the clock and served from the caches on Facebook's servers. Since user activity never stops, that data rarely has a chance to go back to the database for storage or retrieval; most of the time it lives in fast-access memory. This is one of the main reasons Facebook can serve billions of simultaneous users and still stay fast for everyone.
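The underlying pattern is read-through caching; a minimal sketch (the function and field names are made up for illustration):

    // Read-through cache: serve from RAM when possible,
    // fall back to the database only on a miss.
    const cache = new Map();

    async function queryDatabase(userId) {
      // Placeholder standing in for a real SQL query.
      return { userId, wins: 0, losses: 0 };
    }

    async function getUserStats(userId) {
      if (cache.has(userId)) return cache.get(userId); // hot path: memory only
      const stats = await queryDatabase(userId);       // cold path: one DB hit
      cache.set(userId, stats);                        // keep it hot afterwards
      return stats;
    }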
21st May 2019, 11:46 AM
Calviղ
0
Not sure I'm familiar with what you mean by "fast access memory". Are you referring to an actual tier of server storage, or to the data being held in the application's backend?

I've been building on this since I posted the original question. What I have going currently is a global object on my Express server that is updated whenever a user adds a new entry or makes a modification. For transaction tables, the socket event triggers a SQL INSERT and also appends the new transaction to the global object if the SQL response is a success; I then send an IO emit to all connected clients with the new global object. So far it's been working great, but I haven't started on the primary transaction table, which will be the biggest table in the database, so I can't say what kind of performance impact that is going to have on the backend. Nor have I tested this with more than just myself as a single user. When live there will be 20 - 40 simultaneous connections.
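In rough outline, the write path looks like this (a simplified sketch, not my actual code; the event and table names are stand-ins):

    // Persist to MySQL first, then update the in-memory copy and broadcast it,
    // so the cache never gets ahead of the database.
    const mysql = require('mysql2/promise');
    const { Server } = require('socket.io');

    const io = new Server(3000);
    const pool = mysql.createPool({ host: 'localhost', user: 'app', database: 'stats' });

    let transactions = []; // the global object kept in sync with MySQL

    io.on('connection', (socket) => {
      socket.on('addTransaction', async (tx) => {
        try {
          await pool.query('INSERT INTO transactions SET ?', tx); // stand-in table
          transactions.push(tx);                 // append only after the INSERT succeeds
          io.emit('transactions', transactions); // push the new state to every client
        } catch (err) {
          socket.emit('errorMessage', 'insert failed'); // cache unchanged on failure
        }
      });
    });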
21st May 2019, 12:25 PM
Brandon