How do you measure query speed as the amount of data grows?

Asked 2 years ago, Updated 2 years ago, 52 views

I'm using Django and PostgreSQL.

By the way, I want to know how the speed of the queries I wrote changes when there is a lot of data.

I used the Django Debug Toolbar.

But I can't figure out how to deliberately test with a large amount of data.
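(For reference, query timings can also be captured without the toolbar, using Django's test utilities. A minimal sketch; the "Book" model and "library" app here are hypothetical stand-ins for your own:)

    from django.db import connection
    from django.test.utils import CaptureQueriesContext

    from library.models import Book  # hypothetical model

    with CaptureQueriesContext(connection) as ctx:
        list(Book.objects.filter(title__icontains="django"))  # force evaluation

    # Each captured entry holds the raw SQL and its execution time in seconds.
    for query in ctx.captured_queries:
        print(query["time"], query["sql"])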

I wonder if there is any other way.

I'm a complete beginner at programming, so can I move on without worrying about this speed problem? In other words, is the execution speed of a query roughly the same whether there is a lot of data or not? Or should I keep speed in mind?

If query speed is affected as the data grows, is there a way to test on the assumption that the database holds a lot of data? Or should I use a tool that measures PostgreSQL's speed directly with SQL statements?

Thank you for reading.

django postgresql

2022-09-21 18:25

1 Answer

If what you want is the execution speed of a query, I don't think it needs to be limited to a specific framework.

As far as I know, database tools show the query execution time by default, so if it's a basic check, you can do it easily with a database tool.
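For example, in PostgreSQL you can prefix a query with EXPLAIN ANALYZE to actually run it and report the planning and execution time. A minimal sketch from Python, assuming psycopg2 is installed; the connection string and the "library_book" table name are placeholders:

    import psycopg2

    conn = psycopg2.connect("dbname=mydb user=myuser password=secret host=localhost")
    with conn.cursor() as cur:
        # EXPLAIN ANALYZE executes the query and prints timing details.
        cur.execute(
            "EXPLAIN ANALYZE SELECT * FROM library_book WHERE title ILIKE %s",
            ("%django%",),
        )
        for (line,) in cur.fetchall():
            print(line)
    conn.close()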

The more data you have, the more rows have to be scanned, so the longer the query takes. In that case, tune the query with an index~
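In Django, an index can be declared on the model and applied with a migration. A minimal sketch, again using a hypothetical Book model (run makemigrations and migrate afterwards):

    from django.db import models

    class Book(models.Model):
        title = models.CharField(max_length=200)
        published_at = models.DateTimeField()

        class Meta:
            indexes = [
                # speeds up lookups that filter on title
                models.Index(fields=["title"]),
            ]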

I've never used PostgreSQL myself, but from a quick search, many people seem to use pgbench for load testing. What you asked about, though, is when the amount of data is large, so shouldn't you insert dummy data (fake data?) and test against that?
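If you go the dummy-data route, bulk_create makes it quick to fill a table. A minimal sketch, reusing the hypothetical Book model from above:

    from django.utils import timezone

    from library.models import Book  # hypothetical model

    Book.objects.bulk_create(
        [Book(title=f"dummy book {i}", published_at=timezone.now()) for i in range(100_000)],
        batch_size=10_000,  # send the INSERTs in chunks of 10,000 rows
    )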


2022-09-21 18:25
