Python performance: search large list vs sqlite
Say I have a database table with three columns: `id`, `field1`, and `field2`. There can be anywhere between 100 and 100,000 rows in this table. I have a Python script that adds 10-1000 new rows to this table. However, if the new `field1` already exists in the table, it should do an `UPDATE`, not an `INSERT`. Which of the following approaches is more efficient?
- `SELECT field1` from the table (`field1` is unique) and store the results in a list. Then, for each new row, use `list.count()` to decide whether to `INSERT` or `UPDATE`.
- For each new row, run two queries: first `SELECT count(*) FROM table WHERE field1 = "foo"`, then either `INSERT` or `UPDATE`.
In other words, is it more efficient to execute n + 1 queries and search a list, or 2n queries and let sqlite do the searching?
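For concreteness, here is a minimal sketch of the two approaches using Python's sqlite3 module (the table name `mytable` and the shape of `new_rows` are assumptions for illustration, not from the original post):

```python
import sqlite3

def approach_1(conn, new_rows):
    # n + 1 queries: one SELECT up front, then search the list in Python.
    existing = [row[0] for row in conn.execute("SELECT field1 FROM mytable")]
    for f1, f2 in new_rows:
        if existing.count(f1):  # linear list search happens in Python
            conn.execute("UPDATE mytable SET field2 = ? WHERE field1 = ?", (f2, f1))
        else:
            conn.execute("INSERT INTO mytable (field1, field2) VALUES (?, ?)", (f1, f2))

def approach_2(conn, new_rows):
    # 2n queries: let sqlite do the searching for every row.
    for f1, f2 in new_rows:
        (count,) = conn.execute(
            "SELECT count(*) FROM mytable WHERE field1 = ?", (f1,)
        ).fetchone()
        if count:
            conn.execute("UPDATE mytable SET field2 = ? WHERE field1 = ?", (f2, f1))
        else:
            conn.execute("INSERT INTO mytable (field1, field2) VALUES (?, ?)", (f1, f2))
```

Note that a set (`f1 in existing_set`) would make each membership test O(1) instead of the O(n) scan that `list.count()` performs; the sketch just mirrors the question as asked.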
Assuming there is a unique constraint on `field1`, you can simply do:

    INSERT OR REPLACE INTO table VALUES (...)
The following syntax is also supported (with similar semantics):

    REPLACE INTO table VALUES (...)
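As a rough sketch of how this looks from the Python side (the file name, schema, and sample data are assumptions for illustration):

```python
import sqlite3

conn = sqlite3.connect("example.db")  # hypothetical database file
conn.execute(
    """CREATE TABLE IF NOT EXISTS mytable (
           id INTEGER PRIMARY KEY,
           field1 TEXT UNIQUE,  -- the unique constraint OR REPLACE keys off
           field2 TEXT
       )"""
)

new_rows = [("foo", "one"), ("bar", "two")]  # sample data

# One statement per new row; sqlite replaces any row whose field1 collides.
conn.executemany(
    "INSERT OR REPLACE INTO mytable (field1, field2) VALUES (?, ?)", new_rows
)
conn.commit()
```

One caveat: `OR REPLACE` resolves the conflict by deleting the old row and inserting a new one, so the conflicting row's `id` changes; if preserving `id` matters, an explicit `UPDATE` is safer.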
Edit: I realize this doesn't really answer your question, since I'm not comparing the two approaches; I'm just offering an alternative solution that should be fast.