Here is more information about Implementation Limits for SQLite.

I had the same problem when I was using two scripts with the same database at the same time:

- one was accessing the DB with write operations;
- the other was accessing the DB read-only.

Solution: always call cursor.close() as soon as possible after running a (even read-only) query.

SQLite Error 11 ("database disk image is malformed") can arise if the database disk image is deformed or corrupted, which can result in database file corruption. If something goes wrong there, the API functions return error codes that help you track down the cause of the error.

I encountered this error message in a situation that is not (clearly) addressed by the help info linked in patrick's answer. When I used transaction.atomic() to wrap a call to get_or_create() and called that code simultaneously from two different threads, only one thread would succeed, while the other would get the "database is locked" error. Changing the timeout database option had no effect on the behavior. I think this is due to the fact that SQLite cannot handle multiple simultaneous writers, so the application must serialize writes on its own. I solved the problem by using a threading.RLock object instead of transaction.atomic() when my Django app is running with an SQLite backend. That's not entirely equivalent, so you may need to do something else in your application. Here's my code that runs get_or_create simultaneously from two different threads, in case it is helpful:

    from concurrent.futures import ThreadPoolExecutor

    from submissions.models import ExerciseCollectionSubmission

    def makeSubmission(submission_id):
        e, _ = ExerciseCollectionSubmission.objects.get_or_create(...)

    futures = []
    with ThreadPoolExecutor(max_workers=2) as executor:
        futures.append(executor.submit(makeSubmission, 296))
        futures.append(executor.submit(makeSubmission, ...))
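The lock-based workaround above can be sketched outside Django with the standard sqlite3 module. This is a minimal sketch, not the original author's code: the `app.db` filename, the `items` table, and the `get_or_create` helper are all hypothetical stand-ins. The idea is the same, though: a process-wide lock serializes writers, and cursors are closed promptly, so two threads hitting the database at once no longer race for the write lock.

```python
import sqlite3
import threading
from concurrent.futures import ThreadPoolExecutor

# SQLite allows only one writer at a time, so we serialize writes
# ourselves instead of relying on database-level locking.
_write_lock = threading.RLock()

def get_or_create(name):
    # Hypothetical helper mimicking Django's get_or_create() for a
    # simple one-column table; returns the rowid of the row.
    with _write_lock:
        conn = sqlite3.connect("app.db", timeout=5)
        try:
            cur = conn.cursor()
            cur.execute("CREATE TABLE IF NOT EXISTS items (name TEXT UNIQUE)")
            cur.execute("INSERT OR IGNORE INTO items (name) VALUES (?)", (name,))
            conn.commit()
            cur.execute("SELECT rowid FROM items WHERE name = ?", (name,))
            row = cur.fetchone()
            cur.close()  # close cursors as soon as possible, as advised above
            return row[0]
        finally:
            conn.close()

# Two threads call get_or_create simultaneously, as in the Django example.
with ThreadPoolExecutor(max_workers=2) as executor:
    ids = list(executor.map(get_or_create, ["a", "a"]))
```

Because the lock serializes the two calls, both threads see the same row instead of one of them failing with "database is locked".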
I slightly disagree with the accepted answer which, by quoting this doc, implicitly links the OP's problem ("Database is locked") to these suggestions:

- Switching to another database backend. At a certain point SQLite becomes too "lite" for real-world applications, and these sorts of concurrency errors indicate you've reached that point.
- Rewriting your code to reduce concurrency and ensure that database transactions are short-lived.
- Increasing the default timeout value by setting the timeout database option.

It is a bit "too easy" to incriminate SQLite for this problem. SQLite is very powerful when correctly used; it is not only a toy for small databases (fun fact: an SQLite database is limited in size to 140 terabytes). Unless you have a very busy server with thousands of connections in the same second, the reason for this "Database is locked" error is probably bad use of the API rather than a problem inherent to SQLite that would make it "too light".
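For completeness, here is how the timeout option mentioned above is set with Python's standard sqlite3 module (the `app.db` filename is just a placeholder). The `timeout` argument maps to SQLite's busy timeout, which controls how long a connection retries a locked database before raising "database is locked".

```python
import sqlite3

# Retry for up to 30 seconds when the database is locked by another
# connection, instead of failing almost immediately (default ~5 s).
# In Django's DATABASES setting the equivalent is OPTIONS = {"timeout": 30}.
conn = sqlite3.connect("app.db", timeout=30.0)

# The same setting is also available at the SQL level, in milliseconds:
conn.execute("PRAGMA busy_timeout = 30000")
conn.close()
```

Note that a longer timeout only papers over contention; if transactions hold the write lock for a long time, reducing their duration helps more than raising the timeout.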