
How to load an sqlite file into absurd sql #64

Open
oceanwap opened this issue Feb 8, 2023 · 3 comments

Comments

@oceanwap

oceanwap commented Feb 8, 2023

I am working on a project where I need to load a SQLite database from the server.
Although SQL.js provides a way to do this, I can't find any way to do it in absurd-sql. It would be great if someone could help me with this.

In Sql.js
https://github.com/sql-js/sql.js/wiki/Load-a-database-from-the-server
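
For reference, the approach on that wiki page boils down to fetching the file and handing the bytes to the sql.js Database constructor, roughly like this (assuming SQL came from initSqlJs; the path is a placeholder):

const response = await fetch('/path/to/database.sqlite');
const buf = await response.arrayBuffer();
// sql.js accepts the raw file bytes directly
const db = new SQL.Database(new Uint8Array(buf));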

@oceanwap
Author

@jlongster Can you help me with this?

@odinndagur

I was searching for solutions for this myself and found one that works pretty well in my case. I didn't find any way to upload an existing database file, but SQLite has the option to dump its schema and data into text files.

With the sqlite command line interface open, you can use the .output command to make it write to a file on disk instead of the terminal. Then you can use .schema to get the commands that create the tables, or .dump to get everything needed to recreate the database, into your text file.

-> ~ sqlite3 signtest.sqlite3
SQLite version 3.39.4 2022-09-07 20:51:41
Enter ".help" for usage hints.
sqlite> .output ~/dbschema.txt
sqlite> .schema

Yields (this is just for the first table):

CREATE TABLE IF NOT EXISTS "sign_collection" (
	"sign_id"	INTEGER,
	"collection_id"	INTEGER,
	"date_added"	NUMERIC,
	PRIMARY KEY("sign_id","collection_id"),
	FOREIGN KEY("sign_id") REFERENCES "sign"("id"),
	FOREIGN KEY("collection_id") REFERENCES "collection"("id")
);
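
.dump works the same way and additionally includes the INSERT statements for the data (the output file name here is just an example):

sqlite> .output ~/dbdump.txt
sqlite> .dump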

The output includes the semicolons, so splitting the file on semicolons gives you each command separately. The following JavaScript loads a text file with fetch and iterates over it, splitting on semicolons.

async function* splitTextFileBySemicolon(fileURL) {
  const utf8Decoder = new TextDecoder("utf-8");
  let response = await fetch(fileURL);
  let reader = response.body.getReader();
  let {value: chunk, done: readerDone} = await reader.read();
  chunk = chunk ? utf8Decoder.decode(chunk, {stream: true}) : "";

  let re = /;/gm;
  let startIndex = 0;

  for (;;) {
    let result = re.exec(chunk);
    if (!result) {
      if (readerDone) {
        break;
      }
      // No semicolon left in this chunk: stash the remainder, read the
      // next chunk, and restart the scan from the beginning.
      let remainder = chunk.substring(startIndex);
      ({value: chunk, done: readerDone} = await reader.read());
      chunk = remainder + (chunk ? utf8Decoder.decode(chunk, {stream: true}) : "");
      startIndex = re.lastIndex = 0;
      continue;
    }
    yield chunk.substring(startIndex, result.index);
    startIndex = re.lastIndex;
  }
  if (startIndex < chunk.length) {
    // the last statement didn't end in a semicolon
    yield chunk.substring(startIndex);
  }
}

The last step is to create the database as usual, then iterate over the .dump commands and execute them in the absurd-sql database.
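
For completeness, "as usual" means something like the setup from the absurd-sql README (the /sql mount point and db.sqlite file name are just the example names used there):

import initSqlJs from '@jlongster/sql.js';
import { SQLiteFS } from 'absurd-sql';
import IndexedDBBackend from 'absurd-sql/dist/indexeddb-backend';

async function createDatabase() {
  let SQL = await initSqlJs({ locateFile: file => file });
  let sqlFS = new SQLiteFS(SQL.FS, new IndexedDBBackend());
  SQL.register_for_idb(sqlFS);

  // Mount the IndexedDB-backed filesystem and open a database file on it.
  SQL.FS.mkdir('/sql');
  SQL.FS.mount(sqlFS, {}, '/sql');
  return new SQL.Database('/sql/db.sqlite', { filename: true });
}

With that db in hand, executing the dump is just: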

for await (let line of splitTextFileBySemicolon('dbschema.txt')) {
    try {
        db.exec(line);
    } catch (error) {
        console.error(error);
    }
}

It might take a bit of messing with the dump files. The first time I tried it, I just dumped the whole database and it worked fine, but sometimes it's a bit more finicky and I have to arrange the commands a bit better (see the sketch after this list):

  • create all the tables first,
  • then insert the data,
  • then create the indexes.
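
A rough sketch of that reordering, buffering the statements into buckets before executing them (the prefix matching is an assumption about what the dump contains; adjust it to your file):

async function loadDumpInOrder(db, fileURL) {
  const tables = [], inserts = [], indexes = [], rest = [];

  // Bucket each statement so tables are created before data is
  // inserted, and indexes are built last.
  for await (let stmt of splitTextFileBySemicolon(fileURL)) {
    const s = stmt.trim().toUpperCase();
    if (s.startsWith('CREATE TABLE')) tables.push(stmt);
    else if (s.startsWith('INSERT')) inserts.push(stmt);
    else if (s.startsWith('CREATE INDEX') || s.startsWith('CREATE UNIQUE INDEX')) indexes.push(stmt);
    else rest.push(stmt);
  }

  for (let stmt of [...tables, ...inserts, ...indexes, ...rest]) {
    try {
      db.exec(stmt);
    } catch (error) {
      console.error(error);
    }
  }
}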

@oceanwap
Author

@odinndagur I really appreciate that you took the time to write up this solution, thank you.

I have been doing the same thing for the past few months, but inserting all the data again is slow compared to loading a SQLite file directly when you have a big file. Second, although it's quite unlikely, there is a small chance of an error while this whole process executes, especially if the file is large, say 5 MB or 50 MB.

To handle the problem of foreign keys and constraint checks, I just disable the checks themselves, which also makes insertions faster, yet it's not the best solution in my opinion. It would be ideal if we could just load the SQLite file, like sql.js does, which would probably take the same time or less than inserting statements one by one from a dump file. Also, there are about 100 lines of code sitting around just for this, and I am not sure whether it will occasionally break during a large dump (I have taken enough measures for error handling and recovery).
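
For reference, the check I disable is SQLite's standard foreign_keys pragma, and wrapping the whole dump in a single transaction is the other usual speedup. A sketch, reusing splitTextFileBySemicolon from above:

db.exec('PRAGMA foreign_keys = OFF;');
db.exec('BEGIN TRANSACTION;');
for await (let stmt of splitTextFileBySemicolon('dbschema.txt')) {
  db.exec(stmt); // constraint checks are skipped while the pragma is off
}
db.exec('COMMIT;');
db.exec('PRAGMA foreign_keys = ON;');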
