;doc: import: edits
parent 210f28a7b5
commit 2a67aa327b
@@ -49,9 +49,11 @@ Tips:
 you can reduce the chance of this happening in new transactions by importing more often.
 (If it happens in old transactions, that's harmless.)
 
-Note this is just one kind of "deduplication": avoiding reprocessing the same dates across successive runs.
-`import` doesn't detect other kinds of duplication, such as the same transaction appearing multiple times within a single run.
-(Because that sometimes happens legitimately in real-world data.)
+Note this is just one kind of "deduplication": not reprocessing the same dates across successive runs.
+`import` doesn't detect other kinds of duplication, such as
+the same transaction appearing multiple times within a single run,
+or a new transaction that looks identical to a transaction already in the journal.
+(Because these can happen legitimately in real-world data.)
 
 Here's a situation where you need to run `import` with care:
 say you download but forget to import `bank.1.csv`, and a week later you download `bank.2.csv` with some overlapping data.
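The kind of deduplication the edited text describes, skipping dates already seen in earlier runs, can be sketched as a small model. This is a simplified illustration, not hledger's actual implementation; the function name, the transaction tuples, and the dates are all hypothetical:

```python
from datetime import date

def select_new_transactions(txns, latest_seen):
    # Keep only transactions dated strictly after the latest date
    # recorded by a previous import run. Transactions on or before
    # that date are assumed to have been processed already.
    return [t for t in txns if t[0] > latest_seen]

# A second download that overlaps the first: only the genuinely
# new (later-dated) transaction survives.
txns = [(date(2024, 1, 5), "coffee"), (date(2024, 1, 9), "rent")]
print(select_new_transactions(txns, date(2024, 1, 5)))
```

This also shows why the careful-run caveat matters: the filter keys only on dates, so two different same-dated transactions within one download, or a new transaction identical to one already in the journal, would not be flagged as duplicates.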