Simon Michael 2020-06-06 17:21:18 -07:00
parent a54376e204
commit 09b6d44562
21 changed files with 720 additions and 729 deletions


@ -1,6 +1,6 @@
.\"t .\"t
.TH "hledger_csv" "5" "June 2020" "hledger 1.17.99" "hledger User Manuals" .TH "hledger_csv" "5" "June 2020" "hledger 1.18" "hledger User Manuals"
@ -81,6 +81,11 @@ T{
T}@T{ T}@T{
inline another CSV rules file inline another CSV rules file
T} T}
T{
\f[B]\f[CB]balance-type\f[B]\f[R]
T}@T{
choose which type of balance assignments to use
T}
.TE .TE
.PP .PP
Note, for best error messages when reading CSV files, use a Note, for best error messages when reading CSV files, use a


@ -3,8 +3,8 @@ This is hledger_csv.info, produced by makeinfo version 6.7 from stdin.
 
File: hledger_csv.info, Node: Top, Next: EXAMPLES, Up: (dir) File: hledger_csv.info, Node: Top, Next: EXAMPLES, Up: (dir)
hledger_csv(5) hledger 1.17.99 hledger_csv(5) hledger 1.18
****************************** ***************************
CSV - how hledger reads CSV data, and the CSV rules file format CSV - how hledger reads CSV data, and the CSV rules file format
@ -38,6 +38,7 @@ assignment*
*'date-format'* describe the format of CSV dates *'date-format'* describe the format of CSV dates
*'newest-first'* disambiguate record order when there's only one date *'newest-first'* disambiguate record order when there's only one date
*'include'* inline another CSV rules file *'include'* inline another CSV rules file
*'balance-type'* choose which type of balance assignments to use
Note, for best error messages when reading CSV files, use a '.csv', Note, for best error messages when reading CSV files, use a '.csv',
'.tsv' or '.ssv' file extension or file prefix - see File Extension '.tsv' or '.ssv' file extension or file prefix - see File Extension
@ -1035,74 +1036,74 @@ command the user specified.
 
Tag Table: Tag Table:
Node: Top72 Node: Top72
Node: EXAMPLES2113 Node: EXAMPLES2174
Ref: #examples2219 Ref: #examples2280
Node: Basic2427 Node: Basic2488
Ref: #basic2527 Ref: #basic2588
Node: Bank of Ireland3069 Node: Bank of Ireland3130
Ref: #bank-of-ireland3204 Ref: #bank-of-ireland3265
Node: Amazon4666 Node: Amazon4727
Ref: #amazon4784 Ref: #amazon4845
Node: Paypal6503 Node: Paypal6564
Ref: #paypal6597 Ref: #paypal6658
Node: CSV RULES14241 Node: CSV RULES14302
Ref: #csv-rules14350 Ref: #csv-rules14411
Node: skip14626 Node: skip14687
Ref: #skip14719 Ref: #skip14780
Node: fields15094 Node: fields15155
Ref: #fields15216 Ref: #fields15277
Node: Transaction field names16381 Node: Transaction field names16442
Ref: #transaction-field-names16541 Ref: #transaction-field-names16602
Node: Posting field names16652 Node: Posting field names16713
Ref: #posting-field-names16804 Ref: #posting-field-names16865
Node: account16874 Node: account16935
Ref: #account16990 Ref: #account17051
Node: amount17527 Node: amount17588
Ref: #amount17658 Ref: #amount17719
Node: currency18765 Node: currency18826
Ref: #currency18900 Ref: #currency18961
Node: balance19106 Node: balance19167
Ref: #balance19240 Ref: #balance19301
Node: comment19557 Node: comment19618
Ref: #comment19674 Ref: #comment19735
Node: field assignment19837 Node: field assignment19898
Ref: #field-assignment19980 Ref: #field-assignment20041
Node: separator20798 Node: separator20859
Ref: #separator20927 Ref: #separator20988
Node: if21338 Node: if21399
Ref: #if21440 Ref: #if21501
Node: end23596 Node: end23657
Ref: #end23702 Ref: #end23763
Node: date-format23926 Node: date-format23987
Ref: #date-format24058 Ref: #date-format24119
Node: newest-first24807 Node: newest-first24868
Ref: #newest-first24945 Ref: #newest-first25006
Node: include25628 Node: include25689
Ref: #include25757 Ref: #include25818
Node: balance-type26201 Node: balance-type26262
Ref: #balance-type26321 Ref: #balance-type26382
Node: TIPS27021 Node: TIPS27082
Ref: #tips27103 Ref: #tips27164
Node: Rapid feedback27359 Node: Rapid feedback27420
Ref: #rapid-feedback27476 Ref: #rapid-feedback27537
Node: Valid CSV27936 Node: Valid CSV27997
Ref: #valid-csv28066 Ref: #valid-csv28127
Node: File Extension28258 Node: File Extension28319
Ref: #file-extension28410 Ref: #file-extension28471
Node: Reading multiple CSV files28820 Node: Reading multiple CSV files28881
Ref: #reading-multiple-csv-files29005 Ref: #reading-multiple-csv-files29066
Node: Valid transactions29246 Node: Valid transactions29307
Ref: #valid-transactions29424 Ref: #valid-transactions29485
Node: Deduplicating importing30052 Node: Deduplicating importing30113
Ref: #deduplicating-importing30231 Ref: #deduplicating-importing30292
Node: Setting amounts31264 Node: Setting amounts31325
Ref: #setting-amounts31433 Ref: #setting-amounts31494
Node: Setting currency/commodity32419 Node: Setting currency/commodity32480
Ref: #setting-currencycommodity32611 Ref: #setting-currencycommodity32672
Node: Referencing other fields33414 Node: Referencing other fields33475
Ref: #referencing-other-fields33614 Ref: #referencing-other-fields33675
Node: How CSV rules are evaluated34511 Node: How CSV rules are evaluated34572
Ref: #how-csv-rules-are-evaluated34684 Ref: #how-csv-rules-are-evaluated34745
 
End Tag Table End Tag Table


@ -45,20 +45,22 @@ DESCRIPTION
when there's only one date when there's only one date
include inline another CSV rules include inline another CSV rules
file file
balance-type choose which type of bal-
ance assignments to use
Note, for best error messages when reading CSV files, use a .csv, .tsv Note, for best error messages when reading CSV files, use a .csv, .tsv
or .ssv file extension or file prefix - see File Extension below. or .ssv file extension or file prefix - see File Extension below.
There's an introductory Convert CSV files tutorial on hledger.org. There's an introductory Convert CSV files tutorial on hledger.org.
EXAMPLES EXAMPLES
Here are some sample hledger CSV rules files. See also the full col- Here are some sample hledger CSV rules files. See also the full col-
lection at: lection at:
https://github.com/simonmichael/hledger/tree/master/examples/csv https://github.com/simonmichael/hledger/tree/master/examples/csv
Basic Basic
At minimum, the rules file must identify the date and amount fields, At minimum, the rules file must identify the date and amount fields,
and often it also specifies the date format and how many header lines and often it also specifies the date format and how many header lines
there are. Here's a simple CSV file and a rules file for it: there are. Here's a simple CSV file and a rules file for it:
Date, Description, Id, Amount Date, Description, Id, Amount
@ -77,8 +79,8 @@ EXAMPLES
Default account names are chosen, since we didn't set them. Default account names are chosen, since we didn't set them.
Bank of Ireland Bank of Ireland
Here's a CSV with two amount fields (Debit and Credit), and a balance Here's a CSV with two amount fields (Debit and Credit), and a balance
field, which we can use to add balance assertions, which is not neces- field, which we can use to add balance assertions, which is not neces-
sary but provides extra error checking: sary but provides extra error checking:
Date,Details,Debit,Credit,Balance Date,Details,Debit,Credit,Balance
@ -120,13 +122,13 @@ EXAMPLES
assets:bank:boi:checking EUR-5.0 = EUR126.0 assets:bank:boi:checking EUR-5.0 = EUR126.0
expenses:unknown EUR5.0 expenses:unknown EUR5.0
The balance assertions don't raise an error above, because we're read- The balance assertions don't raise an error above, because we're read-
ing directly from CSV, but they will be checked if these entries are ing directly from CSV, but they will be checked if these entries are
imported into a journal file. imported into a journal file.
Amazon Amazon
Here we convert amazon.com order history, and use an if block to gener- Here we convert amazon.com order history, and use an if block to gener-
ate a third posting if there's a fee. (In practice you'd probably get ate a third posting if there's a fee. (In practice you'd probably get
this data from your bank instead, but it's an example.) this data from your bank instead, but it's an example.)
"Date","Type","To/From","Name","Status","Amount","Fees","Transaction ID" "Date","Type","To/From","Name","Status","Amount","Fees","Transaction ID"
@ -178,7 +180,7 @@ EXAMPLES
expenses:fees $1.00 expenses:fees $1.00
Paypal Paypal
Here's a real-world rules file for (customised) Paypal CSV, with some Here's a real-world rules file for (customised) Paypal CSV, with some
Paypal-specific rules, and a second rules file included: Paypal-specific rules, and a second rules file included:
"Date","Time","TimeZone","Name","Type","Status","Currency","Gross","Fee","Net","From Email Address","To Email Address","Transaction ID","Item Title","Item ID","Reference Txn ID","Receipt ID","Balance","Note" "Date","Time","TimeZone","Name","Type","Status","Currency","Gross","Fee","Net","From Email Address","To Email Address","Transaction ID","Item Title","Item ID","Reference Txn ID","Receipt ID","Balance","Note"
@ -333,9 +335,9 @@ CSV RULES
skip skip
skip N skip N
The word "skip" followed by a number (or no number, meaning 1) tells The word "skip" followed by a number (or no number, meaning 1) tells
hledger to ignore this many non-empty lines preceding the CSV data. hledger to ignore this many non-empty lines preceding the CSV data.
(Empty/blank lines are skipped automatically.) You'll need this when- (Empty/blank lines are skipped automatically.) You'll need this when-
ever your CSV data contains header lines. ever your CSV data contains header lines.
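For example, a minimal sketch for a hypothetical CSV that begins with one
row of column headings:
# ignore the single header line at the top of the file
skip 1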
It also has a second purpose: it can be used inside if blocks to ignore It also has a second purpose: it can be used inside if blocks to ignore
@ -344,27 +346,27 @@ CSV RULES
fields fields
fields FIELDNAME1, FIELDNAME2, ... fields FIELDNAME1, FIELDNAME2, ...
A fields list (the word "fields" followed by comma-separated field A fields list (the word "fields" followed by comma-separated field
names) is the quick way to assign CSV field values to hledger fields. names) is the quick way to assign CSV field values to hledger fields.
It does two things: It does two things:
1. it names the CSV fields. This is optional, but can be convenient 1. it names the CSV fields. This is optional, but can be convenient
later for interpolating them. later for interpolating them.
2. when you use a standard hledger field name, it assigns the CSV value 2. when you use a standard hledger field name, it assigns the CSV value
to that part of the hledger transaction. to that part of the hledger transaction.
Here's an example that says "use the 1st, 2nd and 4th fields as the Here's an example that says "use the 1st, 2nd and 4th fields as the
transaction's date, description and amount; name the last two fields transaction's date, description and amount; name the last two fields
for later reference; and ignore the others": for later reference; and ignore the others":
fields date, description, , amount, , , somefield, anotherfield fields date, description, , amount, , , somefield, anotherfield
Field names may not contain whitespace. Fields you don't care about Field names may not contain whitespace. Fields you don't care about
can be left unnamed. Currently there must be at least two items (there can be left unnamed. Currently there must be at least two items (there
must be at least one comma). must be at least one comma).
Note, always use comma in the fields list, even if your CSV uses an- Note, always use comma in the fields list, even if your CSV uses an-
other separator character. other separator character.
Here are the standard hledger field/pseudo-field names. For more about Here are the standard hledger field/pseudo-field names. For more about
@ -377,52 +379,52 @@ CSV RULES
Posting field names Posting field names
account account
accountN, where N is 1 to 99, causes a posting to be generated, with accountN, where N is 1 to 99, causes a posting to be generated, with
that account name. that account name.
Most often there are two postings, so you'll want to set account1 and Most often there are two postings, so you'll want to set account1 and
account2. Typically account1 is associated with the CSV file, and is account2. Typically account1 is associated with the CSV file, and is
set once with a top-level assignment, while account2 is set based on set once with a top-level assignment, while account2 is set based on
each transaction's description, and in conditional blocks. each transaction's description, and in conditional blocks.
If a posting's account name is left unset but its amount is set (see If a posting's account name is left unset but its amount is set (see
below), a default account name will be chosen (like "expenses:unknown" below), a default account name will be chosen (like "expenses:unknown"
or "income:unknown"). or "income:unknown").
amount amount
amountN sets posting N's amount. If the CSV uses separate fields for amountN sets posting N's amount. If the CSV uses separate fields for
inflows and outflows, you can use amountN-in and amountN-out instead. inflows and outflows, you can use amountN-in and amountN-out instead.
By assigning to amount1, amount2, ... etc. you can generate anywhere By assigning to amount1, amount2, ... etc. you can generate anywhere
from 0 to 99 postings. from 0 to 99 postings.
There is also an older, unnumbered form of these names, suitable for There is also an older, unnumbered form of these names, suitable for
2-posting transactions, which sets both posting 1's and (negated) post- 2-posting transactions, which sets both posting 1's and (negated) post-
ing 2's amount: amount, or amount-in and amount-out. This is still ing 2's amount: amount, or amount-in and amount-out. This is still
supported because it keeps pre-hledger-1.17 csv rules files working, supported because it keeps pre-hledger-1.17 csv rules files working,
and because it can be more succinct, and because it converts posting and because it can be more succinct, and because it converts posting
2's amount to cost if there's a transaction price, which can be useful. 2's amount to cost if there's a transaction price, which can be useful.
If you have an existing rules file using the unnumbered form, you might If you have an existing rules file using the unnumbered form, you might
want to use the numbered form in certain conditional blocks, without want to use the numbered form in certain conditional blocks, without
having to update and retest all the old rules. To facilitate this, having to update and retest all the old rules. To facilitate this,
posting 1 ignores amount/amount-in/amount-out if any of posting 1 ignores amount/amount-in/amount-out if any of
amount1/amount1-in/amount1-out are assigned, and posting 2 ignores them amount1/amount1-in/amount1-out are assigned, and posting 2 ignores them
if any of amount2/amount2-in/amount2-out are assigned, avoiding con- if any of amount2/amount2-in/amount2-out are assigned, avoiding con-
flicts. flicts.
currency currency
If the CSV has the currency symbol in a separate field (ie, not part of If the CSV has the currency symbol in a separate field (ie, not part of
the amount field), you can use currencyN to prepend it to posting N's the amount field), you can use currencyN to prepend it to posting N's
amount. Or, currency with no number affects all postings. amount. Or, currency with no number affects all postings.
balance balance
balanceN sets a balance assertion amount (or if the posting amount is balanceN sets a balance assertion amount (or if the posting amount is
left empty, a balance assignment) on posting N. left empty, a balance assignment) on posting N.
Also, for compatibility with hledger <1.17: balance with no number is Also, for compatibility with hledger <1.17: balance with no number is
equivalent to balance1. equivalent to balance1.
You can adjust the type of assertion/assignment with the balance-type You can adjust the type of assertion/assignment with the balance-type
rule (see below). rule (see below).
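For instance, assuming a hypothetical CSV whose last column is a running
balance, a sketch that asserts it on posting 1:
fields date, description, amount1, balance1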
comment comment
@ -434,11 +436,11 @@ CSV RULES
field assignment field assignment
HLEDGERFIELDNAME FIELDVALUE HLEDGERFIELDNAME FIELDVALUE
Instead of or in addition to a fields list, you can use a "field as- Instead of or in addition to a fields list, you can use a "field as-
signment" rule to set the value of a single hledger field, by writing signment" rule to set the value of a single hledger field, by writing
its name (any of the standard hledger field names above) followed by a its name (any of the standard hledger field names above) followed by a
text value. The value may contain interpolated CSV fields, referenced text value. The value may contain interpolated CSV fields, referenced
by their 1-based position in the CSV record (%N), or by the name they by their 1-based position in the CSV record (%N), or by the name they
were given in the fields list (%CSVFIELDNAME). Some examples: were given in the fields list (%CSVFIELDNAME). Some examples:
# set the amount to the 4th CSV field, with " USD" appended # set the amount to the 4th CSV field, with " USD" appended
@ -447,18 +449,18 @@ CSV RULES
# combine three fields to make a comment, containing note: and date: tags # combine three fields to make a comment, containing note: and date: tags
comment note: %somefield - %anotherfield, date: %1 comment note: %somefield - %anotherfield, date: %1
Interpolation strips outer whitespace (so a CSV value like " 1 " be- Interpolation strips outer whitespace (so a CSV value like " 1 " be-
comes 1 when interpolated) (#1051). See TIPS below for more about ref- comes 1 when interpolated) (#1051). See TIPS below for more about ref-
erencing other fields. erencing other fields.
separator separator
You can use the separator directive to read other kinds of character- You can use the separator directive to read other kinds of character-
separated data. Eg to read SSV (Semicolon Separated Values), use: separated data. Eg to read SSV (Semicolon Separated Values), use:
separator ; separator ;
The separator directive accepts exactly one single-byte character as a The separator directive accepts exactly one single-byte character as a
separator. To specify whitespace characters, you may use the special separator. To specify whitespace characters, you may use the special
words TAB or SPACE. Eg to read TSV (Tab Separated Values), use: words TAB or SPACE. Eg to read TSV (Tab Separated Values), use:
separator TAB separator TAB
@ -476,24 +478,24 @@ CSV RULES
RULE RULE
RULE RULE
Conditional blocks ("if blocks") are a block of rules that are applied Conditional blocks ("if blocks") are a block of rules that are applied
only to CSV records which match certain patterns. They are often used only to CSV records which match certain patterns. They are often used
for customising account names based on transaction descriptions. for customising account names based on transaction descriptions.
Each MATCHER can be a record matcher, which looks like this: Each MATCHER can be a record matcher, which looks like this:
REGEX REGEX
REGEX is a case-insensitive regular expression which tries to match REGEX is a case-insensitive regular expression which tries to match
anywhere within the CSV record. It is a POSIX ERE (extended regular anywhere within the CSV record. It is a POSIX ERE (extended regular
expression) that also supports GNU word boundaries (\b, \B, \<, \>), expression) that also supports GNU word boundaries (\b, \B, \<, \>),
and nothing else. If you have trouble, be sure to check our and nothing else. If you have trouble, be sure to check our
https://hledger.org/hledger.html#regular-expressions doc. https://hledger.org/hledger.html#regular-expressions doc.
Important note: the record that is matched is not the original record, Important note: the record that is matched is not the original record,
but a synthetic one, with any enclosing double quotes (but not enclos- but a synthetic one, with any enclosing double quotes (but not enclos-
ing whitespace) removed, and always comma-separated (which means that a ing whitespace) removed, and always comma-separated (which means that a
field containing a comma will appear like two fields). Eg, if the field containing a comma will appear like two fields). Eg, if the
original record is 2020-01-01; "Acme, Inc."; 1,000, the REGEX will ac- original record is 2020-01-01; "Acme, Inc."; 1,000, the REGEX will ac-
tually see 2020-01-01,Acme, Inc., 1,000. tually see 2020-01-01,Acme, Inc., 1,000.
@ -501,16 +503,16 @@ CSV RULES
%CSVFIELD REGEX %CSVFIELD REGEX
which matches just the content of a particular CSV field. CSVFIELD is which matches just the content of a particular CSV field. CSVFIELD is
a percent sign followed by the field's name or column number, like a percent sign followed by the field's name or column number, like
%date or %1. %date or %1.
A single matcher can be written on the same line as the "if"; or multi- A single matcher can be written on the same line as the "if"; or multi-
ple matchers can be written on the following lines, non-indented. Mul- ple matchers can be written on the following lines, non-indented. Mul-
tiple matchers are OR'd (any one of them can match). tiple matchers are OR'd (any one of them can match).
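For example, a sketch with two matchers (either may match), each on its
own non-indented line, followed by one indented rule (the account name is
made up):
if
SUPERMARKET
GROCERY
 account2 expenses:groceries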
After the patterns there should be one or more rules to apply, all in- After the patterns there should be one or more rules to apply, all in-
dented by at least one space. Three kinds of rule are allowed in con- dented by at least one space. Three kinds of rule are allowed in con-
ditional blocks: ditional blocks:
o field assignments (to set a hledger field) o field assignments (to set a hledger field)
@ -534,7 +536,7 @@ CSV RULES
comment XXX deductible ? check it comment XXX deductible ? check it
end end
This rule can be used inside if blocks (only), to make hledger stop This rule can be used inside if blocks (only), to make hledger stop
reading this CSV file and move on to the next input file, or to command reading this CSV file and move on to the next input file, or to command
execution. Eg: execution. Eg:
@ -545,10 +547,10 @@ CSV RULES
date-format date-format
date-format DATEFMT date-format DATEFMT
This is a helper for the date (and date2) fields. If your CSV dates This is a helper for the date (and date2) fields. If your CSV dates
are not formatted like YYYY-MM-DD, YYYY/MM/DD or YYYY.MM.DD, you'll are not formatted like YYYY-MM-DD, YYYY/MM/DD or YYYY.MM.DD, you'll
need to add a date-format rule describing them with a strptime date need to add a date-format rule describing them with a strptime date
parsing pattern, which must parse the CSV date value completely. Some parsing pattern, which must parse the CSV date value completely. Some
examples: examples:
# MM/DD/YY # MM/DD/YY
@ -570,15 +572,15 @@ CSV RULES
mat.html#v:formatTime mat.html#v:formatTime
newest-first newest-first
hledger always sorts the generated transactions by date. Transactions hledger always sorts the generated transactions by date. Transactions
on the same date should appear in the same order as their CSV records, on the same date should appear in the same order as their CSV records,
as hledger can usually auto-detect whether the CSV's normal order is as hledger can usually auto-detect whether the CSV's normal order is
oldest first or newest first. But if all of the following are true: oldest first or newest first. But if all of the following are true:
o the CSV might sometimes contain just one day of data (all records o the CSV might sometimes contain just one day of data (all records
having the same date) having the same date)
o the CSV records are normally in reverse chronological order (newest o the CSV records are normally in reverse chronological order (newest
at the top) at the top)
o and you care about preserving the order of same-day transactions o and you care about preserving the order of same-day transactions
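A minimal sketch of using this directive (assuming all three conditions
above hold):
# tell hledger the CSV records are newest first
newest-first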
@ -591,9 +593,9 @@ CSV RULES
include include
include RULESFILE include RULESFILE
This includes the contents of another CSV rules file at this point. This includes the contents of another CSV rules file at this point.
RULESFILE is an absolute file path or a path relative to the current RULESFILE is an absolute file path or a path relative to the current
file's directory. This can be useful for sharing common rules between file's directory. This can be useful for sharing common rules between
several rules files, eg: several rules files, eg:
# someaccount.csv.rules # someaccount.csv.rules
@ -608,10 +610,10 @@ CSV RULES
balance-type balance-type
Balance assertions generated by assigning to balanceN are of the simple Balance assertions generated by assigning to balanceN are of the simple
= type by default, which is a single-commodity, subaccount-excluding = type by default, which is a single-commodity, subaccount-excluding
assertion. You may find the subaccount-including variants more useful, assertion. You may find the subaccount-including variants more useful,
eg if you have created some virtual subaccounts of checking to help eg if you have created some virtual subaccounts of checking to help
with budgeting. You can select a different type of assertion with the with budgeting. You can select a different type of assertion with the
balance-type rule: balance-type rule:
# balance assertions will consider all commodities and all subaccounts # balance assertions will consider all commodities and all subaccounts
@ -626,19 +628,19 @@ CSV RULES
TIPS TIPS
Rapid feedback Rapid feedback
It's a good idea to get rapid feedback while creating/troubleshooting It's a good idea to get rapid feedback while creating/troubleshooting
CSV rules. Here's a good way, using entr from http://eradman.com/entr- CSV rules. Here's a good way, using entr from http://eradman.com/entr-
project : project :
$ ls foo.csv* | entr bash -c 'echo ----; hledger -f foo.csv print desc:SOMEDESC' $ ls foo.csv* | entr bash -c 'echo ----; hledger -f foo.csv print desc:SOMEDESC'
A desc: query (eg) is used to select just one, or a few, transactions A desc: query (eg) is used to select just one, or a few, transactions
of interest. "bash -c" is used to run multiple commands, so we can of interest. "bash -c" is used to run multiple commands, so we can
echo a separator each time the command re-runs, making it easier to echo a separator each time the command re-runs, making it easier to
read the output. read the output.
Valid CSV Valid CSV
hledger accepts CSV conforming to RFC 4180. When CSV values are en- hledger accepts CSV conforming to RFC 4180. When CSV values are en-
closed in quotes, note: closed in quotes, note:
o they must be double quotes (not single quotes) o they must be double quotes (not single quotes)
@ -646,9 +648,9 @@ TIPS
o spaces outside the quotes are not allowed o spaces outside the quotes are not allowed
File Extension File Extension
CSV ("Character Separated Values") files should be named with one of CSV ("Character Separated Values") files should be named with one of
these filename extensions: .csv, .ssv, .tsv. Or, the file path should these filename extensions: .csv, .ssv, .tsv. Or, the file path should
be prefixed with one of csv:, ssv:, tsv:. This helps hledger identify be prefixed with one of csv:, ssv:, tsv:. This helps hledger identify
the format and show the right error messages. For example: the format and show the right error messages. For example:
$ hledger -f foo.ssv print $ hledger -f foo.ssv print
@ -660,44 +662,44 @@ TIPS
More about this: Input files in the hledger manual. More about this: Input files in the hledger manual.
Reading multiple CSV files Reading multiple CSV files
If you use multiple -f options to read multiple CSV files at once, If you use multiple -f options to read multiple CSV files at once,
hledger will look for a correspondingly-named rules file for each CSV hledger will look for a correspondingly-named rules file for each CSV
file. But if you use the --rules-file option, that rules file will be file. But if you use the --rules-file option, that rules file will be
used for all the CSV files. used for all the CSV files.
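For example (the file names here are hypothetical):
# looks for bank1.csv.rules and bank2.csv.rules respectively
$ hledger -f bank1.csv -f bank2.csv print
# uses common.rules for both files
$ hledger -f bank1.csv -f bank2.csv --rules-file common.rules print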
Valid transactions Valid transactions
After reading a CSV file, hledger post-processes and validates the gen- After reading a CSV file, hledger post-processes and validates the gen-
erated journal entries as it would for a journal file - balancing them, erated journal entries as it would for a journal file - balancing them,
applying balance assignments, and canonicalising amount styles. Any applying balance assignments, and canonicalising amount styles. Any
errors at this stage will be reported in the usual way, displaying the errors at this stage will be reported in the usual way, displaying the
problem entry. problem entry.
There is one exception: balance assertions, if you have generated them, There is one exception: balance assertions, if you have generated them,
will not be checked, since normally these will work only when the CSV will not be checked, since normally these will work only when the CSV
data is part of the main journal. If you do need to check balance as- data is part of the main journal. If you do need to check balance as-
sertions generated from CSV right away, pipe into another hledger: sertions generated from CSV right away, pipe into another hledger:
$ hledger -f file.csv print | hledger -f- print $ hledger -f file.csv print | hledger -f- print
Deduplicating, importing Deduplicating, importing
When you download a CSV file periodically, eg to get your latest bank When you download a CSV file periodically, eg to get your latest bank
transactions, the new file may overlap with the old one, containing transactions, the new file may overlap with the old one, containing
some of the same records. some of the same records.
The import command will (a) detect the new transactions, and (b) append The import command will (a) detect the new transactions, and (b) append
just those transactions to your main journal. It is idempotent, so you just those transactions to your main journal. It is idempotent, so you
don't have to remember how many times you ran it or with which version don't have to remember how many times you ran it or with which version
of the CSV. (It keeps state in a hidden .latest.FILE.csv file.) This of the CSV. (It keeps state in a hidden .latest.FILE.csv file.) This
is the easiest way to import CSV data. Eg: is the easiest way to import CSV data. Eg:
# download the latest CSV files, then run this command. # download the latest CSV files, then run this command.
# Note, no -f flags needed here. # Note, no -f flags needed here.
$ hledger import *.csv [--dry] $ hledger import *.csv [--dry]
This method works for most CSV files. (Where records have a stable This method works for most CSV files. (Where records have a stable
chronological order, and new records appear only at the new end.) chronological order, and new records appear only at the new end.)
A number of other tools and workflows, hledger-specific and otherwise, A number of other tools and workflows, hledger-specific and otherwise,
exist for converting, deduplicating, classifying and managing CSV data. exist for converting, deduplicating, classifying and managing CSV data.
See: See:
@ -708,43 +710,43 @@ TIPS
Setting amounts Setting amounts
A posting amount can be set in one of these ways: A posting amount can be set in one of these ways:
o by assigning (with a fields list or field assignment) to amountN o by assigning (with a fields list or field assignment) to amountN
(posting N's amount) or amount (posting 1's amount) (posting N's amount) or amount (posting 1's amount)
o by assigning to amountN-in and amountN-out (or amount-in and amount- o by assigning to amountN-in and amountN-out (or amount-in and amount-
out). For each CSV record, whichever of these has a non-zero value out). For each CSV record, whichever of these has a non-zero value
will be used, with appropriate sign. If both contain a non-zero will be used, with appropriate sign. If both contain a non-zero
value, this may not work. value, this may not work.
o by assigning to balanceN (or balance) instead of the above, setting o by assigning to balanceN (or balance) instead of the above, setting
the amount indirectly via a balance assignment. If you do this the the amount indirectly via a balance assignment. If you do this the
default account name may be wrong, so you should set that explicitly. default account name may be wrong, so you should set that explicitly.
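A minimal sketch of the second approach, assuming a hypothetical CSV with
separate Money In and Money Out columns as its third and fourth fields:
fields date, description, amount1-in, amount1-out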
There is some special handling for an amount's sign: There is some special handling for an amount's sign:
o If an amount value is parenthesised, it will be de-parenthesised and o If an amount value is parenthesised, it will be de-parenthesised and
sign-flipped. sign-flipped.
o If an amount value begins with a double minus sign, those cancel out o If an amount value begins with a double minus sign, those cancel out
and are removed. and are removed.
o If an amount value begins with a plus sign, that will be removed o If an amount value begins with a plus sign, that will be removed
Setting currency/commodity Setting currency/commodity
If the currency/commodity symbol is included in the CSV's amount If the currency/commodity symbol is included in the CSV's amount
field(s), you don't have to do anything special. field(s), you don't have to do anything special.
If the currency is provided as a separate CSV field, you can either: If the currency is provided as a separate CSV field, you can either:
o assign that to currency, which adds it to all posting amounts. The o assign that to currency, which adds it to all posting amounts. The
symbol will be prepended to the amount quantity (on the left side). If symbol will be prepended to the amount quantity (on the left side). If
you write a trailing space after the symbol, there will be a space you write a trailing space after the symbol, there will be a space
between symbol and amount (an exception to the usual whitespace between symbol and amount (an exception to the usual whitespace
stripping). stripping).
o or assign it to currencyN which adds it to posting N's amount only. o or assign it to currencyN which adds it to posting N's amount only.
o or for more control, construct the amount from symbol and quantity o or for more control, construct the amount from symbol and quantity
using field assignment, eg: using field assignment, eg:
fields date,description,currency,quantity fields date,description,currency,quantity
@ -752,9 +754,9 @@ TIPS
amount %quantity %currency amount %quantity %currency
Referencing other fields Referencing other fields
In field assignments, you can interpolate only CSV fields, not hledger In field assignments, you can interpolate only CSV fields, not hledger
fields. In the example below, there's both a CSV field and a hledger fields. In the example below, there's both a CSV field and a hledger
field named amount1, but %amount1 always means the CSV field, not the field named amount1, but %amount1 always means the CSV field, not the
hledger field: hledger field:
# Name the third CSV field "amount1" # Name the third CSV field "amount1"
@ -766,7 +768,7 @@ TIPS
# Set comment to the CSV amount1 (not the amount1 assigned above) # Set comment to the CSV amount1 (not the amount1 assigned above)
comment %amount1 comment %amount1
Here, since there's no CSV amount1 field, %amount1 will produce a lit- Here, since there's no CSV amount1 field, %amount1 will produce a lit-
eral "amount1": eral "amount1":
fields date,description,csvamount fields date,description,csvamount
@ -774,7 +776,7 @@ TIPS
# Can't interpolate amount1 here # Can't interpolate amount1 here
comment %amount1 comment %amount1
When there are multiple field assignments to the same hledger field, When there are multiple field assignments to the same hledger field,
only the last one takes effect. Here, comment's value will be B, or only the last one takes effect. Here, comment's value will be B, or
C if "something" is matched, but never A: C if "something" is matched, but never A:
@ -784,14 +786,14 @@ TIPS
comment C comment C
How CSV rules are evaluated How CSV rules are evaluated
Here's how to think of CSV rules being evaluated (if you really need Here's how to think of CSV rules being evaluated (if you really need
to). First, to). First,
o include - all includes are inlined, from top to bottom, depth first. o include - all includes are inlined, from top to bottom, depth first.
(At each include point the file is inlined and scanned for further (At each include point the file is inlined and scanned for further
includes, recursively, before proceeding.) includes, recursively, before proceeding.)
Then "global" rules are evaluated, top to bottom. If a rule is re- Then "global" rules are evaluated, top to bottom. If a rule is re-
peated, the last one wins: peated, the last one wins:
o skip (at top level) o skip (at top level)
@ -805,30 +807,30 @@ TIPS
Then for each CSV record in turn: Then for each CSV record in turn:
o test all if blocks. If any of them contain an end rule, skip all re- o test all if blocks. If any of them contain an end rule, skip all re-
maining CSV records. Otherwise if any of them contain a skip rule, maining CSV records. Otherwise if any of them contain a skip rule,
skip that many CSV records. If there are multiple matched skip skip that many CSV records. If there are multiple matched skip
rules, the first one wins. rules, the first one wins.
o collect all field assignments at top level and in matched if blocks. o collect all field assignments at top level and in matched if blocks.
When there are multiple assignments for a field, keep only the last When there are multiple assignments for a field, keep only the last
one. one.
o compute a value for each hledger field - either the one that was as- o compute a value for each hledger field - either the one that was as-
signed to it (and interpolate the %CSVFIELDNAME references), or a de- signed to it (and interpolate the %CSVFIELDNAME references), or a de-
fault fault
o generate a synthetic hledger transaction from these values. o generate a synthetic hledger transaction from these values.
This is all part of the CSV reader, one of several readers hledger can This is all part of the CSV reader, one of several readers hledger can
use to parse input files. When all files have been read successfully, use to parse input files. When all files have been read successfully,
the transactions are passed as input to whichever hledger command the the transactions are passed as input to whichever hledger command the
user specified. user specified.
REPORTING BUGS REPORTING BUGS
Report bugs at http://bugs.hledger.org (or on the #hledger IRC channel Report bugs at http://bugs.hledger.org (or on the #hledger IRC channel
or hledger mail list) or hledger mail list)
@ -842,7 +844,7 @@ COPYRIGHT
SEE ALSO SEE ALSO
hledger(1), hledger-ui(1), hledger-web(1), hledger-api(1), hledger(1), hledger-ui(1), hledger-web(1), hledger-api(1),
hledger_csv(5), hledger_journal(5), hledger_timeclock(5), hledger_time- hledger_csv(5), hledger_journal(5), hledger_timeclock(5), hledger_time-
dot(5), ledger(1) dot(5), ledger(1)
@ -850,4 +852,4 @@ SEE ALSO
hledger 1.17.99 June 2020 hledger_csv(5) hledger 1.18 June 2020 hledger_csv(5)


@ -1,6 +1,6 @@
.\"t .\"t
.TH "hledger_journal" "5" "June 2020" "hledger 1.17.99" "hledger User Manuals" .TH "hledger_journal" "5" "June 2020" "hledger 1.18" "hledger User Manuals"


@ -4,8 +4,8 @@ stdin.
 
File: hledger_journal.info, Node: Top, Up: (dir) File: hledger_journal.info, Node: Top, Up: (dir)
hledger_journal(5) hledger 1.17.99 hledger_journal(5) hledger 1.18
********************************** *******************************
Journal - hledger's default file format, representing a General Journal Journal - hledger's default file format, representing a General Journal
@ -1823,124 +1823,124 @@ will have these tags added:
 
Tag Table: Tag Table:
Node: Top76 Node: Top76
Node: Transactions1875 Node: Transactions1869
Ref: #transactions1967 Ref: #transactions1961
Node: Dates3251 Node: Dates3245
Ref: #dates3350 Ref: #dates3344
Node: Simple dates3415 Node: Simple dates3409
Ref: #simple-dates3541 Ref: #simple-dates3535
Node: Secondary dates4050 Node: Secondary dates4044
Ref: #secondary-dates4204 Ref: #secondary-dates4198
Node: Posting dates5540 Node: Posting dates5534
Ref: #posting-dates5669 Ref: #posting-dates5663
Node: Status7041 Node: Status7035
Ref: #status7162 Ref: #status7156
Node: Description8870 Node: Description8864
Ref: #description9004 Ref: #description8998
Node: Payee and note9324 Node: Payee and note9318
Ref: #payee-and-note9438 Ref: #payee-and-note9432
Node: Comments9773 Node: Comments9767
Ref: #comments9899 Ref: #comments9893
Node: Tags11093 Node: Tags11087
Ref: #tags11208 Ref: #tags11202
Node: Postings12601 Node: Postings12595
Ref: #postings12729 Ref: #postings12723
Node: Virtual postings13755 Node: Virtual postings13749
Ref: #virtual-postings13872 Ref: #virtual-postings13866
Node: Account names15177 Node: Account names15171
Ref: #account-names15318 Ref: #account-names15312
Node: Amounts15805 Node: Amounts15799
Ref: #amounts15944 Ref: #amounts15938
Node: Digit group marks17052 Node: Digit group marks17046
Ref: #digit-group-marks17200 Ref: #digit-group-marks17194
Node: Amount display style18138 Node: Amount display style18132
Ref: #amount-display-style18292 Ref: #amount-display-style18286
Node: Transaction prices19729 Node: Transaction prices19723
Ref: #transaction-prices19901 Ref: #transaction-prices19895
Node: Lot prices and lot dates22233 Node: Lot prices and lot dates22227
Ref: #lot-prices-and-lot-dates22430 Ref: #lot-prices-and-lot-dates22424
Node: Balance assertions22918 Node: Balance assertions22912
Ref: #balance-assertions23104 Ref: #balance-assertions23098
Node: Assertions and ordering24137 Node: Assertions and ordering24131
Ref: #assertions-and-ordering24325 Ref: #assertions-and-ordering24319
Node: Assertions and included files25025 Node: Assertions and included files25019
Ref: #assertions-and-included-files25268 Ref: #assertions-and-included-files25262
Node: Assertions and multiple -f options25601 Node: Assertions and multiple -f options25595
Ref: #assertions-and-multiple--f-options25857 Ref: #assertions-and-multiple--f-options25851
Node: Assertions and commodities25989 Node: Assertions and commodities25983
Ref: #assertions-and-commodities26221 Ref: #assertions-and-commodities26215
Node: Assertions and prices27378 Node: Assertions and prices27372
Ref: #assertions-and-prices27592 Ref: #assertions-and-prices27586
Node: Assertions and subaccounts28032 Node: Assertions and subaccounts28026
Ref: #assertions-and-subaccounts28261 Ref: #assertions-and-subaccounts28255
Node: Assertions and virtual postings28585 Node: Assertions and virtual postings28579
Ref: #assertions-and-virtual-postings28827 Ref: #assertions-and-virtual-postings28821
Node: Assertions and precision28969 Node: Assertions and precision28963
Ref: #assertions-and-precision29162 Ref: #assertions-and-precision29156
Node: Balance assignments29429 Node: Balance assignments29423
Ref: #balance-assignments29603 Ref: #balance-assignments29597
Node: Balance assignments and prices30767 Node: Balance assignments and prices30761
Ref: #balance-assignments-and-prices30939 Ref: #balance-assignments-and-prices30933
Node: Directives31163 Node: Directives31157
Ref: #directives31322 Ref: #directives31316
Node: Directives and multiple files37013 Node: Directives and multiple files37007
Ref: #directives-and-multiple-files37196 Ref: #directives-and-multiple-files37190
Node: Comment blocks37860 Node: Comment blocks37854
Ref: #comment-blocks38043 Ref: #comment-blocks38037
Node: Including other files38219 Node: Including other files38213
Ref: #including-other-files38399 Ref: #including-other-files38393
Node: Default year39050 Node: Default year39044
Ref: #default-year39219 Ref: #default-year39213
Node: Declaring commodities39626 Node: Declaring commodities39620
Ref: #declaring-commodities39809 Ref: #declaring-commodities39803
Node: Default commodity41615 Node: Default commodity41609
Ref: #default-commodity41801 Ref: #default-commodity41795
Node: Declaring market prices42690 Node: Declaring market prices42684
Ref: #declaring-market-prices42885 Ref: #declaring-market-prices42879
Node: Declaring accounts43742 Node: Declaring accounts43736
Ref: #declaring-accounts43928 Ref: #declaring-accounts43922
Node: Account comments44853 Node: Account comments44847
Ref: #account-comments45016 Ref: #account-comments45010
Node: Account subdirectives45440 Node: Account subdirectives45434
Ref: #account-subdirectives45635 Ref: #account-subdirectives45629
Node: Account types45948 Node: Account types45942
Ref: #account-types46132 Ref: #account-types46126
Node: Account display order47771 Node: Account display order47765
Ref: #account-display-order47941 Ref: #account-display-order47935
Node: Rewriting accounts49092 Node: Rewriting accounts49086
Ref: #rewriting-accounts49277 Ref: #rewriting-accounts49271
Node: Basic aliases50034 Node: Basic aliases50028
Ref: #basic-aliases50180 Ref: #basic-aliases50174
Node: Regex aliases50884 Node: Regex aliases50878
Ref: #regex-aliases51056 Ref: #regex-aliases51050
Node: Combining aliases51774 Node: Combining aliases51768
Ref: #combining-aliases51967 Ref: #combining-aliases51961
Node: Aliases and multiple files53243 Node: Aliases and multiple files53237
Ref: #aliases-and-multiple-files53452 Ref: #aliases-and-multiple-files53446
Node: end aliases54031 Node: end aliases54025
Ref: #end-aliases54188 Ref: #end-aliases54182
Node: Default parent account54289 Node: Default parent account54283
Ref: #default-parent-account54457 Ref: #default-parent-account54451
Node: Periodic transactions55341 Node: Periodic transactions55335
Ref: #periodic-transactions55516 Ref: #periodic-transactions55510
Node: Periodic rule syntax57388 Node: Periodic rule syntax57382
Ref: #periodic-rule-syntax57594 Ref: #periodic-rule-syntax57588
Node: Two spaces between period expression and description!58298 Node: Two spaces between period expression and description!58292
Ref: #two-spaces-between-period-expression-and-description58617 Ref: #two-spaces-between-period-expression-and-description58611
Node: Forecasting with periodic transactions59301 Node: Forecasting with periodic transactions59295
Ref: #forecasting-with-periodic-transactions59606 Ref: #forecasting-with-periodic-transactions59600
Node: Budgeting with periodic transactions61661 Node: Budgeting with periodic transactions61655
Ref: #budgeting-with-periodic-transactions61900 Ref: #budgeting-with-periodic-transactions61894
Node: Auto postings62349 Node: Auto postings62343
Ref: #auto-postings62489 Ref: #auto-postings62483
Node: Auto postings and multiple files64668 Node: Auto postings and multiple files64662
Ref: #auto-postings-and-multiple-files64872 Ref: #auto-postings-and-multiple-files64866
Node: Auto postings and dates65081 Node: Auto postings and dates65075
Ref: #auto-postings-and-dates65355 Ref: #auto-postings-and-dates65349
Node: Auto postings and transaction balancing / inferred amounts / balance assertions65530 Node: Auto postings and transaction balancing / inferred amounts / balance assertions65524
Ref: #auto-postings-and-transaction-balancing-inferred-amounts-balance-assertions65881 Ref: #auto-postings-and-transaction-balancing-inferred-amounts-balance-assertions65875
Node: Auto posting tags66223 Node: Auto posting tags66217
Ref: #auto-posting-tags66438 Ref: #auto-posting-tags66432
 
End Tag Table End Tag Table


@ -1480,4 +1480,4 @@ SEE ALSO
hledger 1.17.99 June 2020 hledger_journal(5) hledger 1.18 June 2020 hledger_journal(5)


@ -1,5 +1,5 @@
.TH "hledger_timeclock" "5" "June 2020" "hledger 1.17.99" "hledger User Manuals" .TH "hledger_timeclock" "5" "June 2020" "hledger 1.18" "hledger User Manuals"


@ -4,8 +4,8 @@ stdin.
 
File: hledger_timeclock.info, Node: Top, Up: (dir) File: hledger_timeclock.info, Node: Top, Up: (dir)
hledger_timeclock(5) hledger 1.17.99 hledger_timeclock(5) hledger 1.18
************************************ *********************************
Timeclock - the time logging format of timeclock.el, as read by hledger Timeclock - the time logging format of timeclock.el, as read by hledger


@ -78,4 +78,4 @@ SEE ALSO
hledger 1.17.99 June 2020 hledger_timeclock(5) hledger 1.18 June 2020 hledger_timeclock(5)


@ -1,5 +1,5 @@
.TH "hledger_timedot" "5" "June 2020" "hledger 1.17.99" "hledger User Manuals" .TH "hledger_timedot" "5" "June 2020" "hledger 1.18" "hledger User Manuals"


@ -4,8 +4,8 @@ stdin.
 
File: hledger_timedot.info, Node: Top, Up: (dir) File: hledger_timedot.info, Node: Top, Up: (dir)
hledger_timedot(5) hledger 1.17.99 hledger_timedot(5) hledger 1.18
********************************** *******************************
Timedot - hledger's human-friendly time logging format Timedot - hledger's human-friendly time logging format


@ -161,4 +161,4 @@ SEE ALSO
hledger 1.17.99 June 2020 hledger_timedot(5) hledger 1.18 June 2020 hledger_timedot(5)


@ -1,5 +1,5 @@
.TH "hledger-ui" "1" "June 2020" "hledger-ui 1.17.99" "hledger User Manuals" .TH "hledger-ui" "1" "June 2020" "hledger-ui 1.18" "hledger User Manuals"


@ -3,8 +3,8 @@ This is hledger-ui.info, produced by makeinfo version 6.7 from stdin.
 
File: hledger-ui.info, Node: Top, Next: OPTIONS, Up: (dir) File: hledger-ui.info, Node: Top, Next: OPTIONS, Up: (dir)
hledger-ui(1) hledger-ui 1.17.99 hledger-ui(1) hledger-ui 1.18
******************************** *****************************
hledger-ui - terminal interface for the hledger accounting tool hledger-ui - terminal interface for the hledger accounting tool
@ -499,26 +499,26 @@ program is restarted.
 
Tag Table: Tag Table:
Node: Top71 Node: Top71
Node: OPTIONS1476 Node: OPTIONS1470
Ref: #options1573 Ref: #options1567
Node: KEYS5004 Node: KEYS4998
Ref: #keys5099 Ref: #keys5093
Node: SCREENS9375 Node: SCREENS9369
Ref: #screens9480 Ref: #screens9474
Node: Accounts screen9570 Node: Accounts screen9564
Ref: #accounts-screen9698 Ref: #accounts-screen9692
Node: Register screen11914 Node: Register screen11908
Ref: #register-screen12069 Ref: #register-screen12063
Node: Transaction screen14066 Node: Transaction screen14060
Ref: #transaction-screen14224 Ref: #transaction-screen14218
Node: Error screen15094 Node: Error screen15088
Ref: #error-screen15216 Ref: #error-screen15210
Node: ENVIRONMENT15460 Node: ENVIRONMENT15454
Ref: #environment15574 Ref: #environment15568
Node: FILES16381 Node: FILES16375
Ref: #files16480 Ref: #files16474
Node: BUGS16693 Node: BUGS16687
Ref: #bugs16770 Ref: #bugs16764
 
End Tag Table End Tag Table


@ -441,4 +441,4 @@ SEE ALSO
hledger-ui 1.17.99 June 2020 hledger-ui(1) hledger-ui 1.18 June 2020 hledger-ui(1)


@ -1,5 +1,5 @@
.TH "hledger-web" "1" "June 2020" "hledger-web 1.17.99" "hledger User Manuals" .TH "hledger-web" "1" "June 2020" "hledger-web 1.18" "hledger User Manuals"
@ -342,7 +342,9 @@ You can get JSON data from these routes:
\f[R] \f[R]
.fi .fi
.PP .PP
Eg, all account names in the journal (similar to the accounts command): Eg, all account names in the journal (similar to the accounts command).
(hledger-web\[aq]s JSON does not include newlines; here we use python to
prettify it):
.IP .IP
.nf .nf
\f[C] \f[C]
@ -410,117 +412,107 @@ You can add a new transaction to the journal with a PUT request to
capability (enabled by default). capability (enabled by default).
The payload must be the full, exact JSON representation of a hledger The payload must be the full, exact JSON representation of a hledger
transaction (partial data won\[aq]t do). transaction (partial data won\[aq]t do).
You can get sample JSON from \f[C]/transactions\f[R] or You can get sample JSON from hledger-web\[aq]s \f[C]/transactions\f[R]
\f[C]/accounttransactions\f[R], or you can export it with or \f[C]/accounttransactions\f[R], or you can export it with
hledger-lib\[aq]s \f[C]writeJsonFile\f[R] helper, like so: hledger-lib, eg like so:
.IP .IP
.nf .nf
\f[C] \f[C]
$ make ghci-web \&.../hledger$ stack ghci hledger-lib
>>> import Hledger >>> writeJsonFile \[dq]txn.json\[dq] (head $ jtxns samplejournal)
>>> writeJsonFile \[dq]txn.json\[dq] (head $ jtxns samplejournal) -- export samplejournal\[aq]s first txn
>>> :q >>> :q
\f[R] \f[R]
.fi .fi
.PP .PP
If you like, reformat the json to make it human-readable:
.IP
.nf
\f[C]
$ python -m json.tool txn.json >pretty
$ mv pretty txn.json
\f[R]
.fi
.PP
Here\[aq]s how it looks as of hledger-1.17 (remember, this JSON Here\[aq]s how it looks as of hledger-1.17 (remember, this JSON
corresponds to hledger\[aq]s Transaction and related data types): corresponds to hledger\[aq]s Transaction and related data types):
.IP .IP
.nf .nf
\f[C] \f[C]
{ {
\[dq]tcode\[dq]: \[dq]\[dq],
\[dq]tcomment\[dq]: \[dq]\[dq], \[dq]tcomment\[dq]: \[dq]\[dq],
\[dq]tdate\[dq]: \[dq]2008-01-01\[dq],
\[dq]tdate2\[dq]: null,
\[dq]tdescription\[dq]: \[dq]income\[dq],
\[dq]tindex\[dq]: 1,
\[dq]tpostings\[dq]: [ \[dq]tpostings\[dq]: [
{ {
\[dq]paccount\[dq]: \[dq]assets:bank:checking\[dq], \[dq]pbalanceassertion\[dq]: null,
\[dq]pstatus\[dq]: \[dq]Unmarked\[dq],
\[dq]pamount\[dq]: [ \[dq]pamount\[dq]: [
{ {
\[dq]acommodity\[dq]: \[dq]$\[dq],
\[dq]aismultiplier\[dq]: false,
\[dq]aprice\[dq]: null, \[dq]aprice\[dq]: null,
\[dq]acommodity\[dq]: \[dq]$\[dq],
\[dq]aquantity\[dq]: { \[dq]aquantity\[dq]: {
\[dq]decimalMantissa\[dq]: 10000000000, \[dq]floatingPoint\[dq]: 1,
\[dq]decimalPlaces\[dq]: 10, \[dq]decimalPlaces\[dq]: 10,
\[dq]floatingPoint\[dq]: 1 \[dq]decimalMantissa\[dq]: 10000000000
}, },
\[dq]aismultiplier\[dq]: false,
\[dq]astyle\[dq]: { \[dq]astyle\[dq]: {
\[dq]ascommodityside\[dq]: \[dq]L\[dq], \[dq]ascommodityside\[dq]: \[dq]L\[dq],
\[dq]ascommodityspaced\[dq]: false,
\[dq]asdecimalpoint\[dq]: \[dq].\[dq],
\[dq]asdigitgroups\[dq]: null, \[dq]asdigitgroups\[dq]: null,
\[dq]asprecision\[dq]: 2 \[dq]ascommodityspaced\[dq]: false,
\[dq]asprecision\[dq]: 2,
\[dq]asdecimalpoint\[dq]: \[dq].\[dq]
} }
} }
], ],
\[dq]pbalanceassertion\[dq]: null,
\[dq]pcomment\[dq]: \[dq]\[dq],
\[dq]pdate\[dq]: null,
\[dq]pdate2\[dq]: null,
\[dq]poriginal\[dq]: null,
\[dq]pstatus\[dq]: \[dq]Unmarked\[dq],
\[dq]ptags\[dq]: [],
\[dq]ptransaction_\[dq]: \[dq]1\[dq], \[dq]ptransaction_\[dq]: \[dq]1\[dq],
\[dq]ptype\[dq]: \[dq]RegularPosting\[dq] \[dq]paccount\[dq]: \[dq]assets:bank:checking\[dq],
\[dq]pdate\[dq]: null,
\[dq]ptype\[dq]: \[dq]RegularPosting\[dq],
\[dq]pcomment\[dq]: \[dq]\[dq],
\[dq]pdate2\[dq]: null,
\[dq]ptags\[dq]: [],
\[dq]poriginal\[dq]: null
}, },
{ {
\[dq]paccount\[dq]: \[dq]income:salary\[dq], \[dq]pbalanceassertion\[dq]: null,
\[dq]pstatus\[dq]: \[dq]Unmarked\[dq],
\[dq]pamount\[dq]: [ \[dq]pamount\[dq]: [
{ {
\[dq]acommodity\[dq]: \[dq]$\[dq],
\[dq]aismultiplier\[dq]: false,
\[dq]aprice\[dq]: null, \[dq]aprice\[dq]: null,
\[dq]acommodity\[dq]: \[dq]$\[dq],
\[dq]aquantity\[dq]: { \[dq]aquantity\[dq]: {
\[dq]decimalMantissa\[dq]: -10000000000, \[dq]floatingPoint\[dq]: -1,
\[dq]decimalPlaces\[dq]: 10, \[dq]decimalPlaces\[dq]: 10,
\[dq]floatingPoint\[dq]: -1 \[dq]decimalMantissa\[dq]: -10000000000
}, },
\[dq]aismultiplier\[dq]: false,
\[dq]astyle\[dq]: { \[dq]astyle\[dq]: {
\[dq]ascommodityside\[dq]: \[dq]L\[dq], \[dq]ascommodityside\[dq]: \[dq]L\[dq],
\[dq]ascommodityspaced\[dq]: false,
\[dq]asdecimalpoint\[dq]: \[dq].\[dq],
\[dq]asdigitgroups\[dq]: null, \[dq]asdigitgroups\[dq]: null,
\[dq]asprecision\[dq]: 2 \[dq]ascommodityspaced\[dq]: false,
\[dq]asprecision\[dq]: 2,
\[dq]asdecimalpoint\[dq]: \[dq].\[dq]
} }
} }
], ],
\[dq]pbalanceassertion\[dq]: null,
\[dq]pcomment\[dq]: \[dq]\[dq],
\[dq]pdate\[dq]: null,
\[dq]pdate2\[dq]: null,
\[dq]poriginal\[dq]: null,
\[dq]pstatus\[dq]: \[dq]Unmarked\[dq],
\[dq]ptags\[dq]: [],
\[dq]ptransaction_\[dq]: \[dq]1\[dq], \[dq]ptransaction_\[dq]: \[dq]1\[dq],
\[dq]ptype\[dq]: \[dq]RegularPosting\[dq] \[dq]paccount\[dq]: \[dq]income:salary\[dq],
\[dq]pdate\[dq]: null,
\[dq]ptype\[dq]: \[dq]RegularPosting\[dq],
\[dq]pcomment\[dq]: \[dq]\[dq],
\[dq]pdate2\[dq]: null,
\[dq]ptags\[dq]: [],
\[dq]poriginal\[dq]: null
} }
], ],
\[dq]tprecedingcomment\[dq]: \[dq]\[dq], \[dq]ttags\[dq]: [],
\[dq]tsourcepos\[dq]: { \[dq]tsourcepos\[dq]: {
\[dq]tag\[dq]: \[dq]JournalSourcePos\[dq],
\[dq]contents\[dq]: [ \[dq]contents\[dq]: [
\[dq]\[dq], \[dq]\[dq],
[ [
1, 1,
1 1
] ]
], ]
\[dq]tag\[dq]: \[dq]JournalSourcePos\[dq]
}, },
\[dq]tstatus\[dq]: \[dq]Unmarked\[dq], \[dq]tdate\[dq]: \[dq]2008-01-01\[dq],
\[dq]ttags\[dq]: [] \[dq]tcode\[dq]: \[dq]\[dq],
\[dq]tindex\[dq]: 1,
\[dq]tprecedingcomment\[dq]: \[dq]\[dq],
\[dq]tdate2\[dq]: null,
\[dq]tdescription\[dq]: \[dq]income\[dq],
\[dq]tstatus\[dq]: \[dq]Unmarked\[dq]
} }
\f[R] \f[R]
.fi .fi


@ -3,8 +3,8 @@ This is hledger-web.info, produced by makeinfo version 6.7 from stdin.
 
File: hledger-web.info, Node: Top, Next: OPTIONS, Up: (dir)
hledger-web(1) hledger-web 1.18
*******************************
hledger-web - web interface for the hledger accounting tool
@ -347,7 +347,8 @@ $ hledger-web -f examples/sample.journal --serve-api
/accounttransactions/ACCOUNTNAME
Eg, all account names in the journal (similar to the accounts
command). (hledger-web's JSON does not include newlines, here we use
python to prettify it):
$ curl -s http://127.0.0.1:5000/accountnames | python -m json.tool
[
@ -405,106 +406,100 @@ AccountTransactionsReportItem (etc).
'/add', if hledger-web was started with the 'add' capability (enabled by
default). The payload must be the full, exact JSON representation of a
hledger transaction (partial data won't do). You can get sample JSON
from hledger-web's '/transactions' or '/accounttransactions', or you can
export it with hledger-lib, eg like so:
.../hledger$ stack ghci hledger-lib
>>> writeJsonFile "txn.json" (head $ jtxns samplejournal)
-- export samplejournal's first txn
>>> :q
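The same export can also be done non-interactively. Here is a minimal
sketch of a standalone program (an illustration, not something from the
manual), assuming hledger-lib is available as a dependency and that the
Hledger module exports writeJsonFile, jtxns and samplejournal as used
above:
-- Sketch: export the first transaction of hledger-lib's built-in
-- sample journal as JSON, mirroring the GHCi session above.
import Hledger

main :: IO ()
main = writeJsonFile "txn.json" (head $ jtxns samplejournal)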
Here's how it looks as of hledger-1.17 (remember, this JSON
corresponds to hledger's Transaction and related data types):
{
    "tcomment": "",
    "tpostings": [
        {
            "pbalanceassertion": null,
            "pstatus": "Unmarked",
            "pamount": [
                {
                    "aprice": null,
                    "acommodity": "$",
                    "aquantity": {
                        "floatingPoint": 1,
                        "decimalPlaces": 10,
                        "decimalMantissa": 10000000000
                    },
                    "aismultiplier": false,
                    "astyle": {
                        "ascommodityside": "L",
                        "asdigitgroups": null,
                        "ascommodityspaced": false,
                        "asprecision": 2,
                        "asdecimalpoint": "."
                    }
                }
            ],
            "ptransaction_": "1",
            "paccount": "assets:bank:checking",
            "pdate": null,
            "ptype": "RegularPosting",
            "pcomment": "",
            "pdate2": null,
            "ptags": [],
            "poriginal": null
        },
        {
            "pbalanceassertion": null,
            "pstatus": "Unmarked",
            "pamount": [
                {
                    "aprice": null,
                    "acommodity": "$",
                    "aquantity": {
                        "floatingPoint": -1,
                        "decimalPlaces": 10,
                        "decimalMantissa": -10000000000
                    },
                    "aismultiplier": false,
                    "astyle": {
                        "ascommodityside": "L",
                        "asdigitgroups": null,
                        "ascommodityspaced": false,
                        "asprecision": 2,
                        "asdecimalpoint": "."
                    }
                }
            ],
            "ptransaction_": "1",
            "paccount": "income:salary",
            "pdate": null,
            "ptype": "RegularPosting",
            "pcomment": "",
            "pdate2": null,
            "ptags": [],
            "poriginal": null
        }
    ],
    "ttags": [],
    "tsourcepos": {
        "tag": "JournalSourcePos",
        "contents": [
            "",
            [
                1,
                1
            ]
        ]
    },
    "tdate": "2008-01-01",
    "tcode": "",
    "tindex": 1,
    "tprecedingcomment": "",
    "tdate2": null,
    "tdescription": "income",
    "tstatus": "Unmarked"
}
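For scripted use, the same kind of request could also be made from
Haskell. Here is a rough, hypothetical sketch using the http-client
package (an extra dependency; the manual itself demonstrates curl,
below), assuming hledger-web is serving on 127.0.0.1:5000 with the add
capability and that txn.json holds the JSON shown above:
{-# LANGUAGE OverloadedStrings #-}
-- Sketch: submit txn.json to hledger-web's /add endpoint.
-- Assumes the endpoint accepts an HTTP PUT with a JSON body, and that
-- hledger-web is running at 127.0.0.1:5000 with 'add' enabled.
import qualified Data.ByteString.Lazy as BL
import Network.HTTP.Client

main :: IO ()
main = do
  manager <- newManager defaultManagerSettings
  body    <- BL.readFile "txn.json"
  initReq <- parseRequest "PUT http://127.0.0.1:5000/add"
  let req = initReq
        { requestBody    = RequestBodyLBS body
        , requestHeaders = [("Content-Type", "application/json")]
        }
  resp <- httpLbs req manager
  print (responseStatus resp)   -- show the HTTP status returned
  BL.putStr (responseBody resp) -- and any response body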
And here's how to test adding it with curl. This should add a new
@ -569,22 +564,22 @@ awkward.
 
Tag Table:
Node: Top72
Node: OPTIONS1746
Ref: #options1851
Node: PERMISSIONS8195
Ref: #permissions8334
Node: EDITING UPLOADING DOWNLOADING9546
Ref: #editing-uploading-downloading9727
Node: RELOADING10561
Ref: #reloading10695
Node: JSON API11128
Ref: #json-api11242
Node: ENVIRONMENT16723
Ref: #environment16839
Node: FILES17572
Ref: #files17672
Node: BUGS17885
Ref: #bugs17963
 
End Tag Table


@ -305,7 +305,9 @@ JSON API
/accounts
/accounttransactions/ACCOUNTNAME
Eg, all account names in the journal (similar to the accounts command).
(hledger-web's JSON does not include newlines, here we use python to
prettify it):
$ curl -s http://127.0.0.1:5000/accountnames | python -m json.tool
[
@ -363,106 +365,100 @@ JSON API
/add, if hledger-web was started with the add capability (enabled by
default). The payload must be the full, exact JSON representation of a
hledger transaction (partial data won't do). You can get sample JSON
from hledger-web's /transactions or /accounttransactions, or you can
export it with hledger-lib, eg like so:
.../hledger$ stack ghci hledger-lib
>>> writeJsonFile "txn.json" (head $ jtxns samplejournal)
-- export samplejournal's first txn
>>> :q
Here's how it looks as of hledger-1.17 (remember, this JSON corresponds
to hledger's Transaction and related data types):
{
    "tcomment": "",
    "tpostings": [
        {
            "pbalanceassertion": null,
            "pstatus": "Unmarked",
            "pamount": [
                {
                    "aprice": null,
                    "acommodity": "$",
                    "aquantity": {
                        "floatingPoint": 1,
                        "decimalPlaces": 10,
                        "decimalMantissa": 10000000000
                    },
                    "aismultiplier": false,
                    "astyle": {
                        "ascommodityside": "L",
                        "asdigitgroups": null,
                        "ascommodityspaced": false,
                        "asprecision": 2,
                        "asdecimalpoint": "."
                    }
                }
            ],
            "ptransaction_": "1",
            "paccount": "assets:bank:checking",
            "pdate": null,
            "ptype": "RegularPosting",
            "pcomment": "",
            "pdate2": null,
            "ptags": [],
            "poriginal": null
        },
        {
            "pbalanceassertion": null,
            "pstatus": "Unmarked",
            "pamount": [
                {
                    "aprice": null,
                    "acommodity": "$",
                    "aquantity": {
                        "floatingPoint": -1,
                        "decimalPlaces": 10,
                        "decimalMantissa": -10000000000
                    },
                    "aismultiplier": false,
                    "astyle": {
                        "ascommodityside": "L",
                        "asdigitgroups": null,
                        "ascommodityspaced": false,
                        "asprecision": 2,
                        "asdecimalpoint": "."
                    }
                }
            ],
            "ptransaction_": "1",
            "paccount": "income:salary",
            "pdate": null,
            "ptype": "RegularPosting",
            "pcomment": "",
            "pdate2": null,
            "ptags": [],
            "poriginal": null
        }
    ],
    "ttags": [],
    "tsourcepos": {
        "tag": "JournalSourcePos",
        "contents": [
            "",
            [
                1,
                1
            ]
        ]
    },
    "tdate": "2008-01-01",
    "tcode": "",
    "tindex": 1,
    "tprecedingcomment": "",
    "tdate2": null,
    "tdescription": "income",
    "tstatus": "Unmarked"
}
And here's how to test adding it with curl. This should add a new en-
@ -533,4 +529,4 @@ SEE ALSO
hledger-web 1.18 June 2020 hledger-web(1)


@ -1,6 +1,6 @@
.\"t .\"t
.TH "hledger" "1" "June 2020" "hledger 1.17.99" "hledger User Manuals" .TH "hledger" "1" "June 2020" "hledger 1.18" "hledger User Manuals"


@ -3,8 +3,8 @@ This is hledger.info, produced by makeinfo version 6.7 from stdin.
 
File: hledger.info, Node: Top, Next: COMMON TASKS, Up: (dir)
hledger(1) hledger 1.18
***********************
hledger - a command-line accounting tool
@ -3776,187 +3776,187 @@ $ LANG=en_US.UTF-8 hledger -f my.journal print
 
Tag Table:
Node: Top68
Node: COMMON TASKS2315
Ref: #common-tasks2427
Node: Getting help2834
Ref: #getting-help2966
Node: Constructing command lines3519
Ref: #constructing-command-lines3711
Node: Starting a journal file4408
Ref: #starting-a-journal-file4606
Node: Setting opening balances5794
Ref: #setting-opening-balances5990
Node: Recording transactions9131
Ref: #recording-transactions9311
Node: Reconciling9867
Ref: #reconciling10010
Node: Reporting12267
Ref: #reporting12407
Node: Migrating to a new file16406
Ref: #migrating-to-a-new-file16554
Node: OPTIONS16853
Ref: #options16960
Node: General options17330
Ref: #general-options17455
Node: Command options20225
Ref: #command-options20376
Node: Command arguments20774
Ref: #command-arguments20921
Node: Queries21801
Ref: #queries21956
Node: Special characters in arguments and queries25918
Ref: #special-characters-in-arguments-and-queries26146
Node: More escaping26597
Ref: #more-escaping26759
Node: Even more escaping27055
Ref: #even-more-escaping27249
Node: Less escaping27920
Ref: #less-escaping28082
Node: Unicode characters28327
Ref: #unicode-characters28509
Node: Input files29921
Ref: #input-files30064
Node: Output destination31993
Ref: #output-destination32145
Node: Output format32570
Ref: #output-format32720
Node: Regular expressions34302
Ref: #regular-expressions34459
Node: Smart dates36195
Ref: #smart-dates36346
Node: Report start & end date37707
Ref: #report-start-end-date37879
Node: Report intervals39376
Ref: #report-intervals39541
Node: Period expressions39931
Ref: #period-expressions40091
Node: Depth limiting44227
Ref: #depth-limiting44371
Node: Pivoting44703
Ref: #pivoting44826
Node: Valuation46502
Ref: #valuation46604
Node: -B Cost47524
Ref: #b-cost47628
Node: -V Value47800
Ref: #v-value47953
Node: -X Market value in specified commodity49226
Ref: #x-market-value-in-specified-commodity49445
Node: Market prices49623
Ref: #market-prices49808
Node: --value Flexible valuation50733
Ref: #value-flexible-valuation50934
Node: Effect of --value on reports55439
Ref: #effect-of---value-on-reports55620
Node: COMMANDS61166
Ref: #commands61274
Node: accounts62358
Ref: #accounts62456
Node: activity63155
Ref: #activity63265
Node: add63648
Ref: #add63747
Node: balance66486
Ref: #balance66597
Node: Classic balance report68055
Ref: #classic-balance-report68228
Node: Customising the classic balance report69597
Ref: #customising-the-classic-balance-report69825
Node: Colour support71901
Ref: #colour-support72068
Node: Flat mode72241
Ref: #flat-mode72389
Node: Depth limited balance reports72802
Ref: #depth-limited-balance-reports72987
Node: Percentages73443
Ref: #percentages73609
Node: Multicolumn balance report74746
Ref: #multicolumn-balance-report74926
Node: Budget report80188
Ref: #budget-report80331
Node: Nested budgets85597
Ref: #nested-budgets85709
Ref: #output-format-189190
Node: balancesheet89387
Ref: #balancesheet89523
Node: balancesheetequity90989
Ref: #balancesheetequity91138
Node: cashflow91861
Ref: #cashflow91989
Node: check-dates93168
Ref: #check-dates93295
Node: check-dupes93574
Ref: #check-dupes93698
Node: close93991
Ref: #close94105
Node: close usage95627
Ref: #close-usage95720
Node: commodities98533
Ref: #commodities98660
Node: descriptions98742
Ref: #descriptions98870
Node: diff99051
Ref: #diff99157
Node: files100204
Ref: #files100304
Node: help100451
Ref: #help100551
Node: import101632
Ref: #import101746
Node: Importing balance assignments102639
Ref: #importing-balance-assignments102787
Node: incomestatement103436
Ref: #incomestatement103569
Node: notes105056
Ref: #notes105169
Node: payees105295
Ref: #payees105401
Node: prices105559
Ref: #prices105665
Node: print106006
Ref: #print106116
Node: print-unique110902
Ref: #print-unique111028
Node: register111313
Ref: #register111440
Node: Custom register output115612
Ref: #custom-register-output115741
Node: register-match117078
Ref: #register-match117212
Node: rewrite117563
Ref: #rewrite117678
Node: Re-write rules in a file119533
Ref: #re-write-rules-in-a-file119667
Node: Diff output format120877
Ref: #diff-output-format121046
Node: rewrite vs print --auto122138
Ref: #rewrite-vs.-print---auto122317
Node: roi122873
Ref: #roi122971
Node: stats123983
Ref: #stats124082
Node: tags124870
Ref: #tags124968
Node: test125262
Ref: #test125370
Node: Add-on commands126117
Ref: #add-on-commands126234
Node: ui127577
Ref: #ui127665
Node: web127719
Ref: #web127822
Node: iadd127938
Ref: #iadd128049
Node: interest128131
Ref: #interest128238
Node: ENVIRONMENT128478
Ref: #environment128590
Node: FILES129419
Ref: #files-1129522
Node: LIMITATIONS129735
Ref: #limitations129854
Node: TROUBLESHOOTING130596
Ref: #troubleshooting130709
 
End Tag Table


@ -3231,4 +3231,4 @@ SEE ALSO
hledger 1.18 June 2020 hledger(1)