;regen manuals

Simon Michael 2020-06-22 12:20:14 -07:00
parent c8a84e3c96
commit 368297102d
7 changed files with 995 additions and 761 deletions

View File

@ -610,7 +610,7 @@ separator TAB
.fi .fi
.PP .PP
See also: File Extension. See also: File Extension.
.SS \f[C]if\f[R] .SS \f[C]if\f[R] block
.IP .IP
.nf .nf
\f[C] \f[C]
@ -702,6 +702,76 @@ banking thru software
comment XXX deductible ? check it comment XXX deductible ? check it
\f[R] \f[R]
.fi .fi
.SS \f[C]if\f[R] table
.IP
.nf
\f[C]
if,CSVFIELDNAME1,CSVFIELDNAME2,...,CSVFIELDNAMEn
MATCHER1,VALUE11,VALUE12,...,VALUE1n
MATCHER2,VALUE21,VALUE22,...,VALUE2n
MATCHER3,VALUE31,VALUE32,...,VALUE3n
<empty line>
\f[R]
.fi
.PP
Conditional tables (\[dq]if tables\[dq]) are an alternative syntax for
specifying field assignments that will be applied only to CSV records
which match certain patterns.
.PP
MATCHER can be either a field or a record matcher, as described above.
When MATCHER matches, the values from that row are assigned to the CSV
fields named on the \f[C]if\f[R] line, in the same order.
.PP
An \f[C]if\f[R] table is therefore exactly equivalent to a sequence of
\f[C]if\f[R] blocks:
.IP
.nf
\f[C]
if MATCHER1
CSVFIELDNAME1 VALUE11
CSVFIELDNAME2 VALUE12
...
CSVFIELDNAMEn VALUE1n
if MATCHER2
CSVFIELDNAME1 VALUE21
CSVFIELDNAME2 VALUE22
...
CSVFIELDNAMEn VALUE2n
if MATCHER3
CSVFIELDNAME1 VALUE31
CSVFIELDNAME2 VALUE32
...
CSVFIELDNAMEn VALUE3n
\f[R]
.fi
.PP
Each line starting with MATCHER should contain enough (possibly empty)
values for all the listed fields.
.PP
Rules are checked and applied in the order they are listed in the
table and, as with \f[C]if\f[R] blocks, later rules (in the same or
another table) or \f[C]if\f[R] blocks can override the effect of
earlier ones.
.PP
Instead of \[aq],\[aq] you can use a variety of other non-alphanumeric
characters as the separator.
The first character after \f[C]if\f[R] is taken to be the separator for
the rest of the table.
It is your responsibility to ensure that the separator does not occur
inside MATCHERs and values; there is no way to escape it.
.PP
Example:
.IP
.nf
\f[C]
if,account2,comment
atm transaction fee,expenses:business:banking,deductible? check it
%description groceries,expenses:groceries,
2020/01/12.*Plumbing LLC,expenses:house:upkeep,emergency plumbing call-out
\f[R]
.fi
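For instance, applying the equivalence described above, the first row of
this example table should behave exactly like the following if block:

  if atm transaction fee
    account2 expenses:business:banking
    comment deductible? check it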
.SS \f[C]end\f[R] .SS \f[C]end\f[R]
.PP .PP
This rule can be used inside if blocks (only), to make hledger stop This rule can be used inside if blocks (only), to make hledger stop

View File

@ -374,7 +374,8 @@ Blank lines and lines beginning with '#' or ';' are ignored.
* fields:: * fields::
* field assignment:: * field assignment::
* separator:: * separator::
* if:: * if block::
* if table::
* end:: * end::
* date-format:: * date-format::
* newest-first:: * newest-first::
@ -567,7 +568,7 @@ becomes '1' when interpolated) (#1051). See TIPS below for more about
referencing other fields. referencing other fields.
 
File: hledger_csv.info, Node: separator, Next: if, Prev: field assignment, Up: CSV RULES File: hledger_csv.info, Node: separator, Next: if block, Prev: field assignment, Up: CSV RULES
2.4 'separator' 2.4 'separator'
=============== ===============
@ -587,10 +588,10 @@ separator TAB
See also: File Extension. See also: File Extension.
 
File: hledger_csv.info, Node: if, Next: end, Prev: separator, Up: CSV RULES File: hledger_csv.info, Node: if block, Next: if table, Prev: separator, Up: CSV RULES
2.5 'if' 2.5 'if' block
======== ==============
if MATCHER if MATCHER
RULE RULE
@ -659,9 +660,70 @@ banking thru software
comment XXX deductible ? check it comment XXX deductible ? check it
 
File: hledger_csv.info, Node: end, Next: date-format, Prev: if, Up: CSV RULES File: hledger_csv.info, Node: if table, Next: end, Prev: if block, Up: CSV RULES
2.6 'end' 2.6 'if' table
==============
if,CSVFIELDNAME1,CSVFIELDNAME2,...,CSVFIELDNAMEn
MATCHER1,VALUE11,VALUE12,...,VALUE1n
MATCHER2,VALUE21,VALUE22,...,VALUE2n
MATCHER3,VALUE31,VALUE32,...,VALUE3n
<empty line>
Conditional tables ("if tables") are an alternative syntax for
specifying field assignments that will be applied only to CSV records
which match certain patterns.
MATCHER can be either a field or a record matcher, as described above.
When MATCHER matches, the values from that row are assigned to the CSV
fields named on the 'if' line, in the same order.
An 'if' table is therefore exactly equivalent to a sequence of 'if'
blocks:
if MATCHER1
CSVFIELDNAME1 VALUE11
CSVFIELDNAME2 VALUE12
...
CSVFIELDNAMEn VALUE1n
if MATCHER2
CSVFIELDNAME1 VALUE21
CSVFIELDNAME2 VALUE22
...
CSVFIELDNAMEn VALUE2n
if MATCHER3
CSVFIELDNAME1 VALUE31
CSVFIELDNAME2 VALUE32
...
CSVFIELDNAMEn VALUE3n
Each line starting with MATCHER should contain enough (possibly
empty) values for all the listed fields.
Rules are checked and applied in the order they are listed in the
table and, as with 'if' blocks, later rules (in the same or another
table) or 'if' blocks can override the effect of earlier ones.
Instead of ',' you can use a variety of other non-alphanumeric
characters as the separator.  The first character after 'if' is taken
to be the separator for the rest of the table.  It is your
responsibility to ensure that the separator does not occur inside
MATCHERs and values; there is no way to escape it.
Example:
if,account2,comment
atm transaction fee,expenses:business:banking,deductible? check it
%description groceries,expenses:groceries,
2020/01/12.*Plumbing LLC,expenses:house:upkeep,emergency plumbing call-out
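For instance, since the first character after 'if' sets the separator,
the same table could presumably also be written with '|' (assuming '|'
does not occur in any matcher or value):

if|account2|comment
atm transaction fee|expenses:business:banking|deductible? check it
%description groceries|expenses:groceries|
2020/01/12.*Plumbing LLC|expenses:house:upkeep|emergency plumbing call-out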

File: hledger_csv.info, Node: end, Next: date-format, Prev: if table, Up: CSV RULES
2.7 'end'
========= =========
This rule can be used inside if blocks (only), to make hledger stop This rule can be used inside if blocks (only), to make hledger stop
@ -675,7 +737,7 @@ if ,,,,
 
File: hledger_csv.info, Node: date-format, Next: newest-first, Prev: end, Up: CSV RULES File: hledger_csv.info, Node: date-format, Next: newest-first, Prev: end, Up: CSV RULES
2.7 'date-format' 2.8 'date-format'
================= =================
date-format DATEFMT date-format DATEFMT
@ -706,7 +768,7 @@ https://hackage.haskell.org/package/time/docs/Data-Time-Format.html#v:formatTime
 
File: hledger_csv.info, Node: newest-first, Next: include, Prev: date-format, Up: CSV RULES File: hledger_csv.info, Node: newest-first, Next: include, Prev: date-format, Up: CSV RULES
2.8 'newest-first' 2.9 'newest-first'
================== ==================
hledger always sorts the generated transactions by date. Transactions hledger always sorts the generated transactions by date. Transactions
@ -728,8 +790,8 @@ newest-first
 
File: hledger_csv.info, Node: include, Next: balance-type, Prev: newest-first, Up: CSV RULES File: hledger_csv.info, Node: include, Next: balance-type, Prev: newest-first, Up: CSV RULES
2.9 'include' 2.10 'include'
============= ==============
include RULESFILE include RULESFILE
@ -751,7 +813,7 @@ include categorisation.rules
 
File: hledger_csv.info, Node: balance-type, Prev: include, Up: CSV RULES File: hledger_csv.info, Node: balance-type, Prev: include, Up: CSV RULES
2.10 'balance-type' 2.11 'balance-type'
=================== ===================
Balance assertions generated by assigning to balanceN are of the simple Balance assertions generated by assigning to balanceN are of the simple
@ -1048,62 +1110,64 @@ Node: Paypal6570
Ref: #paypal6664 Ref: #paypal6664
Node: CSV RULES14308 Node: CSV RULES14308
Ref: #csv-rules14417 Ref: #csv-rules14417
Node: skip14693 Node: skip14712
Ref: #skip14786 Ref: #skip14805
Node: fields15161 Node: fields15180
Ref: #fields15283 Ref: #fields15302
Node: Transaction field names16448 Node: Transaction field names16467
Ref: #transaction-field-names16608 Ref: #transaction-field-names16627
Node: Posting field names16719 Node: Posting field names16738
Ref: #posting-field-names16871 Ref: #posting-field-names16890
Node: account16941 Node: account16960
Ref: #account17057 Ref: #account17076
Node: amount17594 Node: amount17613
Ref: #amount17725 Ref: #amount17744
Node: currency18832 Node: currency18851
Ref: #currency18967 Ref: #currency18986
Node: balance19173 Node: balance19192
Ref: #balance19307 Ref: #balance19326
Node: comment19624 Node: comment19643
Ref: #comment19741 Ref: #comment19760
Node: field assignment19904 Node: field assignment19923
Ref: #field-assignment20047 Ref: #field-assignment20066
Node: separator20865 Node: separator20884
Ref: #separator20994 Ref: #separator21019
Node: if21405 Node: if block21430
Ref: #if21507 Ref: #if-block21555
Node: end23663 Node: if table23711
Ref: #end23769 Ref: #if-table23830
Node: date-format23993 Node: end25568
Ref: #date-format24125 Ref: #end25680
Node: newest-first24874 Node: date-format25904
Ref: #newest-first25012 Ref: #date-format26036
Node: include25695 Node: newest-first26785
Ref: #include25824 Ref: #newest-first26923
Node: balance-type26268 Node: include27606
Ref: #balance-type26388 Ref: #include27737
Node: TIPS27088 Node: balance-type28181
Ref: #tips27170 Ref: #balance-type28301
Node: Rapid feedback27426 Node: TIPS29001
Ref: #rapid-feedback27543 Ref: #tips29083
Node: Valid CSV28003 Node: Rapid feedback29339
Ref: #valid-csv28133 Ref: #rapid-feedback29456
Node: File Extension28325 Node: Valid CSV29916
Ref: #file-extension28477 Ref: #valid-csv30046
Node: Reading multiple CSV files28887 Node: File Extension30238
Ref: #reading-multiple-csv-files29072 Ref: #file-extension30390
Node: Valid transactions29313 Node: Reading multiple CSV files30800
Ref: #valid-transactions29491 Ref: #reading-multiple-csv-files30985
Node: Deduplicating importing30119 Node: Valid transactions31226
Ref: #deduplicating-importing30298 Ref: #valid-transactions31404
Node: Setting amounts31331 Node: Deduplicating importing32032
Ref: #setting-amounts31500 Ref: #deduplicating-importing32211
Node: Setting currency/commodity32487 Node: Setting amounts33244
Ref: #setting-currencycommodity32679 Ref: #setting-amounts33413
Node: Referencing other fields33482 Node: Setting currency/commodity34400
Ref: #referencing-other-fields33682 Ref: #setting-currencycommodity34592
Node: How CSV rules are evaluated34579 Node: Referencing other fields35395
Ref: #how-csv-rules-are-evaluated34752 Ref: #referencing-other-fields35595
Node: How CSV rules are evaluated36492
Ref: #how-csv-rules-are-evaluated36665
 
End Tag Table End Tag Table

View File

@ -467,7 +467,7 @@ CSV RULES
See also: File Extension. See also: File Extension.
if if block
if MATCHER if MATCHER
RULE RULE
@ -535,8 +535,63 @@ CSV RULES
account2 expenses:business:banking account2 expenses:business:banking
comment XXX deductible ? check it comment XXX deductible ? check it
if table
if,CSVFIELDNAME1,CSVFIELDNAME2,...,CSVFIELDNAMEn
MATCHER1,VALUE11,VALUE12,...,VALUE1n
MATCHER2,VALUE21,VALUE22,...,VALUE2n
MATCHER3,VALUE31,VALUE32,...,VALUE3n
<empty line>
Conditional tables ("if tables") are an alternative syntax for
specifying field assignments that will be applied only to CSV records
which match certain patterns.
MATCHER can be either a field or a record matcher, as described above.
When MATCHER matches, the values from that row are assigned to the CSV
fields named on the if line, in the same order.
An if table is therefore exactly equivalent to a sequence of if blocks:
if MATCHER1
CSVFIELDNAME1 VALUE11
CSVFIELDNAME2 VALUE12
...
CSVFIELDNAMEn VALUE1n
if MATCHER2
CSVFIELDNAME1 VALUE21
CSVFIELDNAME2 VALUE22
...
CSVFIELDNAMEn VALUE2n
if MATCHER3
CSVFIELDNAME1 VALUE31
CSVFIELDNAME2 VALUE32
...
CSVFIELDNAMEn VALUE3n
Each line starting with MATCHER should contain enough (possibly empty)
values for all the listed fields.
Rules are checked and applied in the order they are listed in the
table and, as with if blocks, later rules (in the same or another
table) or if blocks can override the effect of earlier ones.
Instead of ',' you can use a variety of other non-alphanumeric
characters as the separator.  The first character after if is taken to
be the separator for the rest of the table.  It is your responsibility
to ensure that the separator does not occur inside MATCHERs and
values; there is no way to escape it.
Example:
if,account2,comment
atm transaction fee,expenses:business:banking,deductible? check it
%description groceries,expenses:groceries,
2020/01/12.*Plumbing LLC,expenses:house:upkeep,emergency plumbing call-out
end end
This rule can be used inside if blocks (only), to make hledger stop This rule can be used inside if blocks (only), to make hledger stop
reading this CSV file and move on to the next input file, or to command reading this CSV file and move on to the next input file, or to command
execution. Eg: execution. Eg:
@ -547,10 +602,10 @@ CSV RULES
date-format date-format
date-format DATEFMT date-format DATEFMT
This is a helper for the date (and date2) fields. If your CSV dates This is a helper for the date (and date2) fields. If your CSV dates
are not formatted like YYYY-MM-DD, YYYY/MM/DD or YYYY.MM.DD, you'll are not formatted like YYYY-MM-DD, YYYY/MM/DD or YYYY.MM.DD, you'll
need to add a date-format rule describing them with a strptime date need to add a date-format rule describing them with a strptime date
parsing pattern, which must parse the CSV date value completely. Some parsing pattern, which must parse the CSV date value completely. Some
examples: examples:
# MM/DD/YY # MM/DD/YY
@ -572,15 +627,15 @@ CSV RULES
mat.html#v:formatTime mat.html#v:formatTime
newest-first newest-first
hledger always sorts the generated transactions by date. Transactions hledger always sorts the generated transactions by date. Transactions
on the same date should appear in the same order as their CSV records, on the same date should appear in the same order as their CSV records,
as hledger can usually auto-detect whether the CSV's normal order is as hledger can usually auto-detect whether the CSV's normal order is
oldest first or newest first. But if all of the following are true: oldest first or newest first. But if all of the following are true:
o the CSV might sometimes contain just one day of data (all records o the CSV might sometimes contain just one day of data (all records
having the same date) having the same date)
o the CSV records are normally in reverse chronological order (newest o the CSV records are normally in reverse chronological order (newest
at the top) at the top)
o and you care about preserving the order of same-day transactions o and you care about preserving the order of same-day transactions
@ -593,9 +648,9 @@ CSV RULES
include include
include RULESFILE include RULESFILE
This includes the contents of another CSV rules file at this point. This includes the contents of another CSV rules file at this point.
RULESFILE is an absolute file path or a path relative to the current RULESFILE is an absolute file path or a path relative to the current
file's directory. This can be useful for sharing common rules between file's directory. This can be useful for sharing common rules between
several rules files, eg: several rules files, eg:
# someaccount.csv.rules # someaccount.csv.rules
@ -610,10 +665,10 @@ CSV RULES
balance-type balance-type
Balance assertions generated by assigning to balanceN are of the simple Balance assertions generated by assigning to balanceN are of the simple
= type by default, which is a single-commodity, subaccount-excluding = type by default, which is a single-commodity, subaccount-excluding
assertion. You may find the subaccount-including variants more useful, assertion. You may find the subaccount-including variants more useful,
eg if you have created some virtual subaccounts of checking to help eg if you have created some virtual subaccounts of checking to help
with budgeting. You can select a different type of assertion with the with budgeting. You can select a different type of assertion with the
balance-type rule: balance-type rule:
# balance assertions will consider all commodities and all subaccounts # balance assertions will consider all commodities and all subaccounts
@ -628,19 +683,19 @@ CSV RULES
TIPS TIPS
Rapid feedback Rapid feedback
It's a good idea to get rapid feedback while creating/troubleshooting It's a good idea to get rapid feedback while creating/troubleshooting
CSV rules. Here's a good way, using entr from http://eradman.com/entr- CSV rules. Here's a good way, using entr from http://eradman.com/entr-
project : project :
$ ls foo.csv* | entr bash -c 'echo ----; hledger -f foo.csv print desc:SOMEDESC' $ ls foo.csv* | entr bash -c 'echo ----; hledger -f foo.csv print desc:SOMEDESC'
A desc: query (eg) is used to select just one, or a few, transactions A desc: query (eg) is used to select just one, or a few, transactions
of interest. "bash -c" is used to run multiple commands, so we can of interest. "bash -c" is used to run multiple commands, so we can
echo a separator each time the command re-runs, making it easier to echo a separator each time the command re-runs, making it easier to
read the output. read the output.
Valid CSV Valid CSV
hledger accepts CSV conforming to RFC 4180. When CSV values are en- hledger accepts CSV conforming to RFC 4180. When CSV values are en-
closed in quotes, note: closed in quotes, note:
o they must be double quotes (not single quotes) o they must be double quotes (not single quotes)
@ -648,9 +703,9 @@ TIPS
o spaces outside the quotes are not allowed o spaces outside the quotes are not allowed
File Extension File Extension
CSV ("Character Separated Values") files should be named with one of CSV ("Character Separated Values") files should be named with one of
these filename extensions: .csv, .ssv, .tsv. Or, the file path should these filename extensions: .csv, .ssv, .tsv. Or, the file path should
be prefixed with one of csv:, ssv:, tsv:. This helps hledger identify be prefixed with one of csv:, ssv:, tsv:. This helps hledger identify
the format and show the right error messages. For example: the format and show the right error messages. For example:
$ hledger -f foo.ssv print $ hledger -f foo.ssv print
@ -662,44 +717,44 @@ TIPS
More about this: Input files in the hledger manual. More about this: Input files in the hledger manual.
Reading multiple CSV files Reading multiple CSV files
If you use multiple -f options to read multiple CSV files at once, If you use multiple -f options to read multiple CSV files at once,
hledger will look for a correspondingly-named rules file for each CSV hledger will look for a correspondingly-named rules file for each CSV
file. But if you use the --rules-file option, that rules file will be file. But if you use the --rules-file option, that rules file will be
used for all the CSV files. used for all the CSV files.
Valid transactions Valid transactions
After reading a CSV file, hledger post-processes and validates the gen- After reading a CSV file, hledger post-processes and validates the gen-
erated journal entries as it would for a journal file - balancing them, erated journal entries as it would for a journal file - balancing them,
applying balance assignments, and canonicalising amount styles. Any applying balance assignments, and canonicalising amount styles. Any
errors at this stage will be reported in the usual way, displaying the errors at this stage will be reported in the usual way, displaying the
problem entry. problem entry.
There is one exception: balance assertions, if you have generated them, There is one exception: balance assertions, if you have generated them,
will not be checked, since normally these will work only when the CSV will not be checked, since normally these will work only when the CSV
data is part of the main journal. If you do need to check balance as- data is part of the main journal. If you do need to check balance as-
sertions generated from CSV right away, pipe into another hledger: sertions generated from CSV right away, pipe into another hledger:
$ hledger -f file.csv print | hledger -f- print $ hledger -f file.csv print | hledger -f- print
Deduplicating, importing Deduplicating, importing
When you download a CSV file periodically, eg to get your latest bank When you download a CSV file periodically, eg to get your latest bank
transactions, the new file may overlap with the old one, containing transactions, the new file may overlap with the old one, containing
some of the same records. some of the same records.
The import command will (a) detect the new transactions, and (b) append The import command will (a) detect the new transactions, and (b) append
just those transactions to your main journal. It is idempotent, so you just those transactions to your main journal. It is idempotent, so you
don't have to remember how many times you ran it or with which version don't have to remember how many times you ran it or with which version
of the CSV. (It keeps state in a hidden .latest.FILE.csv file.) This of the CSV. (It keeps state in a hidden .latest.FILE.csv file.) This
is the easiest way to import CSV data. Eg: is the easiest way to import CSV data. Eg:
# download the latest CSV files, then run this command. # download the latest CSV files, then run this command.
# Note, no -f flags needed here. # Note, no -f flags needed here.
$ hledger import *.csv [--dry] $ hledger import *.csv [--dry]
This method works for most CSV files. (Where records have a stable This method works for most CSV files. (Where records have a stable
chronological order, and new records appear only at the new end.) chronological order, and new records appear only at the new end.)
A number of other tools and workflows, hledger-specific and otherwise, A number of other tools and workflows, hledger-specific and otherwise,
exist for converting, deduplicating, classifying and managing CSV data. exist for converting, deduplicating, classifying and managing CSV data.
See: See:
@ -710,43 +765,43 @@ TIPS
Setting amounts Setting amounts
A posting amount can be set in one of these ways: A posting amount can be set in one of these ways:
o by assigning (with a fields list or field assignment) to amountN o by assigning (with a fields list or field assignment) to amountN
(posting N's amount) or amount (posting 1's amount) (posting N's amount) or amount (posting 1's amount)
o by assigning to amountN-in and amountN-out (or amount-in and amount- o by assigning to amountN-in and amountN-out (or amount-in and amount-
out). For each CSV record, whichever of these has a non-zero value out). For each CSV record, whichever of these has a non-zero value
will be used, with appropriate sign. If both contain a non-zero will be used, with appropriate sign. If both contain a non-zero
value, this may not work. value, this may not work.
o by assigning to balanceN (or balance) instead of the above, setting o by assigning to balanceN (or balance) instead of the above, setting
the amount indirectly via a balance assignment. If you do this the the amount indirectly via a balance assignment. If you do this the
default account name may be wrong, so you should set that explicitly. default account name may be wrong, so you should set that explicitly.
There is some special handling for an amount's sign: There is some special handling for an amount's sign:
o If an amount value is parenthesised, it will be de-parenthesised and o If an amount value is parenthesised, it will be de-parenthesised and
sign-flipped. sign-flipped.
o If an amount value begins with a double minus sign, those cancel out o If an amount value begins with a double minus sign, those cancel out
and are removed. and are removed.
o If an amount value begins with a plus sign, that will be removed o If an amount value begins with a plus sign, that will be removed
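As a small illustration of the amount-in/amount-out option above, a
rules fragment for a hypothetical bank CSV with separate debit and
credit columns might look like this (the field order and account name
are illustrative assumptions, not from the manual):

  fields date, description, amount-out, amount-in
  account1 assets:bank:checking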
Setting currency/commodity Setting currency/commodity
If the currency/commodity symbol is included in the CSV's amount If the currency/commodity symbol is included in the CSV's amount
field(s), you don't have to do anything special. field(s), you don't have to do anything special.
If the currency is provided as a separate CSV field, you can either: If the currency is provided as a separate CSV field, you can either:
o assign that to currency, which adds it to all posting amounts. The o assign that to currency, which adds it to all posting amounts. The
symbol will be prepended to the amount quantity (on the left side). If symbol will be prepended to the amount quantity (on the left side). If
you write a trailing space after the symbol, there will be a space you write a trailing space after the symbol, there will be a space
between symbol and amount (an exception to the usual whitespace between symbol and amount (an exception to the usual whitespace
stripping). stripping).
o or assign it to currencyN which adds it to posting N's amount only. o or assign it to currencyN which adds it to posting N's amount only.
o or for more control, construct the amount from symbol and quantity o or for more control, construct the amount from symbol and quantity
using field assignment, eg: using field assignment, eg:
fields date,description,currency,quantity fields date,description,currency,quantity
@ -754,9 +809,9 @@ TIPS
amount %quantity %currency amount %quantity %currency
Referencing other fields Referencing other fields
In field assignments, you can interpolate only CSV fields, not hledger In field assignments, you can interpolate only CSV fields, not hledger
fields. In the example below, there's both a CSV field and a hledger fields. In the example below, there's both a CSV field and a hledger
field named amount1, but %amount1 always means the CSV field, not the field named amount1, but %amount1 always means the CSV field, not the
hledger field: hledger field:
# Name the third CSV field "amount1" # Name the third CSV field "amount1"
@ -768,7 +823,7 @@ TIPS
# Set comment to the CSV amount1 (not the amount1 assigned above) # Set comment to the CSV amount1 (not the amount1 assigned above)
comment %amount1 comment %amount1
Here, since there's no CSV amount1 field, %amount1 will produce a lit- Here, since there's no CSV amount1 field, %amount1 will produce a lit-
eral "amount1": eral "amount1":
fields date,description,csvamount fields date,description,csvamount
@ -776,7 +831,7 @@ TIPS
# Can't interpolate amount1 here # Can't interpolate amount1 here
comment %amount1 comment %amount1
When there are multiple field assignments to the same hledger field, When there are multiple field assignments to the same hledger field,
only the last one takes effect. Here, comment's value will be B, or only the last one takes effect. Here, comment's value will be B, or
C if "something" is matched, but never A: C if "something" is matched, but never A:
@ -786,14 +841,14 @@ TIPS
comment C comment C
How CSV rules are evaluated How CSV rules are evaluated
Here's how to think of CSV rules being evaluated (if you really need Here's how to think of CSV rules being evaluated (if you really need
to). First, to). First,
o include - all includes are inlined, from top to bottom, depth first. o include - all includes are inlined, from top to bottom, depth first.
(At each include point the file is inlined and scanned for further (At each include point the file is inlined and scanned for further
includes, recursively, before proceeding.) includes, recursively, before proceeding.)
Then "global" rules are evaluated, top to bottom. If a rule is re- Then "global" rules are evaluated, top to bottom. If a rule is re-
peated, the last one wins: peated, the last one wins:
o skip (at top level) o skip (at top level)
@ -807,30 +862,30 @@ TIPS
Then for each CSV record in turn: Then for each CSV record in turn:
o test all if blocks. If any of them contain an end rule, skip all re- o test all if blocks. If any of them contain an end rule, skip all re-
maining CSV records. Otherwise if any of them contain a skip rule, maining CSV records. Otherwise if any of them contain a skip rule,
skip that many CSV records. If there are multiple matched skip skip that many CSV records. If there are multiple matched skip
rules, the first one wins. rules, the first one wins.
o collect all field assignments at top level and in matched if blocks. o collect all field assignments at top level and in matched if blocks.
When there are multiple assignments for a field, keep only the last When there are multiple assignments for a field, keep only the last
one. one.
o compute a value for each hledger field - either the one that was as- o compute a value for each hledger field - either the one that was as-
signed to it (and interpolate the %CSVFIELDNAME references), or a de- signed to it (and interpolate the %CSVFIELDNAME references), or a de-
fault fault
o generate a synthetic hledger transaction from these values. o generate a synthetic hledger transaction from these values.
This is all part of the CSV reader, one of several readers hledger can This is all part of the CSV reader, one of several readers hledger can
use to parse input files. When all files have been read successfully, use to parse input files. When all files have been read successfully,
the transactions are passed as input to whichever hledger command the the transactions are passed as input to whichever hledger command the
user specified. user specified.
REPORTING BUGS REPORTING BUGS
Report bugs at http://bugs.hledger.org (or on the #hledger IRC channel Report bugs at http://bugs.hledger.org (or on the #hledger IRC channel
or hledger mail list) or hledger mail list)
@ -844,7 +899,7 @@ COPYRIGHT
SEE ALSO SEE ALSO
hledger(1), hledger-ui(1), hledger-web(1), hledger-api(1), hledger(1), hledger-ui(1), hledger-web(1), hledger-api(1),
hledger_csv(5), hledger_journal(5), hledger_timeclock(5), hledger_time- hledger_csv(5), hledger_journal(5), hledger_timeclock(5), hledger_time-
dot(5), ledger(1) dot(5), ledger(1)

View File

@ -72,7 +72,7 @@ reordered. See also the import command.
This command also supports the output destination and output format This command also supports the output destination and output format
options. The output formats supported are txt, csv, and (experimental) options. The output formats supported are txt, csv, and (experimental)
json. json and sql.
Here's an example of print's CSV output: Here's an example of print's CSV output:

View File

@ -1070,7 +1070,8 @@ $ hledger print -o - # write to stdout (the default)
Some commands (print, register, the balance commands) offer a choice of Some commands (print, register, the balance commands) offer a choice of
output format. output format.
In addition to the usual plain text format (\f[C]txt\f[R]), there are In addition to the usual plain text format (\f[C]txt\f[R]), there are
CSV (\f[C]csv\f[R]), HTML (\f[C]html\f[R]) and JSON (\f[C]json\f[R]). CSV (\f[C]csv\f[R]), HTML (\f[C]html\f[R]), JSON (\f[C]json\f[R]) and
SQL (\f[C]sql\f[R]).
This is controlled by the \f[C]-O/--output-format\f[R] option: This is controlled by the \f[C]-O/--output-format\f[R] option:
.IP .IP
.nf .nf
@ -1119,6 +1120,20 @@ your control.
We hope this approach will not cause problems in practice; if you find We hope this approach will not cause problems in practice; if you find
otherwise, please let us know. otherwise, please let us know.
(Cf #1195) (Cf #1195)
.PP
Notes about SQL output:
.IP \[bu] 2
SQL output is also marked experimental, and like JSON output it could
use real-world feedback.
.IP \[bu] 2
SQL output is expected to work with SQLite, MySQL and PostgreSQL.
.IP \[bu] 2
SQL output is structured with the expectation that the statements will
be executed in an empty database.
If you already have tables created by hledger\[aq]s SQL output, you
will probably want to either clear the existing data (with
\f[C]delete\f[R] or \f[C]truncate\f[R] SQL statements) or drop the
tables completely, as otherwise your postings will be duplicated.
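For example, one way to load the SQL output into a database (a sketch,
assuming an empty SQLite database and hypothetical file names):

  $ hledger -f 2020.journal print -O sql | sqlite3 hledger.db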
.SS Regular expressions .SS Regular expressions
.PP .PP
hledger uses regular expressions in a number of places: hledger uses regular expressions in a number of places:
@ -3763,7 +3778,7 @@ See also the import command.
.PP .PP
This command also supports the output destination and output format This command also supports the output destination and output format
options. The output formats supported are \f[C]txt\f[R], \f[C]csv\f[R], options. The output formats supported are \f[C]txt\f[R], \f[C]csv\f[R],
and (experimental) \f[C]json\f[R]. and (experimental) \f[C]json\f[R] and \f[C]sql\f[R].
.PP .PP
Here\[aq]s an example of print\[aq]s CSV output: Here\[aq]s an example of print\[aq]s CSV output:
.IP .IP

View File

@ -1007,8 +1007,8 @@ File: hledger.info, Node: Output format, Next: Regular expressions, Prev: Out
Some commands (print, register, the balance commands) offer a choice of Some commands (print, register, the balance commands) offer a choice of
output format. In addition to the usual plain text format ('txt'), output format. In addition to the usual plain text format ('txt'),
there are CSV ('csv'), HTML ('html') and JSON ('json'). This is there are CSV ('csv'), HTML ('html'), JSON ('json') and SQL ('sql').
controlled by the '-O/--output-format' option: This is controlled by the '-O/--output-format' option:
$ hledger print -O csv $ hledger print -O csv
@ -1039,6 +1039,20 @@ $ hledger balancesheet -o foo.txt -O html # write HTML to foo.txt
your control. We hope this approach will not cause problems in your control. We hope this approach will not cause problems in
practice; if you find otherwise, please let us know. (Cf #1195) practice; if you find otherwise, please let us know. (Cf #1195)
Notes about SQL output:
* SQL output is also marked experimental, and like JSON output it
could use real-world feedback.
* SQL output is expected to work with SQLite, MySQL and PostgreSQL.
* SQL output is structured with the expectation that the statements
will be executed in an empty database.  If you already have tables
created by hledger's SQL output, you will probably want to either
clear the existing data (with 'delete' or 'truncate' SQL statements)
or drop the tables completely, as otherwise your postings will be
duplicated.
 
File: hledger.info, Node: Regular expressions, Next: Smart dates, Prev: Output format, Up: OPTIONS File: hledger.info, Node: Regular expressions, Next: Smart dates, Prev: Output format, Up: OPTIONS
@ -3192,7 +3206,7 @@ reordered. See also the import command.
This command also supports the output destination and output format This command also supports the output destination and output format
options. The output formats supported are 'txt', 'csv', and options. The output formats supported are 'txt', 'csv', and
(experimental) 'json'. (experimental) 'json' and 'sql'.
Here's an example of print's CSV output: Here's an example of print's CSV output:
@ -3902,153 +3916,153 @@ Node: Output destination32183
Ref: #output-destination32335 Ref: #output-destination32335
Node: Output format32760 Node: Output format32760
Ref: #output-format32910 Ref: #output-format32910
Node: Regular expressions34492 Node: Regular expressions35077
Ref: #regular-expressions34649 Ref: #regular-expressions35234
Node: Smart dates36385 Node: Smart dates36970
Ref: #smart-dates36536 Ref: #smart-dates37121
Node: Report start & end date37897 Node: Report start & end date38482
Ref: #report-start-end-date38069 Ref: #report-start-end-date38654
Node: Report intervals39566 Node: Report intervals40151
Ref: #report-intervals39731 Ref: #report-intervals40316
Node: Period expressions40121 Node: Period expressions40706
Ref: #period-expressions40281 Ref: #period-expressions40866
Node: Depth limiting44417 Node: Depth limiting45002
Ref: #depth-limiting44561 Ref: #depth-limiting45146
Node: Pivoting44893 Node: Pivoting45478
Ref: #pivoting45016 Ref: #pivoting45601
Node: Valuation46692 Node: Valuation47277
Ref: #valuation46794 Ref: #valuation47379
Node: -B Cost47483 Node: -B Cost48068
Ref: #b-cost47587 Ref: #b-cost48172
Node: -V Value47720 Node: -V Value48305
Ref: #v-value47866 Ref: #v-value48451
Node: -X Value in specified commodity48061 Node: -X Value in specified commodity48646
Ref: #x-value-in-specified-commodity48260 Ref: #x-value-in-specified-commodity48845
Node: Valuation date48409 Node: Valuation date48994
Ref: #valuation-date48577 Ref: #valuation-date49162
Node: Market prices48987 Node: Market prices49572
Ref: #market-prices49167 Ref: #market-prices49752
Node: --infer-value market prices from transactions49944 Node: --infer-value market prices from transactions50529
Ref: #infer-value-market-prices-from-transactions50193 Ref: #infer-value-market-prices-from-transactions50778
Node: Valuation commodity51475 Node: Valuation commodity52060
Ref: #valuation-commodity51684 Ref: #valuation-commodity52269
Node: Simple valuation examples52910 Node: Simple valuation examples53495
Ref: #simple-valuation-examples53112 Ref: #simple-valuation-examples53697
Node: --value Flexible valuation53771 Node: --value Flexible valuation54356
Ref: #value-flexible-valuation53979 Ref: #value-flexible-valuation54564
Node: More valuation examples55926 Node: More valuation examples56511
Ref: #more-valuation-examples56135 Ref: #more-valuation-examples56720
Node: Effect of valuation on reports58140 Node: Effect of valuation on reports58725
Ref: #effect-of-valuation-on-reports58328 Ref: #effect-of-valuation-on-reports58913
Node: COMMANDS63849 Node: COMMANDS64434
Ref: #commands63957 Ref: #commands64542
Node: accounts65041 Node: accounts65626
Ref: #accounts65139 Ref: #accounts65724
Node: activity65838 Node: activity66423
Ref: #activity65948 Ref: #activity66533
Node: add66331 Node: add66916
Ref: #add66430 Ref: #add67015
Node: balance69169 Node: balance69754
Ref: #balance69280 Ref: #balance69865
Node: Classic balance report70738 Node: Classic balance report71323
Ref: #classic-balance-report70911 Ref: #classic-balance-report71496
Node: Customising the classic balance report72280 Node: Customising the classic balance report72865
Ref: #customising-the-classic-balance-report72508 Ref: #customising-the-classic-balance-report73093
Node: Colour support74584 Node: Colour support75169
Ref: #colour-support74751 Ref: #colour-support75336
Node: Flat mode74924 Node: Flat mode75509
Ref: #flat-mode75072 Ref: #flat-mode75657
Node: Depth limited balance reports75485 Node: Depth limited balance reports76070
Ref: #depth-limited-balance-reports75670 Ref: #depth-limited-balance-reports76255
Node: Percentages76126 Node: Percentages76711
Ref: #percentages76292 Ref: #percentages76877
Node: Multicolumn balance report77429 Node: Multicolumn balance report78014
Ref: #multicolumn-balance-report77609 Ref: #multicolumn-balance-report78194
Node: Budget report82871 Node: Budget report83456
Ref: #budget-report83014 Ref: #budget-report83599
Node: Nested budgets88280 Node: Nested budgets88865
Ref: #nested-budgets88392 Ref: #nested-budgets88977
Ref: #output-format-191873 Ref: #output-format-192458
Node: balancesheet92070 Node: balancesheet92655
Ref: #balancesheet92206 Ref: #balancesheet92791
Node: balancesheetequity93672 Node: balancesheetequity94257
Ref: #balancesheetequity93821 Ref: #balancesheetequity94406
Node: cashflow94544 Node: cashflow95129
Ref: #cashflow94672 Ref: #cashflow95257
Node: check-dates95851 Node: check-dates96436
Ref: #check-dates95978 Ref: #check-dates96563
Node: check-dupes96257 Node: check-dupes96842
Ref: #check-dupes96381 Ref: #check-dupes96966
Node: close96674 Node: close97259
Ref: #close96788 Ref: #close97373
Node: close usage98310 Node: close usage98895
Ref: #close-usage98403 Ref: #close-usage98988
Node: commodities101216 Node: commodities101801
Ref: #commodities101343 Ref: #commodities101928
Node: descriptions101425 Node: descriptions102010
Ref: #descriptions101553 Ref: #descriptions102138
Node: diff101734 Node: diff102319
Ref: #diff101840 Ref: #diff102425
Node: files102887 Node: files103472
Ref: #files102987 Ref: #files103572
Node: help103134 Node: help103719
Ref: #help103234 Ref: #help103819
Node: import104315 Node: import104900
Ref: #import104429 Ref: #import105014
Node: Importing balance assignments105322 Node: Importing balance assignments105907
Ref: #importing-balance-assignments105470 Ref: #importing-balance-assignments106055
Node: incomestatement106119 Node: incomestatement106704
Ref: #incomestatement106252 Ref: #incomestatement106837
Node: notes107739 Node: notes108324
Ref: #notes107852 Ref: #notes108437
Node: payees107978 Node: payees108563
Ref: #payees108084 Ref: #payees108669
Node: prices108242 Node: prices108827
Ref: #prices108348 Ref: #prices108933
Node: print108689 Node: print109274
Ref: #print108799 Ref: #print109384
Node: print-unique113585 Node: print-unique114180
Ref: #print-unique113711 Ref: #print-unique114306
Node: register113996 Node: register114591
Ref: #register114123 Ref: #register114718
Node: Custom register output118295 Node: Custom register output118890
Ref: #custom-register-output118424 Ref: #custom-register-output119019
Node: register-match119761 Node: register-match120356
Ref: #register-match119895 Ref: #register-match120490
Node: rewrite120246 Node: rewrite120841
Ref: #rewrite120361 Ref: #rewrite120956
Node: Re-write rules in a file122216 Node: Re-write rules in a file122811
Ref: #re-write-rules-in-a-file122350 Ref: #re-write-rules-in-a-file122945
Node: Diff output format123560 Node: Diff output format124155
Ref: #diff-output-format123729 Ref: #diff-output-format124324
Node: rewrite vs print --auto124821 Node: rewrite vs print --auto125416
Ref: #rewrite-vs.-print---auto125000 Ref: #rewrite-vs.-print---auto125595
Node: roi125556 Node: roi126151
Ref: #roi125654 Ref: #roi126249
Node: stats126666 Node: stats127261
Ref: #stats126765 Ref: #stats127360
Node: tags127553 Node: tags128148
Ref: #tags127651 Ref: #tags128246
Node: test127945 Node: test128540
Ref: #test128053 Ref: #test128648
Node: Add-on commands128800 Node: Add-on commands129395
Ref: #add-on-commands128917 Ref: #add-on-commands129512
Node: ui130260 Node: ui130855
Ref: #ui130348 Ref: #ui130943
Node: web130402 Node: web130997
Ref: #web130505 Ref: #web131100
Node: iadd130621 Node: iadd131216
Ref: #iadd130732 Ref: #iadd131327
Node: interest130814 Node: interest131409
Ref: #interest130921 Ref: #interest131516
Node: ENVIRONMENT131161 Node: ENVIRONMENT131756
Ref: #environment131273 Ref: #environment131868
Node: FILES132102 Node: FILES132697
Ref: #files-1132205 Ref: #files-1132800
Node: LIMITATIONS132418 Node: LIMITATIONS133013
Ref: #limitations132537 Ref: #limitations133132
Node: TROUBLESHOOTING133279 Node: TROUBLESHOOTING133874
Ref: #troubleshooting133392 Ref: #troubleshooting133987
 
End Tag Table End Tag Table

File diff suppressed because it is too large.