doc: regenerate embedded manuals

[ci skip]
Simon Michael 2017-11-28 17:20:41 -08:00
parent c433873e04
commit 4228203740
18 changed files with 1005 additions and 624 deletions

View File

@ -1,4 +1,4 @@
This is hledger-api.1.info, produced by makeinfo version 6.1 from stdin.
This is hledger-api.1.info, produced by makeinfo version 6.0 from stdin.

File: hledger-api.1.info, Node: Top, Next: OPTIONS, Up: (dir)

View File

@ -8,19 +8,77 @@
CSV \- how hledger reads CSV data, and the CSV rules file format
.SH DESCRIPTION
.PP
hledger can read CSV files, converting each CSV record into a journal
entry (transaction), if you provide some conversion hints in a "rules
file".
This file should be named like the CSV file with an additional
\f[C]\&.rules\f[] suffix (eg: \f[C]mybank.csv.rules\f[]); or, you can
specify the file with \f[C]\-\-rules\-file\ PATH\f[].
hledger will create it if necessary, with some default rules which
you\[aq]ll need to adjust.
At minimum, the rules file must specify the \f[C]date\f[] and
\f[C]amount\f[] fields.
For an example, see Cookbook: convert CSV files.
hledger can read CSV (comma\-separated value) files as if they were
journal files, automatically converting each CSV record into a
transaction.
(To learn about \f[I]writing\f[] CSV, see CSV output.)
.PP
To learn about \f[I]exporting\f[] CSV, see CSV output.
Converting CSV to transactions requires some special conversion rules.
These do several things:
.IP \[bu] 2
they describe the layout and format of the CSV data
.IP \[bu] 2
they can customize the generated journal entries using a simple
templating language
.IP \[bu] 2
they can add refinements based on patterns in the CSV data, eg
categorizing transactions with more detailed account names.
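.PP
For instance, a conditional block (described under CSV RULES below) can
assign a more specific account when a record matches a pattern; a
minimal sketch, using an illustrative pattern and account name:
.IP
.nf
\f[C]
#\ when\ a\ CSV\ record\ contains\ "whole\ foods",\ set\ the\ second\ account
if\ whole\ foods
\ account2\ expenses:groceries
\f[]
.fi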
.PP
When reading a CSV file named \f[C]FILE.csv\f[], hledger looks for a
conversion rules file named \f[C]FILE.csv.rules\f[] in the same
directory.
You can override this with the \f[C]\-\-rules\-file\f[] option.
If the rules file does not exist, hledger will auto\-create one with
some example rules, which you\[aq]ll need to adjust.
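.PP
For example, one rules file can be shared by several CSV files by naming
it explicitly (the file names here are hypothetical):
.IP
.nf
\f[C]
$\ hledger\ print\ \-f\ checking.csv\ \-\-rules\-file\ shared.rules
\f[]
.fi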
.PP
At minimum, the rules file must identify the \f[C]date\f[] and
\f[C]amount\f[] fields.
It may also be necessary to specify the date format, and the number of
header lines to skip.
Eg:
.IP
.nf
\f[C]
fields\ date,\ _,\ _,\ amount
date\-format\ \ %d/%m/%Y
skip\ 1
\f[]
.fi
.PP
A more complete example:
.IP
.nf
\f[C]
#\ hledger\ CSV\ rules\ for\ amazon.com\ order\ history
#\ sample:
#\ "Date","Type","To/From","Name","Status","Amount","Fees","Transaction\ ID"
#\ "Jul\ 29,\ 2012","Payment","To","Adapteva,\ Inc.","Completed","$25.00","$0.00","17LA58JSK6PRD4HDGLNJQPI1PB9N8DKPVHL"
#\ skip\ one\ header\ line
skip\ 1
#\ name\ the\ csv\ fields\ (and\ assign\ the\ transaction\[aq]s\ date,\ amount\ and\ code)
fields\ date,\ _,\ toorfrom,\ name,\ amzstatus,\ amount,\ fees,\ code
#\ how\ to\ parse\ the\ date
date\-format\ %b\ %\-d,\ %Y
#\ combine\ two\ fields\ to\ make\ the\ description
description\ %toorfrom\ %name
#\ save\ these\ fields\ as\ tags
comment\ \ \ \ \ status:%amzstatus,\ fees:%fees
#\ set\ the\ base\ account\ for\ all\ transactions
account1\ \ \ \ assets:amazon
#\ flip\ the\ sign\ on\ the\ amount
amount\ \ \ \ \ \ \-%amount
\f[]
.fi
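.PP
With rules like the above saved as (say)
\f[C]amazon\-orders.csv.rules\f[] alongside
\f[C]amazon\-orders.csv\f[] (illustrative file names), a register report
could be produced directly from the CSV:
.IP
.nf
\f[C]
$\ hledger\ \-f\ amazon\-orders.csv\ register
\f[]
.fi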
.PP
For more examples, see Convert CSV files.
.SH CSV RULES
.PP
The following seven kinds of rule can appear in the rules file, in any

View File

@ -1,4 +1,4 @@
This is hledger_csv.5.info, produced by makeinfo version 6.1 from stdin.
This is hledger_csv.5.info, produced by makeinfo version 6.0 from stdin.

File: hledger_csv.5.info, Node: Top, Next: CSV RULES, Up: (dir)
@ -6,16 +6,63 @@ File: hledger_csv.5.info, Node: Top, Next: CSV RULES, Up: (dir)
hledger_csv(5) hledger 1.4
**************************
hledger can read CSV files, converting each CSV record into a journal
entry (transaction), if you provide some conversion hints in a "rules
file". This file should be named like the CSV file with an additional
'.rules' suffix (eg: 'mybank.csv.rules'); or, you can specify the file
with '--rules-file PATH'. hledger will create it if necessary, with
some default rules which you'll need to adjust. At minimum, the rules
file must specify the 'date' and 'amount' fields. For an example, see
Cookbook: convert CSV files.
hledger can read CSV (comma-separated value) files as if they were
journal files, automatically converting each CSV record into a
transaction. (To learn about _writing_ CSV, see CSV output.)
To learn about _exporting_ CSV, see CSV output.
Converting CSV to transactions requires some special conversion
rules. These do several things:
* they describe the layout and format of the CSV data
* they can customize the generated journal entries using a simple
templating language
* they can add refinements based on patterns in the CSV data, eg
categorizing transactions with more detailed account names.
When reading a CSV file named 'FILE.csv', hledger looks for a
conversion rules file named 'FILE.csv.rules' in the same directory. You
can override this with the '--rules-file' option. If the rules file
does not exist, hledger will auto-create one with some example rules,
which you'll need to adjust.
At minimum, the rules file must identify the 'date' and 'amount'
fields. It may also be necessary to specify the date format, and the
number of header lines to skip. Eg:
fields date, _, _, amount
date-format %d/%m/%Y
skip 1
A more complete example:
# hledger CSV rules for amazon.com order history
# sample:
# "Date","Type","To/From","Name","Status","Amount","Fees","Transaction ID"
# "Jul 29, 2012","Payment","To","Adapteva, Inc.","Completed","$25.00","$0.00","17LA58JSK6PRD4HDGLNJQPI1PB9N8DKPVHL"
# skip one header line
skip 1
# name the csv fields (and assign the transaction's date, amount and code)
fields date, _, toorfrom, name, amzstatus, amount, fees, code
# how to parse the date
date-format %b %-d, %Y
# combine two fields to make the description
description %toorfrom %name
# save these fields as tags
comment status:%amzstatus, fees:%fees
# set the base account for all transactions
account1 assets:amazon
# flip the sign on the amount
amount -%amount
For more examples, see Convert CSV files.
* Menu:
* CSV RULES::
@ -270,33 +317,33 @@ one rules file will be used for all the CSV files being read.

Tag Table:
Node: Top74
Node: CSV RULES810
Ref: #csv-rules920
Node: skip1182
Ref: #skip1278
Node: date-format1450
Ref: #date-format1579
Node: field list2085
Ref: #field-list2224
Node: field assignment2929
Ref: #field-assignment3086
Node: conditional block3590
Ref: #conditional-block3746
Node: include4642
Ref: #include4774
Node: newest-first5005
Ref: #newest-first5121
Node: CSV TIPS5532
Ref: #csv-tips5628
Node: CSV ordering5746
Ref: #csv-ordering5866
Node: CSV accounts6047
Ref: #csv-accounts6187
Node: CSV amounts6441
Ref: #csv-amounts6589
Node: CSV balance assertions7364
Ref: #csv-balance-assertions7548
Node: Reading multiple CSV files7753
Ref: #reading-multiple-csv-files7925
Node: CSV RULES2165
Ref: #csv-rules2275
Node: skip2537
Ref: #skip2633
Node: date-format2805
Ref: #date-format2934
Node: field list3440
Ref: #field-list3579
Node: field assignment4284
Ref: #field-assignment4441
Node: conditional block4945
Ref: #conditional-block5101
Node: include5997
Ref: #include6129
Node: newest-first6360
Ref: #newest-first6476
Node: CSV TIPS6887
Ref: #csv-tips6983
Node: CSV ordering7101
Ref: #csv-ordering7221
Node: CSV accounts7402
Ref: #csv-accounts7542
Node: CSV amounts7796
Ref: #csv-amounts7944
Node: CSV balance assertions8719
Ref: #csv-balance-assertions8903
Node: Reading multiple CSV files9108
Ref: #reading-multiple-csv-files9280

End Tag Table

View File

@ -7,16 +7,65 @@ NAME
CSV - how hledger reads CSV data, and the CSV rules file format
DESCRIPTION
hledger can read CSV files, converting each CSV record into a journal
entry (transaction), if you provide some conversion hints in a "rules
file". This file should be named like the CSV file with an additional
.rules suffix (eg: mybank.csv.rules); or, you can specify the file with
--rules-file PATH. hledger will create it if necessary, with some
default rules which you'll need to adjust. At minimum, the rules file
must specify the date and amount fields. For an example, see Cookbook:
convert CSV files.
hledger can read CSV (comma-separated value) files as if they were
journal files, automatically converting each CSV record into a transac-
tion. (To learn about writing CSV, see CSV output.)
To learn about exporting CSV, see CSV output.
Converting CSV to transactions requires some special conversion rules.
These do several things:
o they describe the layout and format of the CSV data
o they can customize the generated journal entries using a simple tem-
plating language
o they can add refinements based on patterns in the CSV data, eg cate-
gorizing transactions with more detailed account names.
When reading a CSV file named FILE.csv, hledger looks for a conversion
rules file named FILE.csv.rules in the same directory. You can over-
ride this with the --rules-file option. If the rules file does not
exist, hledger will auto-create one with some example rules, which
you'll need to adjust.
At minimum, the rules file must identify the date and amount fields.
It may also be necessary to specify the date format, and the number of
header lines to skip. Eg:
fields date, _, _, amount
date-format %d/%m/%Y
skip 1
A more complete example:
# hledger CSV rules for amazon.com order history
# sample:
# "Date","Type","To/From","Name","Status","Amount","Fees","Transaction ID"
# "Jul 29, 2012","Payment","To","Adapteva, Inc.","Completed","$25.00","$0.00","17LA58JSK6PRD4HDGLNJQPI1PB9N8DKPVHL"
# skip one header line
skip 1
# name the csv fields (and assign the transaction's date, amount and code)
fields date, _, toorfrom, name, amzstatus, amount, fees, code
# how to parse the date
date-format %b %-d, %Y
# combine two fields to make the description
description %toorfrom %name
# save these fields as tags
comment status:%amzstatus, fees:%fees
# set the base account for all transactions
account1 assets:amazon
# flip the sign on the amount
amount -%amount
For more examples, see Convert CSV files.
CSV RULES
The following seven kinds of rule can appear in the rules file, in any

View File

@ -384,7 +384,26 @@ digit groups (thousands, or any other grouping) can be separated by
commas (in which case period is used for decimal point) or periods (in
which case comma is used for decimal point)
.PP
You can use any of these variations when recording data, but when
You can use any of these variations when recording data.
However, some representations are ambiguous: \f[C]$1.000\f[] and
\f[C]$1,000\f[] may each mean either one thousand or one dollar.
By default hledger assumes that such a sole separator is a decimal
point.
A commodity format declared earlier in the file can resolve the
ambiguity differently:
.IP
.nf
\f[C]
commodity\ $1,000.00
2017/12/25\ New\ life\ of\ Scrooge
\ \ \ \ expenses:gifts\ \ $1,000
\ \ \ \ assets
\f[]
.fi
.PP
Though a journal may contain amounts written in mixed styles, when
displaying amounts hledger will choose a consistent format for each
commodity.
(Except for price amounts, which are always formatted as written).
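.PP
As a minimal illustration (hypothetical entries), if the same commodity
is recorded with different precisions:
.IP
.nf
\f[C]
2017/1/1\ coffee
\ \ \ \ expenses:food\ \ \ \ \ \ \ \ $2.50
\ \ \ \ assets:cash

2017/1/2\ lunch
\ \ \ \ expenses:food\ \ \ \ \ \ \ \ $10
\ \ \ \ assets:cash
\f[]
.fi
.PP
then a report such as \f[C]hledger\ balance\ expenses\f[] would be
expected to display both amounts with two decimal places
(\f[C]$2.50\f[] and \f[C]$10.00\f[], \f[C]$12.50\f[] in total).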
@ -716,9 +735,9 @@ P\ 2010/1/1\ €\ $1.40
.SS Comments
.PP
Lines in the journal beginning with a semicolon (\f[C];\f[]) or hash
(\f[C]#\f[]) or asterisk (\f[C]*\f[]) are comments, and will be ignored.
(Asterisk comments make it easy to treat your journal like an org\-mode
outline in emacs.)
(\f[C]#\f[]) or star (\f[C]*\f[]) are comments, and will be ignored.
(Star comments cause org\-mode nodes to be ignored, allowing emacs users
to fold and navigate their journals with org\-mode or orgstruct\-mode.)
.PP
Also, anything between \f[C]comment\f[] and \f[C]end\ comment\f[]
directives is a (multi\-line) comment.
@ -730,20 +749,22 @@ description and/or indented on the following lines (before the
postings).
Similarly, you can attach comments to an individual posting by writing
them after the amount and/or indented on the following lines.
Transaction and posting comments must begin with a semicolon
(\f[C];\f[]).
.PP
Some examples:
.IP
.nf
\f[C]
#\ a\ journal\ comment
#\ a\ file\ comment
;\ also\ a\ journal\ comment
;\ also\ a\ file\ comment
comment
This\ is\ a\ multiline\ comment,
This\ is\ a\ multiline\ file\ comment,
which\ continues\ until\ a\ line
where\ the\ "end\ comment"\ string
appears\ on\ its\ own.
appears\ on\ its\ own\ (or\ end\ of\ file).
end\ comment
2012/5/14\ something\ \ ;\ a\ transaction\ comment
@ -752,7 +773,7 @@ end\ comment
\ \ \ \ posting2
\ \ \ \ ;\ a\ comment\ for\ posting\ 2
\ \ \ \ ;\ another\ comment\ line\ for\ posting\ 2
;\ a\ journal\ comment\ (because\ not\ indented)
;\ a\ file\ comment\ (because\ not\ indented)
\f[]
.fi
.SS Tags
@ -1038,7 +1059,7 @@ commodity\-less amounts, or until the next D directive.
D\ $1,000.00
1/1
\ \ a\ \ \ \ \ 5\ \ \ \ #\ <\-\ commodity\-less\ amount,\ becomes\ $1
\ \ a\ \ \ \ \ 5\ \ \ \ ;\ <\-\ commodity\-less\ amount,\ becomes\ $1
\ \ b
\f[]
.fi

View File

@ -1,4 +1,4 @@
This is hledger_journal.5.info, produced by makeinfo version 6.1 from
This is hledger_journal.5.info, produced by makeinfo version 6.0 from
stdin.

@ -361,7 +361,20 @@ commodity name. Some examples:
commas (in which case period is used for decimal point) or periods
(in which case comma is used for decimal point)
You can use any of these variations when recording data, but when
You can use any of these variations when recording data. However, some
representations are ambiguous: '$1.000' and '$1,000' may each mean
either one thousand or one dollar. By default hledger assumes that such
a sole separator is a decimal point. A commodity format declared
earlier in the file can resolve the ambiguity differently:
commodity $1,000.00
2017/12/25 New life of Scrooge
expenses:gifts $1,000
assets
Though a journal may contain amounts written in mixed styles, when
displaying amounts hledger will choose a consistent format for each
commodity. (Except for price amounts, which are always formatted as
written). The display format is chosen as follows:
@ -684,8 +697,9 @@ File: hledger_journal.5.info, Node: Comments, Next: Tags, Prev: Prices, Up:
=============
Lines in the journal beginning with a semicolon (';') or hash ('#') or
asterisk ('*') are comments, and will be ignored. (Asterisk comments
make it easy to treat your journal like an org-mode outline in emacs.)
star ('*') are comments, and will be ignored. (Star comments cause
org-mode nodes to be ignored, allowing emacs users to fold and navigate
their journals with org-mode or orgstruct-mode.)
Also, anything between 'comment' and 'end comment' directives is a
(multi-line) comment. If there is no 'end comment', the comment extends
@ -695,18 +709,19 @@ to the end of the file.
description and/or indented on the following lines (before the
postings). Similarly, you can attach comments to an individual posting
by writing them after the amount and/or indented on the following lines.
Transaction and posting comments must begin with a semicolon (';').
Some examples:
# a journal comment
# a file comment
; also a journal comment
; also a file comment
comment
This is a multiline comment,
This is a multiline file comment,
which continues until a line
where the "end comment" string
appears on its own.
appears on its own (or end of file).
end comment
2012/5/14 something ; a transaction comment
@ -715,7 +730,7 @@ end comment
posting2
; a comment for posting 2
; another comment line for posting 2
; a journal comment (because not indented)
; a file comment (because not indented)

File: hledger_journal.5.info, Node: Tags, Next: Directives, Prev: Comments, Up: FILE FORMAT
@ -992,7 +1007,7 @@ amounts, or until the next D directive.
D $1,000.00
1/1
a 5 # <- commodity-less amount, becomes $1
a 5 ; <- commodity-less amount, becomes $1
b

@ -1087,61 +1102,61 @@ Node: Account names11207
Ref: #account-names11352
Node: Amounts11839
Ref: #amounts11977
Node: Virtual Postings14078
Ref: #virtual-postings14239
Node: Balance Assertions15459
Ref: #balance-assertions15636
Node: Assertions and ordering16532
Ref: #assertions-and-ordering16720
Node: Assertions and included files17420
Ref: #assertions-and-included-files17663
Node: Assertions and multiple -f options17996
Ref: #assertions-and-multiple--f-options18252
Node: Assertions and commodities18384
Ref: #assertions-and-commodities18621
Node: Assertions and subaccounts19317
Ref: #assertions-and-subaccounts19551
Node: Assertions and virtual postings20072
Ref: #assertions-and-virtual-postings20281
Node: Balance Assignments20423
Ref: #balance-assignments20594
Node: Prices21713
Ref: #prices21848
Node: Transaction prices21899
Ref: #transaction-prices22046
Node: Market prices24202
Ref: #market-prices24339
Node: Comments25299
Ref: #comments25423
Node: Tags26536
Ref: #tags26656
Node: Directives28058
Ref: #directives28173
Node: Account aliases28366
Ref: #account-aliases28512
Node: Basic aliases29116
Ref: #basic-aliases29261
Node: Regex aliases29951
Ref: #regex-aliases30121
Node: Multiple aliases30839
Ref: #multiple-aliases31013
Node: end aliases31511
Ref: #end-aliases31653
Node: account directive31754
Ref: #account-directive31936
Node: apply account directive32232
Ref: #apply-account-directive32430
Node: Multi-line comments33089
Ref: #multi-line-comments33281
Node: commodity directive33409
Ref: #commodity-directive33595
Node: Default commodity34467
Ref: #default-commodity34642
Node: Default year35179
Ref: #default-year35346
Node: Including other files35769
Ref: #including-other-files35928
Node: EDITOR SUPPORT36325
Ref: #editor-support36445
Node: Virtual Postings14568
Ref: #virtual-postings14729
Node: Balance Assertions15949
Ref: #balance-assertions16126
Node: Assertions and ordering17022
Ref: #assertions-and-ordering17210
Node: Assertions and included files17910
Ref: #assertions-and-included-files18153
Node: Assertions and multiple -f options18486
Ref: #assertions-and-multiple--f-options18742
Node: Assertions and commodities18874
Ref: #assertions-and-commodities19111
Node: Assertions and subaccounts19807
Ref: #assertions-and-subaccounts20041
Node: Assertions and virtual postings20562
Ref: #assertions-and-virtual-postings20771
Node: Balance Assignments20913
Ref: #balance-assignments21084
Node: Prices22203
Ref: #prices22338
Node: Transaction prices22389
Ref: #transaction-prices22536
Node: Market prices24692
Ref: #market-prices24829
Node: Comments25789
Ref: #comments25913
Node: Tags27155
Ref: #tags27275
Node: Directives28677
Ref: #directives28792
Node: Account aliases28985
Ref: #account-aliases29131
Node: Basic aliases29735
Ref: #basic-aliases29880
Node: Regex aliases30570
Ref: #regex-aliases30740
Node: Multiple aliases31458
Ref: #multiple-aliases31632
Node: end aliases32130
Ref: #end-aliases32272
Node: account directive32373
Ref: #account-directive32555
Node: apply account directive32851
Ref: #apply-account-directive33049
Node: Multi-line comments33708
Ref: #multi-line-comments33900
Node: commodity directive34028
Ref: #commodity-directive34214
Node: Default commodity35086
Ref: #default-commodity35261
Node: Default year35798
Ref: #default-year35965
Node: Including other files36388
Ref: #including-other-files36547
Node: EDITOR SUPPORT36944
Ref: #editor-support37064

End Tag Table

View File

@ -181,6 +181,7 @@ FILE FORMAT
description or posting account name, separated from it by a space,
indicating one of three statuses:
mark status
------------------
unmarked
@ -206,6 +207,7 @@ FILE FORMAT
What "uncleared", "pending", and "cleared" actually mean is up to you.
Here's one suggestion:
status meaning
--------------------------------------------------------------------------
uncleared recorded but not yet reconciled; needs review
@ -276,7 +278,20 @@ FILE FORMAT
commas (in which case period is used for decimal point) or periods
(in which case comma is used for decimal point)
You can use any of these variations when recording data, but when
You can use any of these variations when recording data. However, some
representations are ambiguous: $1.000 and $1,000 may each mean either
one thousand or one dollar. By default hledger assumes that such a sole
separator is a decimal point. A commodity format declared earlier in
the file can resolve the ambiguity differently:
commodity $1,000.00
2017/12/25 New life of Scrooge
expenses:gifts $1,000
assets
Though a journal may contain amounts written in mixed styles, when
displaying amounts hledger will choose a consistent format for each
commodity. (Except for price amounts, which are always formatted as
written). The display format is chosen as follows:
@ -521,9 +536,10 @@ FILE FORMAT
P 2010/1/1 $1.40
Comments
Lines in the journal beginning with a semicolon (;) or hash (#) or
asterisk (*) are comments, and will be ignored. (Asterisk comments
make it easy to treat your journal like an org-mode outline in emacs.)
Lines in the journal beginning with a semicolon (;) or hash (#) or star
(*) are comments, and will be ignored. (Star comments cause org-mode
nodes to be ignored, allowing emacs users to fold and navigate their
journals with org-mode or orgstruct-mode.)
Also, anything between comment and end comment directives is a
(multi-line) comment. If there is no end comment, the comment extends
@ -533,18 +549,19 @@ FILE FORMAT
description and/or indented on the following lines (before the post-
ings). Similarly, you can attach comments to an individual posting by
writing them after the amount and/or indented on the following lines.
Transaction and posting comments must begin with a semicolon (;).
Some examples:
# a journal comment
# a file comment
; also a journal comment
; also a file comment
comment
This is a multiline comment,
This is a multiline file comment,
which continues until a line
where the "end comment" string
appears on its own.
appears on its own (or end of file).
end comment
2012/5/14 something ; a transaction comment
@ -553,7 +570,7 @@ FILE FORMAT
posting2
; a comment for posting 2
; another comment line for posting 2
; a journal comment (because not indented)
; a file comment (because not indented)
Tags
Tags are a way to add extra labels or labelled data to postings and
@ -758,7 +775,7 @@ FILE FORMAT
D $1,000.00
1/1
a 5 # <- commodity-less amount, becomes $1
a 5 ; <- commodity-less amount, becomes $1
b
Default year
@ -803,6 +820,7 @@ EDITOR SUPPORT
These were written with Ledger in mind, but also work with hledger
files:
Emacs http://www.ledger-cli.org/3.0/doc/ledger-mode.html
Vim https://github.com/ledger/ledger/wiki/Getting-started

View File

@ -1,4 +1,4 @@
This is hledger_timeclock.5.info, produced by makeinfo version 6.1 from
This is hledger_timeclock.5.info, produced by makeinfo version 6.0 from
stdin.


View File

@ -1,4 +1,4 @@
This is hledger_timedot.5.info, produced by makeinfo version 6.1 from
This is hledger_timedot.5.info, produced by makeinfo version 6.0 from
stdin.


View File

@ -272,6 +272,11 @@ troubleshooting.
updated file.
This allows some basic data entry.
.PP
\f[C]A\f[] is like \f[C]a\f[], but runs the hledger\-iadd tool, which
provides a curses\-style interface.
This key will be available if \f[C]hledger\-iadd\f[] is installed in
$PATH.
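.PP
For example, one way to install it (assuming a stack\-based setup;
other installation methods exist):
.IP
.nf
\f[C]
$\ stack\ install\ hledger\-iadd
\f[]
.fi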
.PP
\f[C]E\f[] runs $HLEDGER_UI_EDITOR, or $EDITOR, or a default
(\f[C]emacsclient\ \-a\ ""\ \-nw\f[]) on the journal file.
With some editors (emacs, vi), the cursor will be positioned at the

View File

@ -1,4 +1,4 @@
This is hledger-ui.1.info, produced by makeinfo version 6.1 from stdin.
This is hledger-ui.1.info, produced by makeinfo version 6.0 from stdin.

File: hledger-ui.1.info, Node: Top, Next: OPTIONS, Up: (dir)
@ -207,6 +207,10 @@ temporarily can be useful for troubleshooting.
'a' runs command-line hledger's add command, and reloads the updated
file. This allows some basic data entry.
'A' is like 'a', but runs the hledger-iadd tool, which provides a
curses-style interface. This key will be available if 'hledger-iadd' is
installed in $PATH.
'E' runs $HLEDGER_UI_EDITOR, or $EDITOR, or a default ('emacsclient
-a "" -nw') on the journal file. With some editors (emacs, vi), the
cursor will be positioned at the current transaction when invoked from
@ -369,15 +373,15 @@ Node: OPTIONS825
Ref: #options924
Node: KEYS3861
Ref: #keys3958
Node: SCREENS6754
Ref: #screens6841
Node: Accounts screen6931
Ref: #accounts-screen7061
Node: Register screen9291
Ref: #register-screen9448
Node: Transaction screen11522
Ref: #transaction-screen11682
Node: Error screen12552
Ref: #error-screen12676
Node: SCREENS6917
Ref: #screens7004
Node: Accounts screen7094
Ref: #accounts-screen7224
Node: Register screen9454
Ref: #register-screen9611
Node: Transaction screen11685
Ref: #transaction-screen11845
Node: Error screen12715
Ref: #error-screen12839

End Tag Table

View File

@ -195,6 +195,10 @@ KEYS
a runs command-line hledger's add command, and reloads the updated
file. This allows some basic data entry.
A is like a, but runs the hledger-iadd tool, which provides a
curses-style interface. This key will be available if hledger-iadd is
installed in $PATH.
E runs $HLEDGER_UI_EDITOR, or $EDITOR, or a default (emac-
sclient -a "" -nw) on the journal file. With some editors (emacs, vi),
the cursor will be positioned at the current transaction when invoked

View File

@ -1,4 +1,4 @@
This is hledger-web.1.info, produced by makeinfo version 6.1 from stdin.
This is hledger-web.1.info, produced by makeinfo version 6.0 from stdin.

File: hledger-web.1.info, Node: Top, Next: OPTIONS, Up: (dir)

View File

@ -721,11 +721,32 @@ T{
T}
.TE
.PP
Note that \f[C]weekly\f[], \f[C]monthly\f[], \f[C]quarterly\f[] and
\f[C]yearly\f[] intervals will always start on the first day of the
week, month, quarter or year respectively, and will end on the last day
of that period, even if the period expression specifies a different
explicit start or end date.
.SS For example:
.PP
\f[C]\-p\ "weekly\ from\ 2009/1/1\ to\ 2009/4/1"\f[] \-\- starts on
2008/12/29, closest preceeding Monday
\f[C]\-p\ "monthly\ in\ 2008/11/25"\f[] \-\- starts on 2018/11/01
.PD 0
.P
.PD
\f[C]\-p\ "quarterly\ from\ 2009\-05\-05\ to\ 2009\-06\-01"\f[] \-
starts on 2009/04/01, ends on 2009/06/30, which are first and last days
of Q2 2009 \f[C]\-p\ "yearly\ from\ 2009\-12\-29"\f[] \- starts on
2009/01/01, first day of 2009
\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-
.PP
The following more complex report intervals are also supported:
\f[C]biweekly\f[], \f[C]bimonthly\f[],
\f[C]every\ N\ days|weeks|months|quarters|years\f[],
\f[C]every\ Nth\ day\ [of\ month]\f[],
\f[C]every\ Nth\ day\ of\ week\f[].
\f[C]every\ day|week|month|quarter|year\f[],
\f[C]every\ N\ days|weeks|months|quarters|years\f[].
.PP
All of these will start on the first day of the requested period and end
on the last one, as described above.
.PP
Examples:
.PP
@ -733,13 +754,56 @@ Examples:
tab(@);
l.
T{
\f[C]\-p\ "bimonthly\ from\ 2008"\f[]
\f[C]\-p\ "bimonthly\ from\ 2008"\f[] \-\- periods will have boundaries
on 2008/01/01, 2008/03/01, ...
T}
T{
\f[C]\-p\ "every\ 2\ weeks"\f[]
\f[C]\-p\ "every\ 2\ weeks"\f[] \-\- starts on closest preceeding Monday
T}
T{
\f[C]\-p\ "every\ 5\ days\ from\ 1/3"\f[]
\f[C]\-p\ "every\ 5\ month\ from\ 2009/03"\f[] \-\- periods will have
boundaries on 2009/03/01, 2009/08/01, ...
T}
.TE
.PP
If you want intervals that start on an arbitrary day of your choosing
and span a week, month or year, use one of the following:
.PP
\f[C]every\ Nth\ day\ of\ week\f[], \f[C]every\ <weekday>\f[],
\f[C]every\ Nth\ day\ [of\ month]\f[],
\f[C]every\ Nth\ weekday\ [of\ month]\f[],
\f[C]every\ MM/DD\ [of\ year]\f[], \f[C]every\ Nth\ MMM\ [of\ year]\f[],
\f[C]every\ MMM\ Nth\ [of\ year]\f[].
.PP
Examples:
.PP
.TS
tab(@);
l.
T{
\f[C]\-p\ "every\ 2nd\ day\ of\ week"\f[] \-\- periods will go from Tue
to Tue
T}
T{
\f[C]\-p\ "every\ Tue"\f[] \-\- same
T}
T{
\f[C]\-p\ "every\ 15th\ day"\f[] \-\- period boundaries will be on 15th
of each month
T}
T{
\f[C]\-p\ "every\ 2nd\ Monday"\f[] \-\- period boundaries will be on
second Monday of each month
T}
T{
\f[C]\-p\ "every\ 11/05"\f[] \-\- yearly periods with boundaries on 5th
of Nov
T}
T{
\f[C]\-p\ "every\ 5th\ Nov"\f[] \-\- same
T}
T{
\f[C]\-p\ "every\ Nov\ 5th"\f[] \-\- same
T}
.TE
.PP

View File

@ -1,4 +1,4 @@
This is hledger.1.info, produced by makeinfo version 6.1 from stdin.
This is hledger.1.info, produced by makeinfo version 6.0 from stdin.

File: hledger.1.info, Node: Top, Next: EXAMPLES, Up: (dir)
@ -125,6 +125,7 @@ File: hledger.1.info, Node: OPTIONS, Next: QUERIES, Prev: EXAMPLES, Up: Top
* Report start & end date::
* Report intervals::
* Period expressions::
* For example::
* Depth limiting::
* Pivoting::
* Cost::
@ -432,7 +433,7 @@ complex intervals may be specified with a period expression. Report
intervals can not be specified with a query, currently.

File: hledger.1.info, Node: Period expressions, Next: Depth limiting, Prev: Report intervals, Up: OPTIONS
File: hledger.1.info, Node: Period expressions, Next: For example, Prev: Report intervals, Up: OPTIONS
2.10 Period expressions
=======================
@ -486,15 +487,54 @@ start/end dates (if any), the word 'in' is optional. Examples:
'-p "monthly in 2008"'
'-p "quarterly"'
Note that 'weekly', 'monthly', 'quarterly' and 'yearly' intervals
will always start on the first day of the week, month, quarter or year
respectively, and will end on the last day of that period, even if the
period expression specifies a different explicit start or end date.

File: hledger.1.info, Node: For example, Next: Depth limiting, Prev: Period expressions, Up: OPTIONS
2.11 For example:
=================
'-p "weekly from 2009/1/1 to 2009/4/1"' - starts on 2008/12/29, closest
preceeding Monday '-p "monthly in 2008/11/25"' - starts on 2018/11/01
'-p "quarterly from 2009-05-05 to 2009-06-01"' - starts on 2009/04/01,
ends on 2009/06/30, which are first and last days of Q2 2009 '-p "yearly
from 2009-12-29"' - starts on 2009/01/01, first day of 2009
----------------------------
The following more complex report intervals are also supported:
'biweekly', 'bimonthly', 'every N days|weeks|months|quarters|years',
'every Nth day [of month]', 'every Nth day of week'.
'biweekly', 'bimonthly', 'every day|week|month|quarter|year', 'every N
days|weeks|months|quarters|years'.
All of these will start on the first day of the requested period and
end on the last one, as described above.
Examples:
'-p "bimonthly from 2008"'
'-p "every 2 weeks"'
'-p "every 5 days from 1/3"'
'-p "bimonthly from 2008"' - periods will have boundaries on 2008/01/01, 2008/03/01, ...
'-p "every 2 weeks"' - starts on closest preceeding Monday
'-p "every 5 month from 2009/03"' - periods will have boundaries on 2009/03/01, 2009/08/01, ...
If you want intervals that start on an arbitrary day of your
choosing and span a week, month or year, use one of the following:
'every Nth day of week', 'every <weekday>', 'every Nth day [of
month]', 'every Nth weekday [of month]', 'every MM/DD [of year]', 'every
Nth MMM [of year]', 'every MMM Nth [of year]'.
Examples:
'-p "every 2nd day of week"' - periods will go from Tue to Tue
'-p "every Tue"' - same
'-p "every 15th day"' - period boundaries will be on 15th of each month
'-p "every 2nd Monday"' - period boundaries will be on second Monday of each month
'-p "every 11/05"' - yearly periods with boundaries on 5th of Nov
'-p "every 5th Nov"' - same
'-p "every Nov 5th"' - same
Show historical balances at end of 15th each month (N is exclusive
end date):
@ -507,9 +547,9 @@ start date and exclusive end date):
'hledger register checking -p "every 3rd day of week"'

File: hledger.1.info, Node: Depth limiting, Next: Pivoting, Prev: Period expressions, Up: OPTIONS
File: hledger.1.info, Node: Depth limiting, Next: Pivoting, Prev: For example, Up: OPTIONS
2.11 Depth limiting
2.12 Depth limiting
===================
With the '--depth N' option (short form: '-N'), commands like account,
@ -521,7 +561,7 @@ less detail. This flag has the same effect as a 'depth:' query argument

File: hledger.1.info, Node: Pivoting, Next: Cost, Prev: Depth limiting, Up: OPTIONS
2.12 Pivoting
2.13 Pivoting
=============
Normally hledger sums amounts, and organizes them in a hierarchy, based
@ -578,7 +618,7 @@ $ hledger balance --pivot member acct:.

File: hledger.1.info, Node: Cost, Next: Market value, Prev: Pivoting, Up: OPTIONS
2.13 Cost
2.14 Cost
=========
The '-B/--cost' flag converts amounts to their cost at transaction time,
@ -587,7 +627,7 @@ if they have a transaction price specified.

File: hledger.1.info, Node: Market value, Next: Regular expressions, Prev: Cost, Up: OPTIONS
2.14 Market value
2.15 Market value
=================
The '-V/--value' flag converts the reported amounts to their market
@ -636,7 +676,7 @@ directives, not transaction prices (unlike Ledger).

File: hledger.1.info, Node: Regular expressions, Prev: Market value, Up: OPTIONS
2.15 Regular expressions
2.16 Regular expressions
========================
hledger uses regular expressions in a number of places:
@ -2222,129 +2262,131 @@ Node: EXAMPLES1886
Ref: #examples1988
Node: OPTIONS3634
Ref: #options3738
Node: General options4038
Ref: #general-options4165
Node: Command options6484
Ref: #command-options6637
Node: Command arguments7035
Ref: #command-arguments7191
Node: Argument files7312
Ref: #argument-files7465
Node: Special characters7731
Ref: #special-characters7886
Node: Input files9305
Ref: #input-files9443
Node: Smart dates11406
Ref: #smart-dates11549
Node: Report start & end date12528
Ref: #report-start-end-date12700
Node: Report intervals13766
Ref: #report-intervals13931
Node: Period expressions14332
Ref: #period-expressions14494
Node: Depth limiting16834
Ref: #depth-limiting16980
Node: Pivoting17322
Ref: #pivoting17442
Node: Cost19118
Ref: #cost19228
Node: Market value19346
Ref: #market-value19483
Node: Regular expressions20783
Ref: #regular-expressions20921
Node: QUERIES22282
Ref: #queries22386
Node: COMMANDS26353
Ref: #commands26467
Node: accounts27450
Ref: #accounts27550
Node: activity28543
Ref: #activity28655
Node: add29014
Ref: #add29115
Node: balance31773
Ref: #balance31886
Node: Flat mode35043
Ref: #flat-mode35170
Node: Depth limited balance reports35590
Ref: #depth-limited-balance-reports35793
Node: Multicolumn balance reports36213
Ref: #multicolumn-balance-reports36424
Node: Custom balance output41072
Ref: #custom-balance-output41256
Node: Colour support43349
Ref: #colour-support43510
Node: Output destination43683
Ref: #output-destination43841
Node: CSV output44111
Ref: #csv-output44230
Node: balancesheet44627
Ref: #balancesheet44765
Node: balancesheetequity46733
Ref: #balancesheetequity46884
Node: cashflow47673
Ref: #cashflow47803
Node: check-dates49715
Ref: #check-dates49844
Node: check-dupes49961
Ref: #check-dupes50088
Node: equity50225
Ref: #equity50337
Node: help50500
Ref: #help50603
Node: import51677
Ref: #import51793
Node: incomestatement52523
Ref: #incomestatement52659
Node: prices54612
Ref: #prices54729
Node: print54772
Ref: #print54884
Node: print-unique59730
Ref: #print-unique59858
Node: register59926
Ref: #register60055
Node: Custom register output64556
Ref: #custom-register-output64687
Node: register-match65984
Ref: #register-match66120
Node: rewrite66303
Ref: #rewrite66422
Node: stats66491
Ref: #stats66596
Node: tags67477
Ref: #tags67577
Node: test67813
Ref: #test67899
Node: ADD-ON COMMANDS68267
Ref: #add-on-commands68379
Node: Official add-ons69666
Ref: #official-add-ons69808
Node: api69895
Ref: #api69986
Node: ui70038
Ref: #ui70139
Node: web70197
Ref: #web70288
Node: Third party add-ons70334
Ref: #third-party-add-ons70511
Node: diff70646
Ref: #diff70745
Node: iadd70844
Ref: #iadd70960
Node: interest71043
Ref: #interest71166
Node: irr71261
Ref: #irr71361
Node: Experimental add-ons71439
Ref: #experimental-add-ons71593
Node: autosync71884
Ref: #autosync71998
Node: budget72237
Ref: #budget72361
Node: chart72427
Ref: #chart72546
Node: check72617
Ref: #check72721
Node: General options4054
Ref: #general-options4181
Node: Command options6500
Ref: #command-options6653
Node: Command arguments7051
Ref: #command-arguments7207
Node: Argument files7328
Ref: #argument-files7481
Node: Special characters7747
Ref: #special-characters7902
Node: Input files9321
Ref: #input-files9459
Node: Smart dates11422
Ref: #smart-dates11565
Node: Report start & end date12544
Ref: #report-start-end-date12716
Node: Report intervals13782
Ref: #report-intervals13947
Node: Period expressions14348
Ref: #period-expressions14507
Node: For example16552
Ref: #for-example16697
Node: Depth limiting18621
Ref: #depth-limiting18760
Node: Pivoting19102
Ref: #pivoting19222
Node: Cost20898
Ref: #cost21008
Node: Market value21126
Ref: #market-value21263
Node: Regular expressions22563
Ref: #regular-expressions22701
Node: QUERIES24062
Ref: #queries24166
Node: COMMANDS28133
Ref: #commands28247
Node: accounts29230
Ref: #accounts29330
Node: activity30323
Ref: #activity30435
Node: add30794
Ref: #add30895
Node: balance33553
Ref: #balance33666
Node: Flat mode36823
Ref: #flat-mode36950
Node: Depth limited balance reports37370
Ref: #depth-limited-balance-reports37573
Node: Multicolumn balance reports37993
Ref: #multicolumn-balance-reports38204
Node: Custom balance output42852
Ref: #custom-balance-output43036
Node: Colour support45129
Ref: #colour-support45290
Node: Output destination45463
Ref: #output-destination45621
Node: CSV output45891
Ref: #csv-output46010
Node: balancesheet46407
Ref: #balancesheet46545
Node: balancesheetequity48513
Ref: #balancesheetequity48664
Node: cashflow49453
Ref: #cashflow49583
Node: check-dates51495
Ref: #check-dates51624
Node: check-dupes51741
Ref: #check-dupes51868
Node: equity52005
Ref: #equity52117
Node: help52280
Ref: #help52383
Node: import53457
Ref: #import53573
Node: incomestatement54303
Ref: #incomestatement54439
Node: prices56392
Ref: #prices56509
Node: print56552
Ref: #print56664
Node: print-unique61510
Ref: #print-unique61638
Node: register61706
Ref: #register61835
Node: Custom register output66336
Ref: #custom-register-output66467
Node: register-match67764
Ref: #register-match67900
Node: rewrite68083
Ref: #rewrite68202
Node: stats68271
Ref: #stats68376
Node: tags69257
Ref: #tags69357
Node: test69593
Ref: #test69679
Node: ADD-ON COMMANDS70047
Ref: #add-on-commands70159
Node: Official add-ons71446
Ref: #official-add-ons71588
Node: api71675
Ref: #api71766
Node: ui71818
Ref: #ui71919
Node: web71977
Ref: #web72068
Node: Third party add-ons72114
Ref: #third-party-add-ons72291
Node: diff72426
Ref: #diff72525
Node: iadd72624
Ref: #iadd72740
Node: interest72823
Ref: #interest72946
Node: irr73041
Ref: #irr73141
Node: Experimental add-ons73219
Ref: #experimental-add-ons73373
Node: autosync73664
Ref: #autosync73778
Node: budget74017
Ref: #budget74141
Node: chart74207
Ref: #chart74326
Node: check74397
Ref: #check74501

End Tag Table

View File

@ -286,6 +286,7 @@ OPTIONS
format automatically based on the file extension, or if that is not
recognised, by trying each built-in "reader" in turn:
Reader: Reads: Used for file extensions:
-----------------------------------------------------------------------------
journal hledger's journal format, also .journal .j .hledger
@ -323,14 +324,16 @@ OPTIONS
Examples:
2009/1/1, 2009/01/01, simple dates, several sep-
2009-1-1, 2009.1.1 arators allowed
2009/1, 2009 same as above - a missing
day or month defaults to 1
1/1, january, jan, relative dates, meaning
this year january 1 of the current
year
next year january 1 of next year
this month the 1st of the current
month
@ -355,6 +358,7 @@ OPTIONS
Examples:
-b 2016/3/17 begin on St. Patrick's
day 2016
-e 12/1 end at the start of decem-
@ -394,6 +398,7 @@ OPTIONS
long as you don't run two dates together. "to" can also be written as
"-". These are equivalent to the above:
-p "2009/1/1 2009/4/1"
-p2009/1/1to2009/4/1
-p2009/1/1-2009/4/1
@ -401,6 +406,7 @@ OPTIONS
Dates are smart dates, so if the current year is 2009, the above can
also be written as:
-p "1/1 4/1"
-p "january-apr"
-p "this year to 4/1"
@ -408,6 +414,7 @@ OPTIONS
If you specify only one date, the missing start or end date will be the
earliest or latest transaction in your journal:
-p "from 2009/1/1" everything after january
1, 2009
-p "from 2009/1" the same
@ -418,6 +425,7 @@ OPTIONS
A single date with no "from" or "to" defines both the start and end
date like so:
-p "2009" the year 2009; equivalent
to "2009/1/1 to 2010/1/1"
-p "2009/1" the month of jan; equiva-
@ -432,19 +440,65 @@ OPTIONS
-Y flags. Between report interval and start/end dates (if any), the
word in is optional. Examples:
-p "weekly from 2009/1/1 to 2009/4/1"
-p "monthly in 2008"
-p "quarterly"
Note that weekly, monthly, quarterly and yearly intervals will always
start on the first day of the week, month, quarter or year respectively,
and will end on the last day of that period, even if the period
expression specifies a different explicit start or end date.
For example:
-p "weekly from 2009/1/1 to 2009/4/1" -- starts on 2008/12/29, closest
preceeding Monday -p "monthly in 2008/11/25" -- starts on 2018/11/01
-p "quarterly from 2009-05-05 to 2009-06-01" - starts on 2009/04/01,
ends on 2009/06/30, which are first and last days of Q2 2009
-p "yearly from 2009-12-29" - starts on 2009/01/01, first day of 2009
------------------------------------------
The following more complex report intervals are also supported:
biweekly, bimonthly, every N days|weeks|months|quarters|years,
every Nth day [of month], every Nth day of week.
biweekly, bimonthly, every day|week|month|quarter|year,
every N days|weeks|months|quarters|years.
All of these will start on the first day of the requested period and
end on the last one, as described above.
Examples:
-p "bimonthly from 2008"
-p "every 2 weeks"
-p "every 5 days from 1/3"
-p "bimonthly from 2008" -- periods
will have boundaries on 2008/01/01,
2008/03/01, ...
-p "every 2 weeks" -- starts on closest
preceding Monday
-p "every 5 month from 2009/03" --
periods will have boundaries on
2009/03/01, 2009/08/01, ...
If you want intervals that start on an arbitrary day of your choosing
and span a week, month or year, use one of the following:
every Nth day of week, every <weekday>, every Nth day [of month],
every Nth weekday [of month], every MM/DD [of year],
every Nth MMM [of year], every MMM Nth [of year].
Examples:
-p "every 2nd day of week" -- periods
will go from Tue to Tue
-p "every Tue" -- same
-p "every 15th day" -- period bound-
aries will be on 15th of each month
-p "every 2nd Monday" -- period bound-
aries will be on second Monday of each
month
-p "every 11/05" -- yearly periods with
boundaries on 5th of Nov
-p "every 5th Nov" -- same
-p "every Nov 5th" -- same
Show historical balances at end of 15th each month (N is exclusive end
date):