doc: regenerate embedded manuals

[ci skip]
Simon Michael 2017-11-28 17:20:41 -08:00
parent c433873e04
commit 4228203740
18 changed files with 1005 additions and 624 deletions


@ -1,4 +1,4 @@
This is hledger-api.1.info, produced by makeinfo version 6.1 from stdin. This is hledger-api.1.info, produced by makeinfo version 6.0 from stdin.
 
File: hledger-api.1.info, Node: Top, Next: OPTIONS, Up: (dir) File: hledger-api.1.info, Node: Top, Next: OPTIONS, Up: (dir)


@ -8,19 +8,77 @@
CSV \- how hledger reads CSV data, and the CSV rules file format CSV \- how hledger reads CSV data, and the CSV rules file format
.SH DESCRIPTION .SH DESCRIPTION
.PP .PP
hledger can read CSV files, converting each CSV record into a journal hledger can read CSV (comma\-separated value) files as if they were
entry (transaction), if you provide some conversion hints in a "rules journal files, automatically converting each CSV record into a
file". transaction.
This file should be named like the CSV file with an additional (To learn about \f[I]writing\f[] CSV, see CSV output.)
\f[C]\&.rules\f[] suffix (eg: \f[C]mybank.csv.rules\f[]); or, you can
specify the file with \f[C]\-\-rules\-file\ PATH\f[].
hledger will create it if necessary, with some default rules which
you\[aq]ll need to adjust.
At minimum, the rules file must specify the \f[C]date\f[] and
\f[C]amount\f[] fields.
For an example, see Cookbook: convert CSV files.
.PP .PP
To learn about \f[I]exporting\f[] CSV, see CSV output. Converting CSV to transactions requires some special conversion rules.
These do several things:
.IP \[bu] 2
they describe the layout and format of the CSV data
.IP \[bu] 2
they can customize the generated journal entries using a simple
templating language
.IP \[bu] 2
they can add refinements based on patterns in the CSV data, eg
categorizing transactions with more detailed account names.
.PP
When reading a CSV file named \f[C]FILE.csv\f[], hledger looks for a
conversion rules file named \f[C]FILE.csv.rules\f[] in the same
directory.
You can override this with the \f[C]\-\-rules\-file\f[] option.
If the rules file does not exist, hledger will auto\-create one with
some example rules, which you\[aq]ll need to adjust.
.PP
At minimum, the rules file must identify the \f[C]date\f[] and
\f[C]amount\f[] fields.
It may also be necessary to specify the date format, and the number of
header lines to skip.
Eg:
.IP
.nf
\f[C]
fields\ date,\ _,\ _,\ amount
date\-format\ \ %d/%m/%Y
skip\ 1
\f[]
.fi
.PP
A more complete example:
.IP
.nf
\f[C]
#\ hledger\ CSV\ rules\ for\ amazon.com\ order\ history
#\ sample:
#\ "Date","Type","To/From","Name","Status","Amount","Fees","Transaction\ ID"
#\ "Jul\ 29,\ 2012","Payment","To","Adapteva,\ Inc.","Completed","$25.00","$0.00","17LA58JSK6PRD4HDGLNJQPI1PB9N8DKPVHL"
#\ skip\ one\ header\ line
skip\ 1
#\ name\ the\ csv\ fields\ (and\ assign\ the\ transaction\[aq]s\ date,\ amount\ and\ code)
fields\ date,\ _,\ toorfrom,\ name,\ amzstatus,\ amount,\ fees,\ code
#\ how\ to\ parse\ the\ date
date\-format\ %b\ %\-d,\ %Y
#\ combine\ two\ fields\ to\ make\ the\ description
description\ %toorfrom\ %name
#\ save\ these\ fields\ as\ tags
comment\ \ \ \ \ status:%amzstatus,\ fees:%fees
#\ set\ the\ base\ account\ for\ all\ transactions
account1\ \ \ \ assets:amazon
#\ flip\ the\ sign\ on\ the\ amount
amount\ \ \ \ \ \ \-%amount
\f[]
.fi
.PP
For more examples, see Convert CSV files.
.SH CSV RULES .SH CSV RULES
.PP .PP
The following seven kinds of rule can appear in the rules file, in any The following seven kinds of rule can appear in the rules file, in any


@ -1,4 +1,4 @@
This is hledger_csv.5.info, produced by makeinfo version 6.1 from stdin. This is hledger_csv.5.info, produced by makeinfo version 6.0 from stdin.
 
File: hledger_csv.5.info, Node: Top, Next: CSV RULES, Up: (dir) File: hledger_csv.5.info, Node: Top, Next: CSV RULES, Up: (dir)
@ -6,16 +6,63 @@ File: hledger_csv.5.info, Node: Top, Next: CSV RULES, Up: (dir)
hledger_csv(5) hledger 1.4 hledger_csv(5) hledger 1.4
************************** **************************
hledger can read CSV files, converting each CSV record into a journal hledger can read CSV (comma-separated value) files as if they were
entry (transaction), if you provide some conversion hints in a "rules journal files, automatically converting each CSV record into a
file". This file should be named like the CSV file with an additional transaction. (To learn about _writing_ CSV, see CSV output.)
'.rules' suffix (eg: 'mybank.csv.rules'); or, you can specify the file
with '--rules-file PATH'. hledger will create it if necessary, with
some default rules which you'll need to adjust. At minimum, the rules
file must specify the 'date' and 'amount' fields. For an example, see
Cookbook: convert CSV files.
To learn about _exporting_ CSV, see CSV output. Converting CSV to transactions requires some special conversion
rules. These do several things:
* they describe the layout and format of the CSV data
* they can customize the generated journal entries using a simple
templating language
* they can add refinements based on patterns in the CSV data, eg
categorizing transactions with more detailed account names.
When reading a CSV file named 'FILE.csv', hledger looks for a
conversion rules file named 'FILE.csv.rules' in the same directory. You
can override this with the '--rules-file' option. If the rules file
does not exist, hledger will auto-create one with some example rules,
which you'll need to adjust.
At minimum, the rules file must identify the 'date' and 'amount'
fields. It may also be necessary to specify the date format, and the
number of header lines to skip. Eg:
fields date, _, _, amount
date-format %d/%m/%Y
skip 1
A more complete example:
# hledger CSV rules for amazon.com order history
# sample:
# "Date","Type","To/From","Name","Status","Amount","Fees","Transaction ID"
# "Jul 29, 2012","Payment","To","Adapteva, Inc.","Completed","$25.00","$0.00","17LA58JSK6PRD4HDGLNJQPI1PB9N8DKPVHL"
# skip one header line
skip 1
# name the csv fields (and assign the transaction's date, amount and code)
fields date, _, toorfrom, name, amzstatus, amount, fees, code
# how to parse the date
date-format %b %-d, %Y
# combine two fields to make the description
description %toorfrom %name
# save these fields as tags
comment status:%amzstatus, fees:%fees
# set the base account for all transactions
account1 assets:amazon
# flip the sign on the amount
amount -%amount
For more examples, see Convert CSV files.
* Menu: * Menu:
* CSV RULES:: * CSV RULES::
@ -270,33 +317,33 @@ one rules file will be used for all the CSV files being read.
 
Tag Table: Tag Table:
Node: Top74 Node: Top74
Node: CSV RULES810 Node: CSV RULES2165
Ref: #csv-rules920 Ref: #csv-rules2275
Node: skip1182 Node: skip2537
Ref: #skip1278 Ref: #skip2633
Node: date-format1450 Node: date-format2805
Ref: #date-format1579 Ref: #date-format2934
Node: field list2085 Node: field list3440
Ref: #field-list2224 Ref: #field-list3579
Node: field assignment2929 Node: field assignment4284
Ref: #field-assignment3086 Ref: #field-assignment4441
Node: conditional block3590 Node: conditional block4945
Ref: #conditional-block3746 Ref: #conditional-block5101
Node: include4642 Node: include5997
Ref: #include4774 Ref: #include6129
Node: newest-first5005 Node: newest-first6360
Ref: #newest-first5121 Ref: #newest-first6476
Node: CSV TIPS5532 Node: CSV TIPS6887
Ref: #csv-tips5628 Ref: #csv-tips6983
Node: CSV ordering5746 Node: CSV ordering7101
Ref: #csv-ordering5866 Ref: #csv-ordering7221
Node: CSV accounts6047 Node: CSV accounts7402
Ref: #csv-accounts6187 Ref: #csv-accounts7542
Node: CSV amounts6441 Node: CSV amounts7796
Ref: #csv-amounts6589 Ref: #csv-amounts7944
Node: CSV balance assertions7364 Node: CSV balance assertions8719
Ref: #csv-balance-assertions7548 Ref: #csv-balance-assertions8903
Node: Reading multiple CSV files7753 Node: Reading multiple CSV files9108
Ref: #reading-multiple-csv-files7925 Ref: #reading-multiple-csv-files9280
 
End Tag Table End Tag Table


@ -7,16 +7,65 @@ NAME
CSV - how hledger reads CSV data, and the CSV rules file format CSV - how hledger reads CSV data, and the CSV rules file format
DESCRIPTION DESCRIPTION
hledger can read CSV files, converting each CSV record into a journal hledger can read CSV (comma-separated value) files as if they were
entry (transaction), if you provide some conversion hints in a "rules journal files, automatically converting each CSV record into a transac-
file". This file should be named like the CSV file with an additional tion. (To learn about writing CSV, see CSV output.)
.rules suffix (eg: mybank.csv.rules); or, you can specify the file with
--rules-file PATH. hledger will create it if necessary, with some
default rules which you'll need to adjust. At minimum, the rules file
must specify the date and amount fields. For an example, see Cookbook:
convert CSV files.
To learn about exporting CSV, see CSV output. Converting CSV to transactions requires some special conversion rules.
These do several things:
o they describe the layout and format of the CSV data
o they can customize the generated journal entries using a simple tem-
plating language
o they can add refinements based on patterns in the CSV data, eg cate-
gorizing transactions with more detailed account names.
When reading a CSV file named FILE.csv, hledger looks for a conversion
rules file named FILE.csv.rules in the same directory. You can over-
ride this with the --rules-file option. If the rules file does not
exist, hledger will auto-create one with some example rules, which
you'll need to adjust.
At minimum, the rules file must identify the date and amount fields.
It may also be necessary to specify the date format, and the number of
header lines to skip. Eg:
fields date, _, _, amount
date-format %d/%m/%Y
skip 1
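For instance, rules like the above could read a hypothetical bank
export with this (assumed) column layout:
Date,Type,Description,Amount
27/11/2017,card,SUPERMARKET,-15.50
The header line is skipped, dates are parsed with the %d/%m/%Y format,
and the fourth column becomes the transaction amount.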
A more complete example:
# hledger CSV rules for amazon.com order history
# sample:
# "Date","Type","To/From","Name","Status","Amount","Fees","Transaction ID"
# "Jul 29, 2012","Payment","To","Adapteva, Inc.","Completed","$25.00","$0.00","17LA58JSK6PRD4HDGLNJQPI1PB9N8DKPVHL"
# skip one header line
skip 1
# name the csv fields (and assign the transaction's date, amount and code)
fields date, _, toorfrom, name, amzstatus, amount, fees, code
# how to parse the date
date-format %b %-d, %Y
# combine two fields to make the description
description %toorfrom %name
# save these fields as tags
comment status:%amzstatus, fees:%fees
# set the base account for all transactions
account1 assets:amazon
# flip the sign on the amount
amount -%amount
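To give a rough idea of the result: assuming one more rule such as
account2 expenses:amazon were added, hledger would convert the sample
record above into a journal entry roughly like this (a sketch, not
verbatim hledger output):
2012/07/29 (17LA58JSK6PRD4HDGLNJQPI1PB9N8DKPVHL) To Adapteva, Inc.  ; status:Completed, fees:$0.00
    assets:amazon           $-25.00
    expenses:amazon          $25.00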
For more examples, see Convert CSV files.
CSV RULES CSV RULES
The following seven kinds of rule can appear in the rules file, in any The following seven kinds of rule can appear in the rules file, in any


@ -384,7 +384,26 @@ digit groups (thousands, or any other grouping) can be separated by
commas (in which case period is used for decimal point) or periods (in commas (in which case period is used for decimal point) or periods (in
which case comma is used for decimal point) which case comma is used for decimal point)
.PP .PP
You can use any of these variations when recording data, but when You can use any of these variations when recording data.
However, amounts like \f[C]$1.000\f[] and \f[C]$1,000\f[] are
ambiguous: each could mean either one thousand dollars or one dollar.
By default hledger treats a single period or comma as a decimal point.
A commodity format declared earlier in the file, for example with a
\f[C]commodity\f[] directive, tells hledger which characters are the
digit group and decimal marks, resolving the ambiguity:
.IP
.nf
\f[C]
commodity\ $1,000.00
2017/12/25\ New\ life\ of\ Scrooge
\ \ \ \ expenses:gifts\ \ $1,000
\ \ \ \ assets
\f[]
.fi
.PP
Although a journal may mix these notations, when
hledger displays amounts, it will choose a consistent format for each hledger displays amounts, it will choose a consistent format for each
commodity. commodity.
(Except for price amounts, which are always formatted as written). (Except for price amounts, which are always formatted as written).
@ -716,9 +735,9 @@ P\ 2010/1/1\ €\ $1.40
.SS Comments .SS Comments
.PP .PP
Lines in the journal beginning with a semicolon (\f[C];\f[]) or hash Lines in the journal beginning with a semicolon (\f[C];\f[]) or hash
(\f[C]#\f[]) or asterisk (\f[C]*\f[]) are comments, and will be ignored. (\f[C]#\f[]) or star (\f[C]*\f[]) are comments, and will be ignored.
(Asterisk comments make it easy to treat your journal like an org\-mode (Star comments cause org\-mode nodes to be ignored, allowing emacs users
outline in emacs.) to fold and navigate their journals with org\-mode or orgstruct\-mode.)
.PP .PP
Also, anything between \f[C]comment\f[] and \f[C]end\ comment\f[] Also, anything between \f[C]comment\f[] and \f[C]end\ comment\f[]
directives is a (multi\-line) comment. directives is a (multi\-line) comment.
@ -730,20 +749,22 @@ description and/or indented on the following lines (before the
postings). postings).
Similarly, you can attach comments to an individual posting by writing Similarly, you can attach comments to an individual posting by writing
them after the amount and/or indented on the following lines. them after the amount and/or indented on the following lines.
Transaction and posting comments must begin with a semicolon
(\f[C];\f[]).
.PP .PP
Some examples: Some examples:
.IP .IP
.nf .nf
\f[C] \f[C]
#\ a\ journal\ comment #\ a\ file\ comment
;\ also\ a\ journal\ comment ;\ also\ a\ file\ comment
comment comment
This\ is\ a\ multiline\ comment, This\ is\ a\ multiline\ file\ comment,
which\ continues\ until\ a\ line which\ continues\ until\ a\ line
where\ the\ "end\ comment"\ string where\ the\ "end\ comment"\ string
appears\ on\ its\ own. appears\ on\ its\ own\ (or\ end\ of\ file).
end\ comment end\ comment
2012/5/14\ something\ \ ;\ a\ transaction\ comment 2012/5/14\ something\ \ ;\ a\ transaction\ comment
@ -752,7 +773,7 @@ end\ comment
\ \ \ \ posting2 \ \ \ \ posting2
\ \ \ \ ;\ a\ comment\ for\ posting\ 2 \ \ \ \ ;\ a\ comment\ for\ posting\ 2
\ \ \ \ ;\ another\ comment\ line\ for\ posting\ 2 \ \ \ \ ;\ another\ comment\ line\ for\ posting\ 2
;\ a\ journal\ comment\ (because\ not\ indented) ;\ a\ file\ comment\ (because\ not\ indented)
\f[] \f[]
.fi .fi
.SS Tags .SS Tags
@ -1038,7 +1059,7 @@ commodity\-less amounts, or until the next D directive.
D\ $1,000.00 D\ $1,000.00
1/1 1/1
\ \ a\ \ \ \ \ 5\ \ \ \ #\ <\-\ commodity\-less\ amount,\ becomes\ $1 \ \ a\ \ \ \ \ 5\ \ \ \ ;\ <\-\ commodity\-less\ amount,\ becomes\ $1
\ \ b \ \ b
\f[] \f[]
.fi .fi


@ -1,4 +1,4 @@
This is hledger_journal.5.info, produced by makeinfo version 6.1 from This is hledger_journal.5.info, produced by makeinfo version 6.0 from
stdin. stdin.
 
@ -361,7 +361,20 @@ commodity name. Some examples:
commas (in which case period is used for decimal point) or periods commas (in which case period is used for decimal point) or periods
(in which case comma is used for decimal point) (in which case comma is used for decimal point)
You can use any of these variations when recording data, but when You can use any of these variations when recording data. However,
amounts like '$1.000' and '$1,000' are ambiguous: each could mean
either one thousand dollars or one dollar. By default hledger treats a
single period or comma as a decimal point. A commodity format declared
earlier in the file, for example with a 'commodity' directive, tells
hledger which characters are the digit group and decimal marks,
resolving the ambiguity:
commodity $1,000.00
2017/12/25 New life of Scrooge
expenses:gifts $1,000
assets
Although a journal may mix these notations, when
hledger displays amounts, it will choose a consistent format for each hledger displays amounts, it will choose a consistent format for each
commodity. (Except for price amounts, which are always formatted as commodity. (Except for price amounts, which are always formatted as
written). The display format is chosen as follows: written). The display format is chosen as follows:
@ -684,8 +697,9 @@ File: hledger_journal.5.info, Node: Comments, Next: Tags, Prev: Prices, Up:
============= =============
Lines in the journal beginning with a semicolon (';') or hash ('#') or Lines in the journal beginning with a semicolon (';') or hash ('#') or
asterisk ('*') are comments, and will be ignored. (Asterisk comments star ('*') are comments, and will be ignored. (Star comments cause
make it easy to treat your journal like an org-mode outline in emacs.) org-mode nodes to be ignored, allowing emacs users to fold and navigate
their journals with org-mode or orgstruct-mode.)
Also, anything between 'comment' and 'end comment' directives is a Also, anything between 'comment' and 'end comment' directives is a
(multi-line) comment. If there is no 'end comment', the comment extends (multi-line) comment. If there is no 'end comment', the comment extends
@ -695,18 +709,19 @@ to the end of the file.
description and/or indented on the following lines (before the description and/or indented on the following lines (before the
postings). Similarly, you can attach comments to an individual posting postings). Similarly, you can attach comments to an individual posting
by writing them after the amount and/or indented on the following lines. by writing them after the amount and/or indented on the following lines.
Transaction and posting comments must begin with a semicolon (';').
Some examples: Some examples:
# a journal comment # a file comment
; also a journal comment ; also a file comment
comment comment
This is a multiline comment, This is a multiline file comment,
which continues until a line which continues until a line
where the "end comment" string where the "end comment" string
appears on its own. appears on its own (or end of file).
end comment end comment
2012/5/14 something ; a transaction comment 2012/5/14 something ; a transaction comment
@ -715,7 +730,7 @@ end comment
posting2 posting2
; a comment for posting 2 ; a comment for posting 2
; another comment line for posting 2 ; another comment line for posting 2
; a journal comment (because not indented) ; a file comment (because not indented)
 
File: hledger_journal.5.info, Node: Tags, Next: Directives, Prev: Comments, Up: FILE FORMAT File: hledger_journal.5.info, Node: Tags, Next: Directives, Prev: Comments, Up: FILE FORMAT
@ -992,7 +1007,7 @@ amounts, or until the next D directive.
D $1,000.00 D $1,000.00
1/1 1/1
a 5 # <- commodity-less amount, becomes $1 a 5 ; <- commodity-less amount, becomes $1
b b
 
@ -1087,61 +1102,61 @@ Node: Account names11207
Ref: #account-names11352 Ref: #account-names11352
Node: Amounts11839 Node: Amounts11839
Ref: #amounts11977 Ref: #amounts11977
Node: Virtual Postings14078 Node: Virtual Postings14568
Ref: #virtual-postings14239 Ref: #virtual-postings14729
Node: Balance Assertions15459 Node: Balance Assertions15949
Ref: #balance-assertions15636 Ref: #balance-assertions16126
Node: Assertions and ordering16532 Node: Assertions and ordering17022
Ref: #assertions-and-ordering16720 Ref: #assertions-and-ordering17210
Node: Assertions and included files17420 Node: Assertions and included files17910
Ref: #assertions-and-included-files17663 Ref: #assertions-and-included-files18153
Node: Assertions and multiple -f options17996 Node: Assertions and multiple -f options18486
Ref: #assertions-and-multiple--f-options18252 Ref: #assertions-and-multiple--f-options18742
Node: Assertions and commodities18384 Node: Assertions and commodities18874
Ref: #assertions-and-commodities18621 Ref: #assertions-and-commodities19111
Node: Assertions and subaccounts19317 Node: Assertions and subaccounts19807
Ref: #assertions-and-subaccounts19551 Ref: #assertions-and-subaccounts20041
Node: Assertions and virtual postings20072 Node: Assertions and virtual postings20562
Ref: #assertions-and-virtual-postings20281 Ref: #assertions-and-virtual-postings20771
Node: Balance Assignments20423 Node: Balance Assignments20913
Ref: #balance-assignments20594 Ref: #balance-assignments21084
Node: Prices21713 Node: Prices22203
Ref: #prices21848 Ref: #prices22338
Node: Transaction prices21899 Node: Transaction prices22389
Ref: #transaction-prices22046 Ref: #transaction-prices22536
Node: Market prices24202 Node: Market prices24692
Ref: #market-prices24339 Ref: #market-prices24829
Node: Comments25299 Node: Comments25789
Ref: #comments25423 Ref: #comments25913
Node: Tags26536 Node: Tags27155
Ref: #tags26656 Ref: #tags27275
Node: Directives28058 Node: Directives28677
Ref: #directives28173 Ref: #directives28792
Node: Account aliases28366 Node: Account aliases28985
Ref: #account-aliases28512 Ref: #account-aliases29131
Node: Basic aliases29116 Node: Basic aliases29735
Ref: #basic-aliases29261 Ref: #basic-aliases29880
Node: Regex aliases29951 Node: Regex aliases30570
Ref: #regex-aliases30121 Ref: #regex-aliases30740
Node: Multiple aliases30839 Node: Multiple aliases31458
Ref: #multiple-aliases31013 Ref: #multiple-aliases31632
Node: end aliases31511 Node: end aliases32130
Ref: #end-aliases31653 Ref: #end-aliases32272
Node: account directive31754 Node: account directive32373
Ref: #account-directive31936 Ref: #account-directive32555
Node: apply account directive32232 Node: apply account directive32851
Ref: #apply-account-directive32430 Ref: #apply-account-directive33049
Node: Multi-line comments33089 Node: Multi-line comments33708
Ref: #multi-line-comments33281 Ref: #multi-line-comments33900
Node: commodity directive33409 Node: commodity directive34028
Ref: #commodity-directive33595 Ref: #commodity-directive34214
Node: Default commodity34467 Node: Default commodity35086
Ref: #default-commodity34642 Ref: #default-commodity35261
Node: Default year35179 Node: Default year35798
Ref: #default-year35346 Ref: #default-year35965
Node: Including other files35769 Node: Including other files36388
Ref: #including-other-files35928 Ref: #including-other-files36547
Node: EDITOR SUPPORT36325 Node: EDITOR SUPPORT36944
Ref: #editor-support36445 Ref: #editor-support37064
 
End Tag Table End Tag Table


@ -181,6 +181,7 @@ FILE FORMAT
description or posting account name, separated from it by a space, description or posting account name, separated from it by a space,
indicating one of three statuses: indicating one of three statuses:
mark status mark status
------------------ ------------------
unmarked unmarked
@ -206,6 +207,7 @@ FILE FORMAT
What "uncleared", "pending", and "cleared" actually mean is up to you. What "uncleared", "pending", and "cleared" actually mean is up to you.
Here's one suggestion: Here's one suggestion:
status meaning status meaning
-------------------------------------------------------------------------- --------------------------------------------------------------------------
uncleared recorded but not yet reconciled; needs review uncleared recorded but not yet reconciled; needs review
@ -276,7 +278,20 @@ FILE FORMAT
commas (in which case period is used for decimal point) or periods commas (in which case period is used for decimal point) or periods
(in which case comma is used for decimal point) (in which case comma is used for decimal point)
You can use any of these variations when recording data, but when You can use any of these variations when recording data. However,
amounts like $1.000 and $1,000 are ambiguous: each could mean either
one thousand dollars or one dollar. By default hledger treats a single
period or comma as a decimal point. A commodity format declared
earlier in the file, for example with a commodity directive, tells
hledger which characters are the digit group and decimal marks,
resolving the ambiguity:
commodity $1,000.00
2017/12/25 New life of Scrooge
expenses:gifts $1,000
assets
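With that declaration in effect, the $1,000 above is read as one
thousand dollars, and reports show it using the declared style. A
sketch of the expected output (spacing approximate):
$ hledger balance
          $-1,000.00  assets
           $1,000.00  expenses:gifts
--------------------
                   0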
Although a journal may mix these notations, when
hledger displays amounts, it will choose a consistent format for each hledger displays amounts, it will choose a consistent format for each
commodity. (Except for price amounts, which are always formatted as commodity. (Except for price amounts, which are always formatted as
written). The display format is chosen as follows: written). The display format is chosen as follows:
@ -521,9 +536,10 @@ FILE FORMAT
P 2010/1/1 $1.40 P 2010/1/1 $1.40
Comments Comments
Lines in the journal beginning with a semicolon (;) or hash (#) or Lines in the journal beginning with a semicolon (;) or hash (#) or star
asterisk (*) are comments, and will be ignored. (Asterisk comments (*) are comments, and will be ignored. (Star comments cause org-mode
make it easy to treat your journal like an org-mode outline in emacs.) nodes to be ignored, allowing emacs users to fold and navigate their
journals with org-mode or orgstruct-mode.)
Also, anything between comment and end comment directives is a Also, anything between comment and end comment directives is a
(multi-line) comment. If there is no end comment, the comment extends (multi-line) comment. If there is no end comment, the comment extends
@ -533,18 +549,19 @@ FILE FORMAT
description and/or indented on the following lines (before the post- description and/or indented on the following lines (before the post-
ings). Similarly, you can attach comments to an individual posting by ings). Similarly, you can attach comments to an individual posting by
writing them after the amount and/or indented on the following lines. writing them after the amount and/or indented on the following lines.
Transaction and posting comments must begin with a semicolon (;).
Some examples: Some examples:
# a journal comment # a file comment
; also a journal comment ; also a file comment
comment comment
This is a multiline comment, This is a multiline file comment,
which continues until a line which continues until a line
where the "end comment" string where the "end comment" string
appears on its own. appears on its own (or end of file).
end comment end comment
2012/5/14 something ; a transaction comment 2012/5/14 something ; a transaction comment
@ -553,7 +570,7 @@ FILE FORMAT
posting2 posting2
; a comment for posting 2 ; a comment for posting 2
; another comment line for posting 2 ; another comment line for posting 2
; a journal comment (because not indented) ; a file comment (because not indented)
Tags Tags
Tags are a way to add extra labels or labelled data to postings and Tags are a way to add extra labels or labelled data to postings and
@ -758,7 +775,7 @@ FILE FORMAT
D $1,000.00 D $1,000.00
1/1 1/1
a 5 # <- commodity-less amount, becomes $1 a 5 ; <- commodity-less amount, becomes $1
b b
Default year Default year
@ -803,6 +820,7 @@ EDITOR SUPPORT
These were written with Ledger in mind, but also work with hledger These were written with Ledger in mind, but also work with hledger
files: files:
Emacs http://www.ledger-cli.org/3.0/doc/ledger-mode.html Emacs http://www.ledger-cli.org/3.0/doc/ledger-mode.html
Vim https://github.com/ledger/ledger/wiki/Get- Vim https://github.com/ledger/ledger/wiki/Get-
ting-started ting-started


@ -1,4 +1,4 @@
This is hledger_timeclock.5.info, produced by makeinfo version 6.1 from This is hledger_timeclock.5.info, produced by makeinfo version 6.0 from
stdin. stdin.
 


@ -1,4 +1,4 @@
This is hledger_timedot.5.info, produced by makeinfo version 6.1 from This is hledger_timedot.5.info, produced by makeinfo version 6.0 from
stdin. stdin.
 


@ -272,6 +272,11 @@ troubleshooting.
updated file. updated file.
This allows some basic data entry. This allows some basic data entry.
.PP .PP
\f[C]A\f[] is like \f[C]a\f[], but runs the hledger\-iadd tool, which
provides a curses\-style interface.
This key will be available if \f[C]hledger\-iadd\f[] is installed in
$PATH.
.PP
\f[C]E\f[] runs $HLEDGER_UI_EDITOR, or $EDITOR, or a default \f[C]E\f[] runs $HLEDGER_UI_EDITOR, or $EDITOR, or a default
(\f[C]emacsclient\ \-a\ ""\ \-nw\f[]) on the journal file. (\f[C]emacsclient\ \-a\ ""\ \-nw\f[]) on the journal file.
With some editors (emacs, vi), the cursor will be positioned at the With some editors (emacs, vi), the cursor will be positioned at the


@ -1,4 +1,4 @@
This is hledger-ui.1.info, produced by makeinfo version 6.1 from stdin. This is hledger-ui.1.info, produced by makeinfo version 6.0 from stdin.
 
File: hledger-ui.1.info, Node: Top, Next: OPTIONS, Up: (dir) File: hledger-ui.1.info, Node: Top, Next: OPTIONS, Up: (dir)
@ -207,6 +207,10 @@ temporarily can be useful for troubleshooting.
'a' runs command-line hledger's add command, and reloads the updated 'a' runs command-line hledger's add command, and reloads the updated
file. This allows some basic data entry. file. This allows some basic data entry.
'A' is like 'a', but runs the hledger-iadd tool, which provides a
curses-style interface. This key will be available if 'hledger-iadd' is
installed in $PATH.
'E' runs $HLEDGER_UI_EDITOR, or $EDITOR, or a default ('emacsclient 'E' runs $HLEDGER_UI_EDITOR, or $EDITOR, or a default ('emacsclient
-a "" -nw') on the journal file. With some editors (emacs, vi), the -a "" -nw') on the journal file. With some editors (emacs, vi), the
cursor will be positioned at the current transaction when invoked from cursor will be positioned at the current transaction when invoked from
@ -369,15 +373,15 @@ Node: OPTIONS825
Ref: #options924 Ref: #options924
Node: KEYS3861 Node: KEYS3861
Ref: #keys3958 Ref: #keys3958
Node: SCREENS6754 Node: SCREENS6917
Ref: #screens6841 Ref: #screens7004
Node: Accounts screen6931 Node: Accounts screen7094
Ref: #accounts-screen7061 Ref: #accounts-screen7224
Node: Register screen9291 Node: Register screen9454
Ref: #register-screen9448 Ref: #register-screen9611
Node: Transaction screen11522 Node: Transaction screen11685
Ref: #transaction-screen11682 Ref: #transaction-screen11845
Node: Error screen12552 Node: Error screen12715
Ref: #error-screen12676 Ref: #error-screen12839
 
End Tag Table End Tag Table


@ -195,6 +195,10 @@ KEYS
a runs command-line hledger's add command, and reloads the updated a runs command-line hledger's add command, and reloads the updated
file. This allows some basic data entry. file. This allows some basic data entry.
A is like a, but runs the hledger-iadd tool, which provides a
curses-style interface. This key will be available if hledger-iadd is
installed in $PATH.
E runs $HLEDGER_UI_EDITOR, or $EDITOR, or a default (emac- E runs $HLEDGER_UI_EDITOR, or $EDITOR, or a default (emac-
sclient -a "" -nw) on the journal file. With some editors (emacs, vi), sclient -a "" -nw) on the journal file. With some editors (emacs, vi),
the cursor will be positioned at the current transaction when invoked the cursor will be positioned at the current transaction when invoked


@ -1,4 +1,4 @@
This is hledger-web.1.info, produced by makeinfo version 6.1 from stdin. This is hledger-web.1.info, produced by makeinfo version 6.0 from stdin.
 
File: hledger-web.1.info, Node: Top, Next: OPTIONS, Up: (dir) File: hledger-web.1.info, Node: Top, Next: OPTIONS, Up: (dir)


@ -721,11 +721,32 @@ T{
T} T}
.TE .TE
.PP .PP
Note that \f[C]weekly\f[], \f[C]monthly\f[], \f[C]quarterly\f[] and
\f[C]yearly\f[] intervals always start on the first day of the week,
month, quarter or year, and end on the last day of that period, even if
the associated period expression specifies different explicit start and
end dates.
.SS For example:
.PP
\f[C]\-p\ "weekly\ from\ 2009/1/1\ to\ 2009/4/1"\f[] \-\- starts on
2008/12/29, the closest preceding Monday
.PD 0
.P
.PD
\f[C]\-p\ "monthly\ in\ 2008/11/25"\f[] \-\- starts on 2008/11/01
.PD 0
.P
.PD
\f[C]\-p\ "quarterly\ from\ 2009\-05\-05\ to\ 2009\-06\-01"\f[] \-\-
starts on 2009/04/01 and ends on 2009/06/30, the first and last days of
Q2 2009
.PD 0
.P
.PD
\f[C]\-p\ "yearly\ from\ 2009\-12\-29"\f[] \-\- starts on 2009/01/01,
the first day of 2009
.PP
The following more complex report intervals are also supported: The following more complex report intervals are also supported:
\f[C]biweekly\f[], \f[C]bimonthly\f[], \f[C]biweekly\f[], \f[C]bimonthly\f[],
\f[C]every\ N\ days|weeks|months|quarters|years\f[], \f[C]every\ day|week|month|quarter|year\f[],
\f[C]every\ Nth\ day\ [of\ month]\f[], \f[C]every\ N\ days|weeks|months|quarters|years\f[].
\f[C]every\ Nth\ day\ of\ week\f[]. .PP
All of these will start on the first day of the requested period and end
on the last one, as described above.
.PP .PP
Examples: Examples:
.PP .PP
@ -733,13 +754,56 @@ Examples:
tab(@); tab(@);
l. l.
T{ T{
\f[C]\-p\ "bimonthly\ from\ 2008"\f[] \f[C]\-p\ "bimonthly\ from\ 2008"\f[] \-\- periods will have boundaries
on 2008/01/01, 2008/03/01, ...
T} T}
T{ T{
\f[C]\-p\ "every\ 2\ weeks"\f[] \f[C]\-p\ "every\ 2\ weeks"\f[] \-\- starts on closest preceeding Monday
T} T}
T{ T{
\f[C]\-p\ "every\ 5\ days\ from\ 1/3"\f[] \f[C]\-p\ "every\ 5\ month\ from\ 2009/03"\f[] \-\- periods will have
boundaries on 2009/03/01, 2009/08/01, ...
T}
.TE
.PP
If you want intervals that start on an arbitrary day of your choosing
and span a week, month or year, use one of the following:
.PP
\f[C]every\ Nth\ day\ of\ week\f[], \f[C]every\ <weekday>\f[],
\f[C]every\ Nth\ day\ [of\ month]\f[],
\f[C]every\ Nth\ weekday\ [of\ month]\f[],
\f[C]every\ MM/DD\ [of\ year]\f[], \f[C]every\ Nth\ MMM\ [of\ year]\f[],
\f[C]every\ MMM\ Nth\ [of\ year]\f[].
.PP
Examples:
.PP
.TS
tab(@);
l.
T{
\f[C]\-p\ "every\ 2nd\ day\ of\ week"\f[] \-\- periods will go from Tue
to Tue
T}
T{
\f[C]\-p\ "every\ Tue"\f[] \-\- same
T}
T{
\f[C]\-p\ "every\ 15th\ day"\f[] \-\- period boundaries will be on 15th
of each month
T}
T{
\f[C]\-p\ "every\ 2nd\ Monday"\f[] \-\- period boundaries will be on
second Monday of each month
T}
T{
\f[C]\-p\ "every\ 11/05"\f[] \-\- yearly periods with boundaries on 5th
of Nov
T}
T{
\f[C]\-p\ "every\ 5th\ Nov"\f[] \-\- same
T}
T{
\f[C]\-p\ "every\ Nov\ 5th"\f[] \-\- same
T} T}
.TE .TE
.PP .PP


@ -1,4 +1,4 @@
This is hledger.1.info, produced by makeinfo version 6.1 from stdin. This is hledger.1.info, produced by makeinfo version 6.0 from stdin.
 
File: hledger.1.info, Node: Top, Next: EXAMPLES, Up: (dir) File: hledger.1.info, Node: Top, Next: EXAMPLES, Up: (dir)
@ -125,6 +125,7 @@ File: hledger.1.info, Node: OPTIONS, Next: QUERIES, Prev: EXAMPLES, Up: Top
* Report start & end date:: * Report start & end date::
* Report intervals:: * Report intervals::
* Period expressions:: * Period expressions::
* For example::
* Depth limiting:: * Depth limiting::
* Pivoting:: * Pivoting::
* Cost:: * Cost::
@ -432,7 +433,7 @@ complex intervals may be specified with a period expression. Report
intervals can not be specified with a query, currently. intervals can not be specified with a query, currently.
 
File: hledger.1.info, Node: Period expressions, Next: Depth limiting, Prev: Report intervals, Up: OPTIONS File: hledger.1.info, Node: Period expressions, Next: For example, Prev: Report intervals, Up: OPTIONS
2.10 Period expressions 2.10 Period expressions
======================= =======================
@ -486,15 +487,54 @@ start/end dates (if any), the word 'in' is optional. Examples:
'-p "monthly in 2008"' '-p "monthly in 2008"'
'-p "quarterly"' '-p "quarterly"'
Note that 'weekly', 'monthly', 'quarterly' and 'yearly' intervals
always start on the first day of the week, month, quarter or year, and
end on the last day of that period, even if the associated period
expression specifies different explicit start and end dates.

File: hledger.1.info, Node: For example, Next: Depth limiting, Prev: Period expressions, Up: OPTIONS
2.11 For example:
=================
'-p "weekly from 2009/1/1 to 2009/4/1"' - starts on 2008/12/29, closest
preceeding Monday '-p "monthly in 2008/11/25"' - starts on 2018/11/01
'-p "quarterly from 2009-05-05 to 2009-06-01"' - starts on 2009/04/01,
ends on 2009/06/30, which are first and last days of Q2 2009 '-p "yearly
from 2009-12-29"' - starts on 2009/01/01, first day of 2009
----------------------------
The following more complex report intervals are also supported: The following more complex report intervals are also supported:
'biweekly', 'bimonthly', 'every N days|weeks|months|quarters|years', 'biweekly', 'bimonthly', 'every day|week|month|quarter|year', 'every N
'every Nth day [of month]', 'every Nth day of week'. days|weeks|months|quarters|years'.
All of these will start on the first day of the requested period and
end on the last one, as described above.
Examples: Examples:
'-p "bimonthly from 2008"' '-p "bimonthly from 2008"' - periods will have boundaries on 2008/01/01, 2008/03/01, ...
'-p "every 2 weeks"' '-p "every 2 weeks"' - starts on closest preceeding Monday
'-p "every 5 days from 1/3"' '-p "every 5 month from 2009/03"' - periods will have boundaries on 2009/03/01, 2009/08/01, ...
If you want intervals that start on an arbitrary day of your choosing
and span a week, month or year, use one of the following:
'every Nth day of week', 'every <weekday>', 'every Nth day [of
month]', 'every Nth weekday [of month]', 'every MM/DD [of year]', 'every
Nth MMM [of year]', 'every MMM Nth [of year]'.
Examples:
'-p "every 2nd day of week"' - periods will go from Tue to Tue
'-p "every Tue"' - same
'-p "every 15th day"' - period boundaries will be on 15th of each month
'-p "every 2nd Monday"' - period boundaries will be on second Monday of each month
'-p "every 11/05"' - yearly periods with boundaries on 5th of Nov
'-p "every 5th Nov"' - same
'-p "every Nov 5th"' - same
Show historical balances at end of 15th each month (N is exclusive Show historical balances at end of 15th each month (N is exclusive
end date): end date):
@ -507,9 +547,9 @@ start date and exclusive end date):
'hledger register checking -p "every 3rd day of week"' 'hledger register checking -p "every 3rd day of week"'
 
File: hledger.1.info, Node: Depth limiting, Next: Pivoting, Prev: Period expressions, Up: OPTIONS File: hledger.1.info, Node: Depth limiting, Next: Pivoting, Prev: For example, Up: OPTIONS
2.11 Depth limiting 2.12 Depth limiting
=================== ===================
With the '--depth N' option (short form: '-N'), commands like account, With the '--depth N' option (short form: '-N'), commands like account,
@ -521,7 +561,7 @@ less detail. This flag has the same effect as a 'depth:' query argument
 
File: hledger.1.info, Node: Pivoting, Next: Cost, Prev: Depth limiting, Up: OPTIONS File: hledger.1.info, Node: Pivoting, Next: Cost, Prev: Depth limiting, Up: OPTIONS
2.12 Pivoting 2.13 Pivoting
============= =============
Normally hledger sums amounts, and organizes them in a hierarchy, based Normally hledger sums amounts, and organizes them in a hierarchy, based
@ -578,7 +618,7 @@ $ hledger balance --pivot member acct:.
 
File: hledger.1.info, Node: Cost, Next: Market value, Prev: Pivoting, Up: OPTIONS File: hledger.1.info, Node: Cost, Next: Market value, Prev: Pivoting, Up: OPTIONS
2.13 Cost 2.14 Cost
========= =========
The '-B/--cost' flag converts amounts to their cost at transaction time, The '-B/--cost' flag converts amounts to their cost at transaction time,
@ -587,7 +627,7 @@ if they have a transaction price specified.
 
File: hledger.1.info, Node: Market value, Next: Regular expressions, Prev: Cost, Up: OPTIONS File: hledger.1.info, Node: Market value, Next: Regular expressions, Prev: Cost, Up: OPTIONS
2.14 Market value 2.15 Market value
================= =================
The '-V/--value' flag converts the reported amounts to their market The '-V/--value' flag converts the reported amounts to their market
@ -636,7 +676,7 @@ directives, not transaction prices (unlike Ledger).
 
File: hledger.1.info, Node: Regular expressions, Prev: Market value, Up: OPTIONS File: hledger.1.info, Node: Regular expressions, Prev: Market value, Up: OPTIONS
2.15 Regular expressions 2.16 Regular expressions
======================== ========================
hledger uses regular expressions in a number of places: hledger uses regular expressions in a number of places:
@ -2222,129 +2262,131 @@ Node: EXAMPLES1886
Ref: #examples1988 Ref: #examples1988
Node: OPTIONS3634 Node: OPTIONS3634
Ref: #options3738 Ref: #options3738
Node: General options4038 Node: General options4054
Ref: #general-options4165 Ref: #general-options4181
Node: Command options6484 Node: Command options6500
Ref: #command-options6637 Ref: #command-options6653
Node: Command arguments7035 Node: Command arguments7051
Ref: #command-arguments7191 Ref: #command-arguments7207
Node: Argument files7312 Node: Argument files7328
Ref: #argument-files7465 Ref: #argument-files7481
Node: Special characters7731 Node: Special characters7747
Ref: #special-characters7886 Ref: #special-characters7902
Node: Input files9305 Node: Input files9321
Ref: #input-files9443 Ref: #input-files9459
Node: Smart dates11406 Node: Smart dates11422
Ref: #smart-dates11549 Ref: #smart-dates11565
Node: Report start & end date12528 Node: Report start & end date12544
Ref: #report-start-end-date12700 Ref: #report-start-end-date12716
Node: Report intervals13766 Node: Report intervals13782
Ref: #report-intervals13931 Ref: #report-intervals13947
Node: Period expressions14332 Node: Period expressions14348
Ref: #period-expressions14494 Ref: #period-expressions14507
Node: Depth limiting16834 Node: For example16552
Ref: #depth-limiting16980 Ref: #for-example16697
Node: Pivoting17322 Node: Depth limiting18621
Ref: #pivoting17442 Ref: #depth-limiting18760
Node: Cost19118 Node: Pivoting19102
Ref: #cost19228 Ref: #pivoting19222
Node: Market value19346 Node: Cost20898
Ref: #market-value19483 Ref: #cost21008
Node: Regular expressions20783 Node: Market value21126
Ref: #regular-expressions20921 Ref: #market-value21263
Node: QUERIES22282 Node: Regular expressions22563
Ref: #queries22386 Ref: #regular-expressions22701
Node: COMMANDS26353 Node: QUERIES24062
Ref: #commands26467 Ref: #queries24166
Node: accounts27450 Node: COMMANDS28133
Ref: #accounts27550 Ref: #commands28247
Node: activity28543 Node: accounts29230
Ref: #activity28655 Ref: #accounts29330
Node: add29014 Node: activity30323
Ref: #add29115 Ref: #activity30435
Node: balance31773 Node: add30794
Ref: #balance31886 Ref: #add30895
Node: Flat mode35043 Node: balance33553
Ref: #flat-mode35170 Ref: #balance33666
Node: Depth limited balance reports35590 Node: Flat mode36823
Ref: #depth-limited-balance-reports35793 Ref: #flat-mode36950
Node: Multicolumn balance reports36213 Node: Depth limited balance reports37370
Ref: #multicolumn-balance-reports36424 Ref: #depth-limited-balance-reports37573
Node: Custom balance output41072 Node: Multicolumn balance reports37993
Ref: #custom-balance-output41256 Ref: #multicolumn-balance-reports38204
Node: Colour support43349 Node: Custom balance output42852
Ref: #colour-support43510 Ref: #custom-balance-output43036
Node: Output destination43683 Node: Colour support45129
Ref: #output-destination43841 Ref: #colour-support45290
Node: CSV output44111 Node: Output destination45463
Ref: #csv-output44230 Ref: #output-destination45621
Node: balancesheet44627 Node: CSV output45891
Ref: #balancesheet44765 Ref: #csv-output46010
Node: balancesheetequity46733 Node: balancesheet46407
Ref: #balancesheetequity46884 Ref: #balancesheet46545
Node: cashflow47673 Node: balancesheetequity48513
Ref: #cashflow47803 Ref: #balancesheetequity48664
Node: check-dates49715 Node: cashflow49453
Ref: #check-dates49844 Ref: #cashflow49583
Node: check-dupes49961 Node: check-dates51495
Ref: #check-dupes50088 Ref: #check-dates51624
Node: equity50225 Node: check-dupes51741
Ref: #equity50337 Ref: #check-dupes51868
Node: help50500 Node: equity52005
Ref: #help50603 Ref: #equity52117
Node: import51677 Node: help52280
Ref: #import51793 Ref: #help52383
Node: incomestatement52523 Node: import53457
Ref: #incomestatement52659 Ref: #import53573
Node: prices54612 Node: incomestatement54303
Ref: #prices54729 Ref: #incomestatement54439
Node: print54772 Node: prices56392
Ref: #print54884 Ref: #prices56509
Node: print-unique59730 Node: print56552
Ref: #print-unique59858 Ref: #print56664
Node: register59926 Node: print-unique61510
Ref: #register60055 Ref: #print-unique61638
Node: Custom register output64556 Node: register61706
Ref: #custom-register-output64687 Ref: #register61835
Node: register-match65984 Node: Custom register output66336
Ref: #register-match66120 Ref: #custom-register-output66467
Node: rewrite66303 Node: register-match67764
Ref: #rewrite66422 Ref: #register-match67900
Node: stats66491 Node: rewrite68083
Ref: #stats66596 Ref: #rewrite68202
Node: tags67477 Node: stats68271
Ref: #tags67577 Ref: #stats68376
Node: test67813 Node: tags69257
Ref: #test67899 Ref: #tags69357
Node: ADD-ON COMMANDS68267 Node: test69593
Ref: #add-on-commands68379 Ref: #test69679
Node: Official add-ons69666 Node: ADD-ON COMMANDS70047
Ref: #official-add-ons69808 Ref: #add-on-commands70159
Node: api69895 Node: Official add-ons71446
Ref: #api69986 Ref: #official-add-ons71588
Node: ui70038 Node: api71675
Ref: #ui70139 Ref: #api71766
Node: web70197 Node: ui71818
Ref: #web70288 Ref: #ui71919
Node: Third party add-ons70334 Node: web71977
Ref: #third-party-add-ons70511 Ref: #web72068
Node: diff70646 Node: Third party add-ons72114
Ref: #diff70745 Ref: #third-party-add-ons72291
Node: iadd70844 Node: diff72426
Ref: #iadd70960 Ref: #diff72525
Node: interest71043 Node: iadd72624
Ref: #interest71166 Ref: #iadd72740
Node: irr71261 Node: interest72823
Ref: #irr71361 Ref: #interest72946
Node: Experimental add-ons71439 Node: irr73041
Ref: #experimental-add-ons71593 Ref: #irr73141
Node: autosync71884 Node: Experimental add-ons73219
Ref: #autosync71998 Ref: #experimental-add-ons73373
Node: budget72237 Node: autosync73664
Ref: #budget72361 Ref: #autosync73778
Node: chart72427 Node: budget74017
Ref: #chart72546 Ref: #budget74141
Node: check72617 Node: chart74207
Ref: #check72721 Ref: #chart74326
Node: check74397
Ref: #check74501
 
End Tag Table End Tag Table


@ -286,6 +286,7 @@ OPTIONS
format automatically based on the file extension, or if that is not format automatically based on the file extension, or if that is not
recognised, by trying each built-in "reader" in turn: recognised, by trying each built-in "reader" in turn:
Reader: Reads: Used for file extensions: Reader: Reads: Used for file extensions:
----------------------------------------------------------------------------- -----------------------------------------------------------------------------
journal hledger's journal format, also .journal .j .hledger journal hledger's journal format, also .journal .j .hledger
@ -323,14 +324,16 @@ OPTIONS
Examples: Examples:
2009/1/1, 2009/01/01, simple dates, several sep- 2009/1/1, 2009/01/01, simple dates, several sep-
2009-1-1, 2009.1.1 arators allowed 2009-1-1, 2009.1.1 arators allowed
2009/1, 2009 same as above - a missing 2009/1, 2009 same as above - a missing
day or month defaults to 1 day or month defaults to 1
1/1, january, jan, relative dates, meaning 1/1, january, jan, relative dates, meaning
this year january 1 of the current this year january 1 of the current
year year
next year january 1 of next year next year january 1 of next year
this month the 1st of the current this month the 1st of the current
month month
@ -355,6 +358,7 @@ OPTIONS
Examples: Examples:
-b 2016/3/17 begin on St. Patrick's -b 2016/3/17 begin on St. Patrick's
day 2016 day 2016
-e 12/1 end at the start of decem- -e 12/1 end at the start of decem-
@ -394,6 +398,7 @@ OPTIONS
long as you don't run two dates together. "to" can also be written as long as you don't run two dates together. "to" can also be written as
"-". These are equivalent to the above: "-". These are equivalent to the above:
-p "2009/1/1 2009/4/1" -p "2009/1/1 2009/4/1"
-p2009/1/1to2009/4/1 -p2009/1/1to2009/4/1
-p2009/1/1-2009/4/1 -p2009/1/1-2009/4/1
@ -401,6 +406,7 @@ OPTIONS
Dates are smart dates, so if the current year is 2009, the above can Dates are smart dates, so if the current year is 2009, the above can
also be written as: also be written as:
-p "1/1 4/1" -p "1/1 4/1"
-p "january-apr" -p "january-apr"
-p "this year to 4/1" -p "this year to 4/1"
@ -408,6 +414,7 @@ OPTIONS
If you specify only one date, the missing start or end date will be the If you specify only one date, the missing start or end date will be the
earliest or latest transaction in your journal: earliest or latest transaction in your journal:
-p "from 2009/1/1" everything after january -p "from 2009/1/1" everything after january
1, 2009 1, 2009
-p "from 2009/1" the same -p "from 2009/1" the same
@ -418,6 +425,7 @@ OPTIONS
A single date with no "from" or "to" defines both the start and end A single date with no "from" or "to" defines both the start and end
date like so: date like so:
-p "2009" the year 2009; equivalent -p "2009" the year 2009; equivalent
to "2009/1/1 to 2010/1/1" to "2009/1/1 to 2010/1/1"
-p "2009/1" the month of jan; equiva- -p "2009/1" the month of jan; equiva-
@ -432,19 +440,65 @@ OPTIONS
-Y flags. Between report interval and start/end dates (if any), the -Y flags. Between report interval and start/end dates (if any), the
word in is optional. Examples: word in is optional. Examples:
-p "weekly from 2009/1/1 to 2009/4/1" -p "weekly from 2009/1/1 to 2009/4/1"
-p "monthly in 2008" -p "monthly in 2008"
-p "quarterly" -p "quarterly"
Note that weekly, monthly, quarterly and yearly intervals always start
on the first day of the week, month, quarter or year, and end on the
last day of that period, even if the associated period expression
specifies different explicit start and end dates.
For example:
-p "weekly from 2009/1/1 to 2009/4/1" -- starts on 2008/12/29, closest
preceeding Monday -p "monthly in 2008/11/25" -- starts on 2018/11/01
-p "quarterly from 2009-05-05 to 2009-06-01" - starts on 2009/04/01,
ends on 2009/06/30, which are first and last days of Q2 2009
-p "yearly from 2009-12-29" - starts on 2009/01/01, first day of 2009
------------------------------------------
The following more complex report intervals are also supported: The following more complex report intervals are also supported:
biweekly, bimonthly, every N days|weeks|months|quarters|years, biweekly, bimonthly, every day|week|month|quarter|year,
every Nth day [of month], every Nth day of week. every N days|weeks|months|quarters|years.
All of these will start on the first day of the requested period and
end on the last one, as described above.
Examples: Examples:
-p "bimonthly from 2008"
-p "every 2 weeks" -p "bimonthly from 2008" -- periods
-p "every 5 days from 1/3" will have boundaries on 2008/01/01,
2008/03/01, ...
-p "every 2 weeks" -- starts on closest
preceding Monday
-p "every 5 month from 2009/03" --
periods will have boundaries on
2009/03/01, 2009/08/01, ...
If you want intervals that start on an arbitrary day of your choosing
and span a week, month or year, use one of the following:
every Nth day of week, every <weekday>, every Nth day [of month],
every Nth weekday [of month], every MM/DD [of year],
every Nth MMM [of year], every MMM Nth [of year].
Examples:
-p "every 2nd day of week" -- periods
will go from Tue to Tue
-p "every Tue" -- same
-p "every 15th day" -- period bound-
aries will be on 15th of each month
-p "every 2nd Monday" -- period bound-
aries will be on second Monday of each
month
-p "every 11/05" -- yearly periods with
boundaries on 5th of Nov
-p "every 5th Nov" -- same
-p "every Nov 5th" -- same
Show historical balances at end of 15th each month (N is exclusive end Show historical balances at end of 15th each month (N is exclusive end
date): date):