poster templates
with R, markdown, knitr, pandoc, yaml, lua, xelatex
Our School recently wanted to present our course offerings on a large printed display, so that prospective students could get a global view of the entire curriculum. I volunteered.
For the initial design phase I tried a few variations in Illustrator and InDesign, borrowing random ideas from a Google Images search for 'design poster templates'; it looked something like this:
Note: most images are anonymised in this post for copyright reasons. For the final version we chose freely-available images, or I made some myself, and cited the source.
I then applied the design manually to all 25-or-so posters. Some variations were required due to the unequal text content of all courses, and additional requests from colleagues. After some iterations, the final set of posters looked something like this,
While creating the 25 files from the original template was relatively easy, the subsequent tweaks and modifications were very time consuming, as each step, such as changing the font size, or the vertical alignment of an item, had to be repeated manually over all files for consistency.
I decided to try an automated template strategy for next time, separating the content and layout. The rest of this post explores this idea.
automated pipeline
The workflow is illustrated below; the input format is a quarto source file (.qmd) with YAML front matter. The file, along with its metadata, is processed by quarto and pandoc to produce an abstract syntax tree (AST). A custom lua filter makes small cosmetic changes to this AST, which pandoc then converts into LaTeX with a custom template. Finally, the XeLaTeX engine produces a PDF.
Although I start here with a markdown document, the same chain of commands can start with an Rmarkdown file, allowing even greater control if some portions of the document are to be generated on-the-fly with arbitrary R code.
Thanks to the powerful RStudio IDE, all these steps are reduced to the click of a button; behind the scenes, pandoc is called with a command line such as,
/usr/local/bin/pandoc poster.md \
--to latex \
--from markdown \
--output poster.tex \
--template _template.tex \
--pdf-engine xelatex \
--lua-filter=filter.lua
The poster’s content and its various elements (image, QR code, background colours, etc.) are all selected automatically based on the YAML metadata, using various tricks that I’ll describe below.
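For reference, a front matter for one course might look like the following sketch; the discipline, code, and highlight fields are the ones referenced in the template snippets below, but the exact field set and values here are illustrative:

```yaml
---
title: "Electromagnetism"
discipline: phys
code: "222"          # the first digit (2) selects the year colour
highlight: phys222   # course to embolden in the progress bar
---
```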
background layer
The courses belong to 4 different years; I decided to group them by colour.
In the TeX template, the colours for bullets etc. are automatically selected according to the first digit of the YAML course code (years 1-4),
\usepackage{etoolbox}
\StrLeft{$code$}{1}[\Year]
\ifnumequal{\Year}{1}{\definecolor{col}{RGB}{232,152,18}}{}
\ifnumequal{\Year}{2}{\definecolor{col}{RGB}{0,183,235}}{}
\ifnumequal{\Year}{3}{\definecolor{col}{RGB}{68,161,43}}{}
\ifnumequal{\Year}{4}{\definecolor{col}{RGB}{255,0,147}}{}
This trick is used for various elements, such as the course code,
{\color{col}$discipline$}\thinskip{\scshape $code$}
For pragmatic reasons I’ve found it easier to generate background layers separately, and place them in the background using the wallpaper TeX package,
\usepackage{wallpaper}
\ULCornerWallPaper{1}{bkg/bkg\Year}
where the 4 bkg files are created with tikz code along those lines (the hard-coded dimensions were found from the original InDesign template).
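One such standalone background file might be sketched as follows; the page size and band position are placeholders, not the actual InDesign measurements:

```latex
% bkg2.tex -> compiled once to bkg/bkg2.pdf (year-2 colour; dimensions illustrative)
\documentclass[tikz]{standalone}
\begin{document}
\begin{tikzpicture}
  % invisible bounding box the size of the poster page
  \path (0,0) rectangle (700mm,1000mm);
  % coloured band across the top, in the year-2 RGB from the template
  \fill[fill={rgb,255:red,0;green,183;blue,235}]
    (0,880mm) rectangle (700mm,1000mm);
\end{tikzpicture}
\end{document}
```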
The next important step is to assign a layout to the various elements that will be included in the poster.
layout
The flowfram package seems a good way to define rectangular regions, or frames, where content will be placed. Again I took the hard-coded measurements from the InDesign version. Note that each frame is given an identifier, and an optional vertical justification setting.
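In the preamble, such frame definitions could be sketched like this (the measurements and labels are placeholders; pic is the identifier used below for the graphic):

```latex
\usepackage{flowfram}
% \newstaticframe{width}{height}{x}{y}[label]
\newstaticframe{230mm}{115mm}{35mm}{820mm}[pic]   % course image
\newstaticframe{230mm}{400mm}{35mm}{380mm}[body]  % main text
\newstaticframe{40mm}{40mm}{35mm}{30mm}[qr]       % QR code
% optional vertical justification within a frame
\setstaticframe*{label=body}{valign=t}
```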
The template’s body then simply refers to the appropriate metadata element, for example the graphic:
\begin{staticcontents*}{pic}
\includegraphics[width=230mm,height=115mm]{images/$discipline$$code$}
\end{staticcontents*}
qr code
The QR code at the bottom refers to each course’s unique URL; I batch-generated those with an R script (I wrote qrGrob as basically a wrapper around rectGrob, since unfortunately rasterGrob produces output that bewilders pdf readers).
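The batch script isn’t reproduced here; a rough sketch, assuming the qrcode package (whose qr_code() returns a logical matrix of modules) and a paraphrased qrGrob, might read:

```r
library(qrcode)  # assumed: qr_code() returns a logical matrix
library(grid)

# draw the QR matrix as plain black rectangles: pure vector output,
# unlike rasterGrob, whose output can trip up some pdf readers
qrGrob <- function(m) {
  n <- nrow(m)
  idx <- which(m, arr.ind = TRUE)
  rectGrob(x = (idx[, 2] - 0.5) / n,
           y = 1 - (idx[, 1] - 0.5) / n,
           width = 1 / n, height = 1 / n,
           gp = gpar(fill = "black", col = NA))
}

# illustrative course codes and URL scheme
for (code in c("phys115", "phys222")) {
  pdf(sprintf("qr/%s.pdf", code), width = 2, height = 2)
  grid.draw(qrGrob(qr_code(sprintf("https://example.org/%s", code))))
  dev.off()
}
```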
For a couple of elements, the pandoc template approach was not sufficient; I needed the extra processing step provided by filters.
lua filter
The bullets appear as simple • in the original markdown and YAML; I wanted them coloured according to the course code. However, the • character is not a LaTeX macro, and doesn’t lend itself to a programmatic redefinition. I therefore processed the intermediate AST with a replacement rule defined as a lua filter.
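The rule itself was along these lines (a sketch: the exact LaTeX the filter emits is my guess, with col being the colour defined in the template):

```lua
-- filter.lua: replace the literal bullet character with a coloured one
function Str(el)
  if el.text == "•" then
    return pandoc.RawInline("latex", "{\\color{col}\\textbullet}")
  end
end
```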
Another use for lua filters came from the progress bar displayed at the bottom: a specific course is highlighted, as specified in the YAML metadata, but if the bar were hard-coded in the LaTeX template I wouldn’t know where to place the relevant variable \textbf{$highlight$}, since its position changes from poster to poster. I therefore generate this progress bar on-the-fly, substituting via a regex the relevant course code with a bold version.
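That substitution could be sketched as a Meta filter; here bar is a hypothetical metadata field holding the comma-separated course list, and highlight is the field from the front matter:

```lua
-- embolden the highlighted course code inside the progress-bar string
function Meta(meta)
  local hl  = pandoc.utils.stringify(meta.highlight)
  local bar = pandoc.utils.stringify(meta.bar)
  meta.bar = pandoc.MetaString(bar:gsub(hl, "\\textbf{" .. hl .. "}"))
  return meta
end
```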
The \equispace LaTeX macro was kindly contributed on StackOverflow.
\makeatletter \def\zz#1{\@for\z:=#1\do{\z}}\makeatother
\newcommand\equispace[8]{%
\begin{center}%
\begin{tabular}{@{}c@{}}\zz{#1}\\\zz{#2}\end{tabular}\hfill
\begin{tabular}{@{}c@{}}\zz{#3}\\\zz{#4}\end{tabular}\hfill
\begin{tabular}{@{}c@{}}\zz{#5}\\\zz{#6}\end{tabular}\hfill
\begin{tabular}{@{}c@{}}\zz{#7}\\\zz{#8}\end{tabular}%
\end{center}%
}
This is probably a case where R+knitr would be an alternative way to go; it’s possible to execute R code within the metadata before it’s passed to pandoc, so R could create the appropriate TeX string with the specific course highlighted within it.
highlight: "!expr highlighted_bar('phys222')"
where highlighted_bar would be sourced from somewhere, and would be designed to produce,
\equispace{phys115,phys131}{...}{...}{\textbf{phys222},phys209}{...}
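Such a highlighted_bar function could be sketched like this (the default course groups are illustrative, and truncated to four; the real bar would have eight groups to match \equispace’s eight arguments):

```r
# wrap `code` in \textbf{} wherever it appears among the bar groups,
# then assemble the \equispace call (groups here are illustrative)
highlighted_bar <- function(code,
                            groups = c("phys115,phys131", "phys209,phys222",
                                       "phys315,phys331", "phys415,phys431")) {
  groups <- vapply(groups, function(g) {
    codes <- strsplit(g, ",", fixed = TRUE)[[1]]
    codes[codes == code] <- sprintf("\\textbf{%s}", code)
    paste(codes, collapse = ",")
  }, character(1), USE.NAMES = FALSE)
  paste0("\\equispace", paste0("{", groups, "}", collapse = ""))
}
```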
The YAML is a bit nicer with the pandoc filter route, though it requires knowing a bit of lua (and pandoc’s data structures).
final look
Putting all this together, and with much fiddling around with XeLaTeX and typography, the end result looks like the original,
(slightly nicer, actually, since I made a few improvements to font styles).
a success?
This was an interesting experiment, and I learned a lot of new tricks (lua, but also LaTeX). The combination of rmarkdown (knitr), pandoc, yaml, lua (or other filters), and xelatex is very powerful and allows fine customisation of the output based on fully-automated steps. For this particular task I was only interested in printed versions, but the same workflow could be adapted to also produce an HTML page from the same input data. Having all the content in plain text has huge benefits, not just for ease of editing: for the last poster I wanted to make a word-cloud, and all it took was a one-liner script to scrape the text from all posters.
I wasn’t familiar with lua, so the filter part proved a little bit tricky when trying to go beyond the examples provided in pandoc’s documentation. It may be easier to use another language — maybe even R — for such filters, but it is nice to have a lightweight and portable solution. The pandoc mailing list was also very helpful.
It is worth noting that this whole strategy is only good for tasks that require automation; when I had to produce an extra poster I first returned to the InDesign file, since this particular poster was a bit different from the others (it combined two similar courses). I’ve since realised that having access to the intermediate LaTeX source is a great benefit, as it lets me easily fine-tune the output for such a one-off adaptation that doesn’t generalise easily.