The fall 2020 LaTeX release is available

This release introduces a number of important enhancements.

A general hook management system for LaTeX

Most LaTeX users and package writers will know the handful of hooks that LaTeX has been offering until now, the most important one perhaps being \AtBeginDocument. These are important hooks, but there are far too few of them, so in many cases package developers had to patch the internals of LaTeX directly. This resulted in many problems.

With the new hook management system, LaTeX will get many more hooks that package writers (and authors) can use to add code in a controlled and reliable way. New hooks have been added in a number of places by using the new system and more will follow over time. Available now are:

• Hooks to add code before and after environments (formerly offered through the etoolbox package);
• Hooks used when loading files, packages, or classes (similar to what the filehook package now provides);
• Hooks in the page-building process (e.g., functionality previously available through packages such as atbegshi or atveryend and a few others).

The important point here is not so much that the functionality of these packages has been integrated into the LaTeX kernel, but that the hook management system provides a single structured way for different packages to reliably add and order code. This will resolve many of the inter-package interoperability issues which formerly could be resolved (if at all) only by loading the packages in a specific order, or by the use of complex and fragile code inside the packages to account for various scenarios in user documents.

The hook management system is currently described in these three documents:

• texdoc lthooks — The description of the interfaces and the core hooks already added to the kernel.
• texdoc ltshipout — The documentation of the hooks available during the page production process.
• texdoc ltfilehook — hooks that can be used before or after a file gets loaded.
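As a small taste of the interface, code is attached to a hook with \AddToHook; the sketch below uses two hooks provided by the new kernel (the environment name is just an example):

```latex
% Run code before every itemize environment
% (similar to what etoolbox's \AtBeginEnvironment offered)
\AddToHook{env/itemize/before}{\small}
% The classic \AtBeginDocument hook is also reachable by name:
\AddToHook{begindocument}{\typeout{Starting the document body}}
```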

Providing xparse as part of the format

In the previous release we added the LaTeX3 programming layer to the LaTeX format to improve the loading speed when packages using expl3 are used. In this release we are now extending this support by integrating xparse so that the extended interface for defining document-level commands becomes available out of the box.

This enables users and, most importantly, package developers to define LaTeX commands with multiple optional arguments or other syntax features with ease. For details, check out the xparse documentation, e.g., via texdoc xparse.
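For example, \NewDocumentCommand (now available out of the box) makes a command with an optional argument and a default straightforward to define; \welcome here is just an illustrative name:

```latex
% O{...} = optional argument with a default, m = mandatory argument
\NewDocumentCommand{\welcome}{O{Hello}m}{#1, #2!}
% \welcome{Alice}         -> Hello, Alice!
% \welcome[Good day]{Bob} -> Good day, Bob!
```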

Improving the font series handling

In the previous release we extended NFSS (the new font selection scheme) to better support modern fonts that offer different font faces, e.g., condensed, semi-bold, etc., and make them work seamlessly with each other. Experiences with the extended interface showed that for some use cases adequate support was still missing or that in special setups the algorithms sometimes selected a wrong font series value. These cases have now been resolved and additional support commands have been added. For example, with

\IfFontSeriesContextTF{〈context〉} {〈true code〉}{〈false code〉}


you can now define commands that behave differently depending on the current font series context. The 〈context〉 to check has to be specified as either bf or md. The command then chooses the 〈true code〉 or the 〈false code〉 based on where it is used (e.g., inside \textbf (or \bfseries) or not).
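For instance, a hypothetical marker command could pick a different symbol depending on whether it ends up inside bold text (command name and symbols are illustrative):

```latex
% Prints a heavy marker inside \textbf/\bfseries, a light one otherwise
\NewDocumentCommand{\contextmark}{}{%
  \IfFontSeriesContextTF{bf}%
    {\textbullet}%          we are in a bold context
    {\textperiodcentered}%  medium-weight context
}
```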

A large number of other enhancements and corrections

There are nearly fifty other enhancements and corrections that we documented in the ltnews article for this release (and a few very minor ones that only made it into the changes.txt file). The most important ones from a user perspective are:

• Support for length expressions in coordinates of picture commands as an alternative to decimal numbers denoting a multiple of \unitlength, e.g., \put(0,.2\textheight){...}
• New commands to make copies of robust commands (\let wouldn’t work for them)
• A new \ShowCommand command to display the full definition of a robust command (it also works on other commands)
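For example, to extend a robust command such as \emph while keeping the original available, a copy can be made first; the names below are illustrative:

```latex
% Keep the original \emph under a new name, then redefine it
\NewCommandCopy{\originalemph}{\emph}
\renewcommand{\emph}[1]{[\originalemph{#1}]}% illustrative redefinition
\ShowCommand\emph % writes the full (robust) definition to the terminal/log
```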

But read the whole ltnews article because there may be other gems that are useful for you.

The new features and most of the important bug fixes made in this release are documented in “LaTeX2e News Issue 32”. This document can be found on the LaTeX2e news page, where you will also find release information for earlier LaTeX releases.

Happy LaTeXing — Frank

2020-09-27

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

It is probably not widely known that LaTeX packages can also be uploaded to CTAN via a REST API. This is particularly handy if you have to update packages frequently, as I do several times a year with the DTK bibliography, for example.

Manfred Lotz of the CTAN team has written a Python script (https://gitlab.com/Lotz/pkgcheck/blob/master/ctan_upload.py) that feeds this API.

I have adapted his script a little (I also use it to create the ZIP file and to copy the right files into place); you can find my script on GitHub at https://github.com/dante-ev/dtk-bibliography/blob/master/pack_and_upload_to_ctan.py. For the upload itself you also need a TOML file containing the upload information as key-value pairs. An example of such a TOML file can be found in Manfred's repository at https://gitlab.com/Lotz/pkgcheck/-/blob/master/pkgcheck.toml.
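Translated to code, the upload amounts to reading the key-value pairs from the TOML file and POSTing them together with the ZIP archive. The sketch below only parses the flat key = "value" lines such a file contains; the endpoint and field names in the comment are assumptions, so consult Manfred's ctan_upload.py for the real ones:

```python
def parse_toml_kv(text: str) -> dict:
    """Minimal parser for the flat key = "value" pairs in the upload TOML
    (use a full TOML parser for anything fancier)."""
    fields = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition("=")
        fields[key.strip()] = value.strip().strip('"')
    return fields

meta = parse_toml_kv('pkg = "dtk-bibliography"\nversion = "2020/09"')
# The actual upload would then POST these fields plus the ZIP, roughly:
#   requests.post("https://ctan.org/submit/upload",   # assumed endpoint
#                 data=meta, files={"file": open("pkg.zip", "rb")})
```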

Uwe

Uwe Ziegenhagen likes LaTeX and Python, sometimes even combined. Do you like my content and would like to thank me for it? Consider making a small donation to my local fablab, the Dingfabrik Köln. Details on how to donate can be found here: Spenden für die Dingfabrik.


2020-09-22

TUG

TUGboat 41:2 published

TUGboat volume 41, number 2, the TUG 2020 (online) proceedings, has been mailed to TUG members. It is also available online and from the TUG store. In addition, prior TUGboat issue 41:1 is now publicly available. Submissions for the next issue are welcome; the deadline is October 15. Please consider joining or renewing your TUG membership if you haven't already (you'll get this issue immediately), and thanks.

Talks from the online TUG Conference 2020 are now available on YouTube

The first TUG online conference in July was quite a success, with many interesting talks on many subjects of TeX usage and related topics. The talks are now appearing one after another on YouTube, thanks to the work of a few individuals who have prepared them (and the question/answer sessions) for publication.

If you have missed the conference (or if you want to listen to one of the talks again) you will find them on the TeX Users Group channel on YouTube. So far only a subset of the talks has been added; the remaining ones will probably appear pretty soon.

There are also several talks by members of the LaTeX Project team.

Enjoy — Frank

The third and final LaTeX pre-release for 2020-10-01 is available for testing

A few days ago we submitted a new LaTeX development format1 to CTAN and by now this should be available to all users using MiKTeX or TeX Live (on any operating system).

Main features of the final pre-release for 2020-10-01

In the previous pre-releases, distributed in May and July, the following main features (besides bug fixes) were made available:

• The functionality of the xparse package was added directly to the LaTeX kernel.
• LaTeX’s font series handling was improved.
• A general hook management system was added to improve package interoperability and enable easier customization.

In this final pre-release we have fully integrated the hook management system (previously it was added at the end, overwriting earlier definitions in the kernel). In addition, the following features were added:

• The picture environment now supports dimensions (and not only numbers multiplied by \unitlength) in places where coordinates have to be specified. This basically incorporates into the kernel the functionality of the picture.sty package by Heiko Oberdiek.
• A \ShowCommand was added to show the definition of robust commands in an easy manner.
• A \NewCommandCopy was added to allow making copies of robust commands (which can’t be done using \let).
• docstrip and l3docstrip were merged so that one can now always use docstrip even with expl3-based sources.
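With the picture change, a coordinate may be written either as the traditional \unitlength multiplier or as an explicit length expression; a minimal sketch:

```latex
\setlength{\unitlength}{1cm}
\begin{picture}(6,4)
  \put(1,1){\circle{2}}                  % classic: multiples of \unitlength
  \put(2cm,.1\textheight){\line(1,0){3}} % new: dimensions work directly
\end{picture}
```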

Other fixes and improvements

There have been a larger number of smaller bug fixes and enhancements. A full list of all fixes and additions is given in a draft version of ltnews32, which you should be able to read by running

texdoc ltnews32


on the command line (or by any other means available with your operating system—somewhere there should be a file called ltnews32.pdf that you can open with a PDF reader). The draft version of this file is also available from our website as LaTeX2e News Issue 32 draft.

A general hook management system for LaTeX (the main feature of the 2020 fall release)

Most LaTeX users and package writers will know the handful of hooks that LaTeX has been offering until now, the most important one perhaps being \AtBeginDocument. These are important hooks, but there are far too few of them, so in many cases package developers had to patch the internals of LaTeX directly. This resulted in many problems.

With the new hook management system, LaTeX will get many more hooks that package writers (and authors) can use to add code in a controlled and reliable way. New hooks have been added in a number of places by using the new system and more will follow over time. Available now are:

• Hooks to add code before and after environments (formerly offered through the etoolbox package);
• Hooks used when loading files, packages, or classes (similar to what the filehook package now provides);
• Hooks in the page-building process (e.g., functionality previously available through packages such as atbegshi or atveryend and a few others).

The important point here is not so much that the functionality of these packages has been integrated into the LaTeX kernel, but that the hook management system provides a single structured way for different packages to reliably add and order code. This will resolve many of the inter-package interoperability issues which formerly could be resolved (if at all) only by loading the packages in a specific order, or by the use of complex and fragile code inside the packages to account for various scenarios in user documents.

The hook management system is currently described in these three documents:

• texdoc lthooks — The description of the interfaces and the core hooks already added to the kernel.
• texdoc ltshipout — The documentation of the hooks available during the page production process.
• texdoc ltfilehook — hooks that can be used before or after a file gets loaded.

Outlook

We expect this to be the final pre-release matching the code that goes into the 2020-10-01 release, except for any bug fixes that are found between now and the release date.

We are issuing this final pre-release now in the hope that you will help us by making sure that all the enhancements and fixes we have provided are safe and that they do not have any undesired side effects, so please help with the testing if you can.

This development format allows you to test the upcoming LaTeX release scheduled for 2020-10-01 with your documents or packages. Such testing is particularly important for package maintainers to verify that changes to the core LaTeX haven’t introduced incompatibilities with existing code. We try to identify any such problems beforehand but such an undertaking is necessarily incomplete, which is why we are asking for user testing.

Besides developers, we also ask ordinary users to try out the new release, because the more people that test the new format, the higher the chances that any hidden problems are identified before the final release hits the streets.

Processing your documents with the pre-release is straightforward. All you have to do is replace the invocation command by appending -dev to the executable, e.g., on the command line you would run

pdflatex-dev myfile    or    lualatex-dev myfile    or    xelatex-dev myfile


instead of using pdflatex, lualatex or xelatex. If you use an integrated editing environment, then how to configure it to use an alternative format depends on the system; but in any case the necessary modification should be straightforward.

Enjoy — Frank

1. The internal version number for this pre-release is LaTeX2e <2020-10-01> pre-release-8, the first 5 pre-releases just mirrored the patch releases we did for 2020-02-02.

2020-08-30

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Blocking Mailman spammers with Python

For Dante e.V. I maintain several Mailman-based mailing lists that have been flooded by spammers for the past few days. Every day there are dozens to hundreds of subscription requests in the list that I would have to discard manually. After doing this by hand once, an automated solution had to be found.

The solution was to install a driver for Firefox (“geckodriver”) that allows the browser to be controlled remotely. The browser can then be driven from Python via the selenium module. Below is the essential source code as a basis for your own work; I have left out the part that recognizes legitimate requests.

# -*- coding: utf-8 -*-

from selenium.webdriver import Firefox
from selenium.webdriver.firefox.options import Options

opts = Options()
# opts.set_headless()  # left disabled: I want to watch selenium at work
browser = Firefox(options=opts)
browser.implicitly_wait(3)

# log in
browser.get('<URL of the Mailman admin panel>')
search_form = browser.find_element_by_name('<ID of the password field>')
search_form.send_keys('<admin password>')
search_form.submit()

# the "discard" radio button in each row has the value 3
fields = browser.find_elements_by_xpath("//input[@value='3']")
emails = browser.find_elements_by_xpath('//td[contains(text(),"@")]')

if len(fields) == len(emails):
    zipped_list = list(zip(emails, fields))

    for email, field in zipped_list:
        field.click()

Uwe


2020-07-23

TUG

TUG 2020 conference starts July 24

The regular talks for the TUG'20 conference will start tomorrow (as we write this), at 9am US/Pacific (16:00 UTC) with a keynote talk by Steve Matteson, creative type director at Monotype. The conference's link for Zoom clients (meeting ID 95051714640), as well as the link for streaming via YouTube are posted on the conference web pages, along with the full program and abstracts, and more. The YouTube link will change occasionally during the conference, unfortunately, so if it doesn't work, check back. Hope everyone enjoys!

The second LaTeX pre-release for 2020-10-01 is available for testing

A few days ago we submitted a new LaTeX development format1 to CTAN and by now this should be available to all users using MiKTeX or TeX Live (on any operating system).

Main features of the second pre-release for 2020-10-01

The first pre-release, distributed at the end of May, had the following two main features (beside bug fixes):

• the functionality of the xparse package was added directly to the LaTeX kernel
• LaTeX’s font series handling was improved

This second pre-release adds one major new component to LaTeX: a general hook management system to improve package interoperability and enable easier customization and extension of LaTeX.

A general hook management system for LaTeX

Most LaTeX users and package writers will know the handful of hooks that LaTeX has been offering until now, the most important one perhaps being \AtBeginDocument. These are important hooks, but there are far too few of them, so in many cases package developers had to patch the internals of LaTeX directly. This resulted in many problems.

With the new hook management system, LaTeX will get many more hooks that package writers (and authors) can use to add code in a controlled and reliable way. New hooks have been added in a number of places by using the new system and more will follow over time. Available now are:

• Hooks to add code before and after environments (formerly offered through the etoolbox package);
• Hooks used when loading files, packages, or classes (similar to what the filehook package now provides);
• Hooks in the page-building process (e.g., functionality previously available through packages such as atbegshi or atveryend and a few others).

The important point here is not so much that the functionality of these packages has been integrated into the LaTeX kernel, but that the hook management system provides a single structured way for different packages to reliably add and order code. This will resolve many of the inter-package interoperability issues which formerly could be resolved (if at all) only by loading the packages in a specific order, or by the use of complex and fragile code inside the packages to account for various scenarios in user documents.

The hook management system is currently described in these three documents (for the final release they will be consolidated):

• texdoc lthooks — The description of the interfaces and the core hooks already added to the kernel.
• texdoc ltshipout — The documentation of the hooks available during the page production process.
• texdoc ltfilehook — hooks that can be used before or after a file gets loaded.

Other fixes and improvements

A full list of all fixes and additions is given in a draft version of ltnews32 which you should be able to read by running

texdoc ltnews32


on the command line (or by any other means available with your operating system—somewhere there should be a file called ltnews32.pdf that you can open with a PDF reader). The draft version of this file is also available from our website as LaTeX2e News Issue 32 draft.

Outlook

We expect to produce a third and final pre-release incorporating the user feedback we receive and consolidating some of the documentation. A few additional outstanding issues are expected to get fixed as well, but nothing major — the main functionality planned for the fall release is available already now with the second pre-release.

We are issuing this second pre-release now in the hope that you will help us by making sure that all the enhancements and fixes we have provided are safe and that they do not have any undesired side effects, so please help with the testing if you can.

This development format allows you to test the upcoming LaTeX release scheduled for 2020-10-01 with your documents or packages. Such testing is particularly important for package maintainers to verify that changes to the core LaTeX haven’t introduced incompatibilities with existing code. We try to identify any such problems beforehand but such an undertaking is necessarily incomplete, which is why we are asking for user testing.

Besides developers, we also ask ordinary users to try out the new release, because the more people that test the new format, the higher the chances that any hidden problems are identified before the final release in October hits the streets.

Processing your documents with the pre-release is straightforward. All you have to do is replace the invocation command by appending -dev to the executable, e.g., on the command line you would run

pdflatex-dev myfile    or    lualatex-dev myfile    or    xelatex-dev myfile


instead of using pdflatex, lualatex or xelatex. If you use an integrated editing environment, then how to configure it to use an alternative format depends on the system; but in any case the necessary modification should be straightforward.

Enjoy — Frank

1. The internal version number for the pre-release is LaTeX2e <2020-10-01> pre-release-7, the first 5 pre-releases just mirrored the patch releases we did for 2020-02-02.

New articles by project members

We have added five articles recently published in TUGboat to the site.

• Enrico discusses aspects of mathematical typesetting,
• Frank’s article discusses a new package that implements an improved algorithm for float handling,
• Joseph looks at case changing in the Unicode world and the LaTeX3 tools available,
• Ulrike’s article gives a short overview of the work going on in providing tagged PDF documents, and
• Ulrike and Marcel discuss the use of LuaLaTeX with Harfbuzz in a case study.

You will find all of them on the Publications from 2020 page.

2020-07-19

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Appendix notes for Beamer presentations

Remark: An English version of this article has been published on latex.net.

Imagine we have to defend a thesis in a presentation. Further questions may come up that then have to be answered in more depth.

We could put the corresponding material directly into the presentation, but then we may have to skip slides during the talk, which is not good presentation style either. Simply putting the additional content at the end is also bad; in the worst case we then have to jump back and forth during the presentation.

An elegant solution is provided by the beamerappendixnote package. It offers two commands, \appxnote and \printappxnotes, to integrate material for the appendix.

The first command creates the link to the appendix and the link text to be displayed; the second command typesets all appendix notes.

\documentclass[12pt,ngerman]{beamer}
\usepackage[T1]{fontenc}
\usepackage{babel}
\usepackage{beamerappendixnote}

\begin{document}

\begin{frame}
\frametitle{Some Beamer Slide}

\begin{itemize}
\item Some stuff
\item that requires
\item more background
\end{itemize}

\appxnote{Proof}{We can easily prove this.}

\end{frame}

\printappxnotes

\end{document}


Uwe



LaTeX.net

Sorting Glossaries with bib2gls

In previous posts I discussed the glossaries package (which can be extended with the glossaries-extra package) and also using makeindex or xindy to sort the terms that had been indexed in the document. This post describes another tool, bib2gls, which can be used instead of makeindex or xindy.

Makeindex and xindy are both general purpose indexing applications, but bib2gls was designed specifically for use with glossaries-extra and works in a very different way. With makeindex and xindy, commands like \gls add indexing information to an external file. This information consists of the sort value, hierarchical information (if the term has a parent entry), the associated formatting code to display the entry in the glossary and the location (normally the page number) where the indexing occurred. The indexing application sorts and collates this information and writes all the code needed to format the glossary in another file, which is input by \printglossary.

By contrast, bib2gls doesn’t create a file containing the code to typeset the glossary. Instead, it parses bib files containing definitions of terms, symbols and abbreviations, then selects the entries that are required by the document and writes the LaTeX code (using commands provided by glossaries-extra.sty) that defines these terms in a file that’s input by the preamble-only command \GlsXtrLoadResources. All the indexing information (such as the locations) is stored in internal fields associated with each entry. If a glossary is required, it can be displayed with \printunsrtglossary (or \printunsrtglossaries, which does \printunsrtglossary for each defined glossary).

This means that you can have a large database of all entries defined in a bib file (or across multiple bib files) that can be managed in an application such as jabref. If you have a document that only requires, say, 10 from a database of 1000 entries, then LaTeX only needs to use the resources required to define those 10 entries, which can improve the document build time.

\printunsrtglossary

The \printunsrtglossaries command was briefly mentioned in an earlier post. When using bib2gls it helps to understand how \printunsrtglossary works. The “unsrt” part of the name stands for “unsorted” because this command simply iterates over all defined entries in the order in which they were defined. There is no attempt to sort entries or gather child entries, although letter group headings are inserted whenever a change in group is detected. As was previously illustrated, this can cause strange results. Consider the following example:

\documentclass{article}

\usepackage[style=indexgroup]{glossaries-extra}

\newglossaryentry{parrot}{name={parrot},
description={mainly tropical bird with bright plumage}
}

\newglossaryentry{armadillo}{name={armadillo},
description={nocturnal insectivore with large claws}
}

\newglossaryentry{zebra}{name={zebra},
description={wild African horse with black-and-white stripes}
}

\newglossaryentry{duck}{name={duck},
description={a waterbird with webbed feet}
}

\newglossaryentry{aardvark}{name={aardvark},
description={nocturnal African burrowing mammal}
}

\newglossaryentry{macaw}{name={macaw},
parent={parrot},
description={long-tailed, colourful parrot}
}

\newglossaryentry{mallard}{name={mallard},
parent={duck},
description={dabbling duck with long body and broad bill}
}

\newglossaryentry{ara}{name={Ara},
parent={macaw},
description={neotropical genus of macaw}
}

\newglossaryentry{northernpintail}{name={northern pintail},
parent={duck},
description={long-necked, small-headed duck with curved back}
}

\newglossaryentry{anodorhynchus}{name={Anodorhynchus},
parent={macaw},
description={genus of large blue macaws}
}

\begin{document}
\printunsrtglossary
\end{document}

This produces a strange result.

The list is in the order of definition. The parent key simply provides the hierarchical level for the benefit of the glossary style. The level starts at 0 for top-level (parentless) entries, then 1 for an entry that has a parent but no grandparent, 2 for an entry with a grandparent but no great-grandparent, etc. The glossary style may (or may not) apply a different format for entries according to their hierarchical level. In the case of the indexgroup style used here, different indenting is applied. In this example, this has led to a rather strange visual appearance that makes it look as though “macaw”, “mallard” and “northern pintail” are sub-items of “aardvark”, and the macaw genera appear to be sub-items of “mallard” and “northern pintail”.

The default behaviour of bib2gls is to hierarchically sort all the required entries before writing them to the file that’s input by \GlsXtrLoadResources. This means that \printunsrtglossary should then list them in an appropriate order.

While \printunsrtglossary iterates over all the entries for the given glossary, it inserts the group headers as follows:

1. If the current entry doesn’t have a parent, then:
   1. find the group label associated with the current entry;
   2. if the group label is different from the previous group label, then:
      1. if there’s no previous group label, just insert \glsgroupheading{label};
      2. if there is a previous group label, insert \glsgroupskip\glsgroupheading{label}.

The way the group label is obtained depends on the glossaries-extra package options and on bib2gls settings. Whether or not the group heading is actually displayed depends on the glossary style. Some styles simply redefine \glsgroupheading to do nothing (but \glsgroupskip may still create a vertical space).
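In pseudo-Python, the heading insertion sketched above behaves roughly as follows (the dictionary fields are hypothetical; the real logic lives in TeX macros):

```python
def insert_group_headings(entries):
    r"""Mimic \printunsrtglossary: walk entries in definition order and
    emit a heading whenever a top-level entry starts a new letter group."""
    out, prev_group = [], None
    for entry in entries:
        if entry.get("parent") is None:           # only top-level entries
            group = entry["group"]
            if group != prev_group:
                if prev_group is not None:
                    out.append(r"\glsgroupskip")  # gap between groups
                out.append(r"\glsgroupheading{%s}" % group)
                prev_group = group
        out.append(entry["name"])                 # then typeset the entry
    return out
```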

Switching to bib2gls

Let’s rewrite the above example so that it uses bib2gls. The first thing that needs to be done is to create a bib file that contains all the entry definitions. If you already have an existing .tex file that contains all your entries then bib2gls comes with a convenient command-line application called convertgls2bib, which can parse a .tex file for commands like \newglossaryentry and \newabbreviation and create a bib file suitable for use with bib2gls. For example, if the above document is contained in the file myDoc.tex then run the following from a command prompt:

convertgls2bib -p myDoc.tex entries.bib

(The -p switch indicates that convertgls2bib should only parse the document preamble.) This will create the file entries.bib that contains the following:

% Encoding: UTF-8
@entry{parrot,
name = {parrot},
description = {mainly tropical bird with bright plumage}
}

@entry{armadillo,
name = {armadillo},
description = {nocturnal insectivore with large claws}
}

@entry{zebra,
name = {zebra},
description = {wild African horse with black-and-white stripes}
}

@entry{duck,
name = {duck},
description = {a waterbird with webbed feet}
}

@entry{aardvark,
name = {aardvark},
description = {nocturnal African burrowing mammal}
}

@entry{macaw,
parent = {parrot},
name = {macaw},
description = {long-tailed, colourful parrot}
}

@entry{mallard,
parent = {duck},
name = {mallard},
description = {dabbling duck with long body and broad bill}
}

@entry{ara,
parent = {macaw},
name = {Ara},
description = {neotropical genus of macaw}
}

@entry{northernpintail,
parent = {duck},
name = {northern pintail},
description = {long-necked, small-headed duck with curved back}
}

@entry{anodorhynchus,
parent = {macaw},
name = {Anodorhynchus},
description = {genus of large blue macaws}
}

The definitions now need to be removed from myDoc.tex and the record package option is required. The stylemods option is also useful (but not essential). This will load the glossaries-extra-stylemods package, which modifies the predefined styles provided by glossaries.sty to make them easier to adjust and also to make them integrate better with bib2gls.

\usepackage[record,stylemods,style=indexgroup]{glossaries-extra}

The \GlsXtrLoadResources command is also required. This writes information to the aux file for the benefit of bib2gls as well as inputting the file created by bib2gls (if it exists). By default, bib2gls will only select entries that have been indexed in the document (using commands like \gls) and any dependent entries. In this example, I want to select all entries that have been defined in my entries.bib file, which means I need to use the selection=all option.

\GlsXtrLoadResources[src=entries,selection=all]

The src option indicates the name or names of the bib files where the entry data is defined. In this case, there’s only one file. If I have additional files, they need to be specified in a comma-separated list. Note that this means that braces will be required to prevent the comma from being parsed as part of the key=value list. For example, if I also have entries in the file abbreviations.bib, then I would need:

\GlsXtrLoadResources[src={entries,abbreviations},selection=all]

The complete document is now:

\documentclass{article}

\usepackage[record,stylemods,style=indexgroup]{glossaries-extra}

\begin{document}
\printunsrtglossary
\end{document}

The document build process is similar to using makeglossaries, but bib2gls is used instead:

pdflatex myDoc
bib2gls myDoc
pdflatex myDoc

(Alternatively, use xelatex etc. instead of pdflatex.) This produces the glossary described below.

The ordering is now correct: aardvark, armadillo, duck (with sub-entries mallard and northern pintail), parrot (with sub-entry macaw that has sub-entries Anodorhynchus and Ara) and zebra. However, there are no letter group headings, even though the indexgroup style has been used. This is because bib2gls doesn’t set the group labels by default (not all glossaries require them). If you need the glossary separated into letter groups then you should instruct bib2gls to set the group labels using the --group (or -g) switch:

pdflatex myDoc
bib2gls --group myDoc
pdflatex myDoc

If you are using arara you need to add the following lines to your document:

% arara: pdflatex
% arara: bib2gls: { group: on }
% arara: pdflatex

(For other ways of integrating bib2gls into your document build see Incorporating makeglossaries or makeglossaries-lite or bib2gls into the document build.) With the --group switch, the glossary now looks like:

Naturally you also need to use a glossary style that supports groups.

If the document source is in the file myDoc.tex then bib2gls will create a file called myDoc.glstex that corresponds to the first \GlsXtrLoadResources command. If there are multiple instances of this command then myDoc-1.glstex will correspond to the second \GlsXtrLoadResources, myDoc-2.glstex will correspond to the third \GlsXtrLoadResources etc. These files simply contain LaTeX code and so can be inspected in a text editor.
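To illustrate the correspondence, here is a sketch (assuming the document is myDoc.tex, as above; the comments note which file each resource command creates and inputs):

\GlsXtrLoadResources[src={entries}]% myDoc.glstex
\GlsXtrLoadResources[src={constants}]% myDoc-1.glstex
\GlsXtrLoadResources[src={abbreviations}]% myDoc-2.glstex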

In order to make it easier to perform minor adjustments, the glstex files provide wrapper commands. For example, terms that are defined with @entry in the bib file are defined in the glstex file using \bibglsnewentry, which has the definition provided before the first instance of its use:

\providecommand{\bibglsnewentry}[4]{%
  \longnewglossaryentry*{#1}{name={#3},#2}{#4}%
}

This uses \longnewglossaryentry* instead of \newglossaryentry to allow for paragraph breaks in the description. If you try the above example document and look at the glstex file, you will see the entry definitions. These include the sort key for informational purposes. If you’re not sure why bib2gls ordered the entries a certain way, checking the sort key in the glstex file can show you the value used by the selected bib2gls sort method. For example, the “northern pintail” entry is defined in the glstex file as follows:

\bibglsnewentry{northernpintail}%
{parent={duck},
sort={northern|pintail|}}%
{northern pintail}%
{long-necked, small-headed duck with curved back}

Note that the sort value is northern|pintail| because the default sort method is a locale-sensitive word sort that discards spaces and (some) punctuation and marks up the end of each word with a marker (the pipe character | by default). Any remaining non-letter characters may be ignored by the comparison function (the comparator). This is appropriate for entries where the name is a word or phrase but may not be appropriate for other types of entries, such as symbols.
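Returning to the wrapper commands: because the glstex file only uses \providecommand, you can adjust a wrapper by defining it yourself in the preamble, before the resource commands. The following is a purely illustrative sketch that gives every @entry term a default category (the choice of category={general} is my own, not part of the original example):

% Illustrative: a preamble \newcommand takes precedence over the
% \providecommand in the glstex file input by \GlsXtrLoadResources.
\newcommand{\bibglsnewentry}[4]{%
  \longnewglossaryentry*{#1}{name={#3},category={general},#2}{#4}%
}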

Sort Fallbacks

In general, you typically shouldn’t set the sort field in the bib file, but instead use bib2gls’s fallback system to choose the most appropriate field. If the sort field is explicitly set then the fallback isn’t used. The fallback depends on the entry type used in the bib file. For example, if a term was defined in the bib file using @entry then the sort fallback is obtained from the name field. However, if the term was defined using @symbol (or @number) then the sort fallback is obtained from the label, and if the term was defined using @abbreviation (or @acronym) then the sort fallback is obtained from the short field.

Suppose now that in addition to my entries.bib file described above, I also have a file called abbreviations.bib that contains:

% Encoding: UTF-8

@abbreviation{xml,
short={XML},
long={extensible markup language},
description={a markup language that defines a set of rules for
encoding documents}
}

@abbreviation{html,
short={HTML},
long={hypertext markup language},
description={the standard markup language for creating web pages}
}

and a file called constants.bib that contains:

% Encoding: UTF-8

@number{pi,
description={ratio of circumference of a circle to its diameter},
name={\ensuremath{\pi}},
user1={3.14159}
}

@number{e,
description={Euler's number},
name={\ensuremath{e}},
user1={2.71828}
}

@number{root2,
description={Pythagoras' constant},
name={\ensuremath{\surd2}},
user1={1.41421}
}

@number{gelfondscons,
description={Gelfond's constant},
name={\ensuremath{e^\pi}},
user1={23.140692}
}

@number{zero,
description={zero},
name={\ensuremath{0}}
}

@number{one,
description={one},
name={\ensuremath{1}}
}

I can now modify my document as follows:

\documentclass{article}

\usepackage[record,stylemods,style=indexgroup]{glossaries-extra}

\setabbreviationstyle{long-short-desc}

\GlsXtrLoadResources[src={entries,abbreviations,constants},selection=all]

\begin{document}
\printunsrtglossaries
\end{document}

Note that I have to set the abbreviation style before \GlsXtrLoadResources. This produces the following glossary:

The sort value for the mathematical constants has been obtained from the label. For example, Pythagoras’ constant has the label root2 and so ends up in the “R” letter group, whereas π (which has the label pi) is in the “P” letter group. The sort value for the extensible markup language (XML) term has been obtained from the short field (XML) so it ends up in the “X” letter group. The fallback values can be changed. For example:

\documentclass{article}

\usepackage[record,stylemods,style=indexgroup]{glossaries-extra}

\setabbreviationstyle{long-short-desc}

\GlsXtrLoadResources[src={entries,abbreviations,constants},
abbreviation-sort-fallback=long,
symbol-sort-fallback=description,
selection=all]

\begin{document}
\printunsrtglossaries
\end{document}

This will now use the long field as the fallback for abbreviations (@abbreviation or @acronym) and the description field as the fallback for symbols (@symbol or @number). The “extensible markup language (XML)” entry is now in the “E” letter group and Pythagoras’ constant is now in the “P” letter group, but π (pi) is now in the “R” letter group.

Sub-Blocks

I’m now going to make a strange modification to the above document:

\documentclass{article}

\usepackage[record,stylemods,style=indexgroup]{glossaries-extra}

\setabbreviationstyle{long-short-desc}

\GlsXtrLoadResources[src={entries},selection=all]

\GlsXtrLoadResources[src={constants},
sort={en-reverse},
symbol-sort-fallback=description,
selection=all]

\GlsXtrLoadResources[src={abbreviations},
abbreviation-sort-fallback=long,
selection=all]

\begin{document}
\printunsrtglossary
\end{document}

This splits the data in the three bib files into three separate resource commands (\GlsXtrLoadResources).

The first \GlsXtrLoadResources selects all the terms defined in entries.bib and sorts them according to my locale (since there’s no language set in the document) which happens to be en-GB (British English). These terms are written to the myDoc.glstex file, which is input by the first resource command on the next LaTeX run.

The second command selects all the terms defined in constants.bib and sorts them according to en-reverse (reverse English, that is, Z–A) and writes them to myDoc-1.glstex, which is input by the second resource command.

The third command selects all the terms defined in abbreviations.bib and sorts them according to my locale and writes them to myDoc-2.glstex, which is input by the third resource command.

Remember that \printunsrtglossary simply iterates over all entries in the order in which they are defined (via commands like \longnewglossaryentry or \newabbreviation), not the order in which they appear in the bib files. This results in the order: (first glstex file) aardvark and armadillo (both assigned to the “A” letter group), duck (assigned to the “D” letter group) followed by its child entries, parrot (assigned to the “P” letter group) followed by its descendants, and zebra (assigned to the “Z” letter group); then (second glstex file) zero (assigned to the “Z” letter group), π (assigned to the “R” letter group), Pythagoras’ constant (assigned to the “P” letter group), one (assigned to the “O” letter group), Gelfond’s constant (assigned to the “G” letter group), and Euler’s number (assigned to the “E” letter group); then (third glstex file) extensible markup language (assigned to the “E” letter group) and hypertext markup language (assigned to the “H” letter group).

Note that there’s no visual indication between the sub-blocks. The “Z” letter group contains zebra from the end of the first sub-block and zero from the start of the second sub-block. The “E” letter group contains Euler’s number from the end of the second sub-block and extensible markup language from the start of the third sub-block. Note also that there are two “P” letter groups: one from the first sub-block and the other from the second.

This is a contrived example. It’s more typical to use multiple resource commands to process multiple glossaries that require different settings, or to gather together all entries in a sub-block into a custom group. (See also Logical Glossary Divisions (type vs group vs parent).)

The following modification demonstrates custom groups:

\documentclass{article}

\usepackage[record,stylemods,style=indexgroup]{glossaries-extra}

\setabbreviationstyle{long-short-desc}

\GlsXtrLoadResources[src={entries},selection=all]

\glsxtrsetgrouptitle{cons}{Mathematical Constants}

\GlsXtrLoadResources[src={constants},
group=cons,% group label
symbol-sort-fallback=description,
selection=all]

\glsxtrsetgrouptitle{abbrvs}{Abbreviations}

\GlsXtrLoadResources[src={abbreviations},
group=abbrvs,% group label
abbreviation-sort-fallback=long,
selection=all]

\begin{document}
\printunsrtglossary
\end{document}

This assigns the constants to a group labelled cons with the title “Mathematical Constants” and assigns the abbreviations to a group labelled abbrvs with the title “Abbreviations”.

Multiple Glossaries

The following demonstrates multiple glossaries:

\documentclass{article}

\usepackage[abbreviations,symbols,record,stylemods=mcols]{glossaries-extra}

\setabbreviationstyle{long-short-desc}

\GlsXtrLoadResources[src={entries},selection=all]

\GlsXtrLoadResources[src={constants},
type=symbols,
symbol-sort-fallback=description,
selection=all]

\GlsXtrLoadResources[src={abbreviations},
type=abbreviations,
abbreviation-sort-fallback=long,
selection=all]

\begin{document}
\printunsrtglossary[style=indexgroup]
\printunsrtglossary[type=symbols,style=mcolindex,nogroupskip]
\printunsrtglossary[type=abbreviations,style=index]
\end{document}

This uses the abbreviations and symbols package options to create the additional glossaries. Note that the use of multiple glossaries means that I can apply a different style to each glossary. (The stylemods=mcols option indicates to load glossary-mcols.sty as well as glossaries-extra-stylemods.sty.) The mcolindex style doesn’t display the group headings but it does still create a vertical gap with \glsgroupskip, so I’ve added the nogroupskip option to suppress it. The index style likewise doesn’t show the group headings but does create a vertical gap. I haven’t used the nogroupskip option for the abbreviations glossary, so there the inter-group gap is visible.

The type=abbreviations resource option wasn’t actually necessary in the above example. Entries defined in the bib file with @abbreviation are written to the glstex file using the provided command:

\providecommand{\bibglsnewabbreviation}[4]{%
  \newabbreviation[#2]{#1}{#3}{#4}%
}

Note that this uses \newabbreviation which automatically assigns the entry to the â€œabbreviationsâ€ glossary if the abbreviations package option is used. This means that both the main glossary and the abbreviations can be processed in one resource command:

\GlsXtrLoadResources[
src={entries,abbreviations},
abbreviation-sort-fallback=long,
selection=all]

This only needs to be split into two resource commands if there are conflicting settings. For example, suppose I want bib2gls to ignore the description fields in the abbreviations.bib file but not the entries.bib file. Then I would need:

\GlsXtrLoadResources[src={entries},selection=all]

\setabbreviationstyle{long-short}
\GlsXtrLoadResources[src={abbreviations},
ignore-fields={description},
selection=all]

Categories

In an earlier post I mentioned categories, which can be assigned using the category key. It’s possible to assign this key using the resource option with the same name. For example, I could give all the entries in my constants.bib file the category “symbol” like this:

\GlsXtrLoadResources[src={constants},
type=symbols,
category=symbol,
symbol-sort-fallback=description,
selection=all]

Alternatively I could use category={same as entry}, which would make the category the same as the bib entry type without the leading @ (in this case number), or category={same as type}, which would make the category the same as the type (in this case symbols), or category={same as base}, which would make the category the same as the basename of the bib file without the .bib extension (in this case constants).
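For example, a variant of the resource command above using one of these alternatives might look like this (the comment notes the category that results in this case):

\GlsXtrLoadResources[src={constants},
type=symbols,
category={same as base},% category is "constants"
symbol-sort-fallback=description,
selection=all]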

The other post described how to define a post-link category hook using \glsdefpostlink. This allows additional content to be added after commands like \gls for a particular category. There’s a similar hook that’s implemented after the description is displayed with the standard glossary styles.

\glsdefpostdesc{category}{definition}

This is something you can also use with makeindex and xindy, but the bib2gls resource options make it easier to change the category for specific documents without having to alter the entry definition.

Some of the terms defined in constants.bib have the user1 field set to the approximate numerical value. Assigning a category and post-description hook makes it easy to display this value in the glossary. Note that the post-description hook comes before the punctuation inserted with the postdot or postpunc package options.

\documentclass{article}

\usepackage[postdot,abbreviations,symbols,record,stylemods={mcols}]{glossaries-extra}

\glsdefpostdesc{number}{%
\glsxtrifhasfield{useri}{\glscurrententrylabel}%
{\space ($\approx\glscurrentfieldvalue$)}{}%
}

\GlsXtrLoadResources[src={entries},selection=all]

\GlsXtrLoadResources[src={constants},
type=symbols,
category={same as entry},
symbol-sort-fallback=description,
selection=all]

\setabbreviationstyle{long-short}
\GlsXtrLoadResources[src={abbreviations},
ignore-fields={description},
selection=all]

\begin{document}
\printunsrtglossary[style=indexgroup]
\printunsrtglossary[type=symbols,style=mcolindex,nogroupskip]
\printunsrtglossary[type=abbreviations,style=index]
\end{document}

For example, Euler’s number is now followed by “ (≈2.71828)”. The closing full stop (automatically inserted by the postdot package option) comes after the parenthetical material. Note that zero and one don’t have the user1 field set, as the value given in the name field is the exact numerical value.

Numeric Sorting

There are some numerical sorting methods available to use with bib2gls, such as sort=integer for integers or sort=double for double-precision floating point values. For locale-sensitive numbers (that include thousand separators) you need to use sort=numeric or sort=numberformat instead. Let’s suppose that I now want the mathematical constants ordered according to their approximate value. As before, I can use symbol-sort-fallback to change the fallback field to user1, but that will leave a blank sort value for zero and one, which don’t have that field set.

With bib2gls version 2.7 or above, I can use the field concatenation operator: symbol-sort-fallback=user1+name. This will construct the sort fallback value from a combination of the user1 value and the name. This means that, for example, Euler’s number will have the sort value 2.71828 \ensuremath{e} and “one” will have the sort value \ensuremath{1}. The non-numeric content can then be stripped off using:

sort-replace={{[^0-9\string\.\string\-]+}{}}

The following is a modification of the previous example document. For illustrative purposes, I’ve decided to provide my own custom glossary for the mathematical constants instead of using the one provided by the symbols package option.

\documentclass{article}

\usepackage[postdot,abbreviations,record,stylemods={mcols}]{glossaries-extra}

\glsdefpostdesc{number}{%
\glsxtrifhasfield{useri}{\glscurrententrylabel}%
{\space ($\approx\glscurrentfieldvalue$)}{}%
}

\newglossary*{constants}{Mathematical Constants}

\GlsXtrLoadResources[src={entries},selection=all]

\GlsXtrLoadResources[src={constants},
type=constants,
category={same as entry},
sort=double,
symbol-sort-fallback=user1+name,
sort-replace={{[^0-9\string\.\string\-]+}{}},
selection=all]

\setabbreviationstyle{long-short}
\GlsXtrLoadResources[src={abbreviations},
ignore-fields={description},
selection=all]

\begin{document}
\printunsrtglossary[style=indexgroup]
\printunsrtglossary[type=constants,style=mcolindex,nogroupskip]
\printunsrtglossary[type=abbreviations,style=index]
\end{document}

The list of mathematical constants is now ordered according to the associated numeric value: 0, 1, Pythagoras’ constant (1.41421), Euler’s number (2.71828), π (3.14159) and Gelfond’s constant (23.140692). Note that the main glossary and the list of abbreviations are still ordered according to the locale-sensitive word sort method.

(If you decide to change the glossary label, you may need to delete the glstex files before you can rebuild the document.)

The TeX Parser Library

Text wrapped in \NoCaseChange is excluded from case changing:

\exp_args:Ne \tl_show:n
{
\text_lowercase:n { \NoCaseChange { iPhone } ~ iPhone }
}


Titlecasing

Commonly, people think about uppercasing the first character of some text then lowercasing the rest, for example to use it at the start of a sentence. Unicode describes this operation as titlecasing, as there are some situations where the ‘first character’ is handled in a non-standard way. Perhaps the best example is IJ in Dutch: it’s treated as a single ‘letter’, so both letters have to be uppercase at the start of a sentence. (We’ll come to language-dependence in a second.)

Depending on the exact nature of the input, we might want to titlecase the first ‘character’ then lowercase everything else, or we might want just to titlecase the first ‘character’ and leave everything else unchanged. These are called \text_titlecase:n and \text_titlecase_first:n, respectively.

\exp_args:Ne \tl_show:n
{
\text_titlecase:n { some~text } ~
\text_titlecase:n { SOME~TEXT } ~
\text_titlecase_first:n { some~text } ~
\text_titlecase_first:n { SOME~TEXT }
}


As we are not simply grabbing the first token of the input, non-letters are ignored and the first real text is case-changed.

\exp_args:Ne \tl_show:n
{
\text_titlecase:n { 'some~text' }
}


Language-dependent functions

One important context for case changing text is the language the text is written in: there are special considerations for Dutch, Lithuanian, Turkic languages and Greek. That’s all handled by using versions of the case-changing functions that take a second argument: a BCP 47 string which can determine the path taken.

\exp_args:Ne \tl_show:n
{
\text_uppercase:n { Ragıp~Hulûsi~Özdem } ~
\text_uppercase:nn { tr } { Ragıp~Hulûsi~Özdem }
}
\exp_args:Ne \tl_show:n
{
\text_uppercase:n { ὈΔΥΣΣΕΎΣ } ~
\text_uppercase:nn { el } { ὈΔΥΣΣΕΎΣ }
}


Over time, mechanisms to link this behaviour to babel will be developed.

Conclusions

Case-changing functions in expl3 are now mature and stable, and ready for wider use. It’s likely that they will be made available as document-level commands in the medium term, but programmers can use them now to make handling this important aspect of text manipulation easier.
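For instance, a programmer can already wrap one of these functions in a document command; the name \MyTitlecase below is purely illustrative, not an official interface:

\ExplSyntaxOn
% Illustrative wrapper: titlecase the first 'character' of the
% argument and leave the rest unchanged.
\NewDocumentCommand { \MyTitlecase } { m }
  { \text_titlecase_first:n {#1} }
\ExplSyntaxOff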

2020-01-07

Weblog von Markus Kohm

The new year brings more than just resolutions

Anyone who follows komascript.de attentively has known for some time that a number of decisions were pending, and some still are. They are meant to bring urgently needed changes for me, but of course they also affect users. With the new year, or actually already shortly before Christmas, most of these decisions have matured.

For more than 25 years I have been developing and maintaining KOMA-Script as more or less a one-man show. That is not entirely true, of course, because I received my first support very early on. Back then, Luzia took over uploading KOMA-Script to the CTAN server, because it was extremely cumbersome for me to transfer the even then not exactly small package first to my user account at the university, via modem and a VT220 terminal connection, and from there to the CTAN server via ftp. Axel/Harald wrote the first complete manual for KOMA-Script. The other Axel contributed the first letter class. Years later, Jens-Uwe invested a great deal of time in supervising the revision of the manual and its publication as a book, and also contributed several chapters of the new manual. During that time, Torsten became indispensable as an adviser in the development of a completely new letter class, but also in the continued correction of the manual. A few years later, Elke became indispensable for the further development of the package and the correction of the manual and the book edition, but also for support in general. Falk, in turn, has for years been finding the bugs that are buried so deep in KOMA-Script that removing them causes me considerable headaches time and again. The same goes for the many suggestions and extensions that go back to him. Since the DANTE anniversary meeting in Heidelberg, Uwe has taken over the administration of the CTAN uploads. After Elke, who roped in her whole family for the task, first Gernot among others, then Krickette, and finally Karl took on the translation of the KOMA-Script manual and revised it massively. Many more joined the unofficial team over the decades; some have sunk back into obscurity. Without all these fellow campaigners, the work, and also the frustration that keeps arising, could not have been managed.
Naturally, those who cheered me up and encouraged me in various ways have also played an important part. A central role was played by my family as well, who made it possible for me in the first place to put years into the development and maintenance of this monster.

Besides all the work on KOMA-Script, I have also been providing general LaTeX support for more than 30 years. This area, too, has undergone manifold changes over those 25 years. In the beginning, support happened by post and in a mailbox network in which transit times of 12–24 hours for e-mails and public posts were normal. Back then hardly anyone expected to receive a finished solution within a few hours, let alone minutes. These days, support mainly takes place in web forums. There are dozens of them, some with a very clear relation to LaTeX, some with a very unclear one. At times I was active in up to nine of these forums simultaneously. On top of that came mailing lists and Usenet groups. In several forums I also served as a moderator or administrator. Web forums in particular strongly tempt you to occupy yourself with things that above all cost time and have little to do with my core competence, if I have one at all. Personally, beginners were and are very dear to my heart there. Again and again, even against my better judgement, I have tried to take them by the hand and help them with their more than bumpy start into the world of forums and the world of LaTeX. Sometimes that was a considerable strain on the nerves. Strictly speaking, I am completely out of place there: I take it far too much to heart when I fail to get someone onto the road to success.

The work on the monster has also changed over the years. There, too, it is now sometimes expected that the solution to a problem not only falls from the sky, but reaches the ground at the very moment a vague inkling of the problem drifts like a gentle breeze through the treetops of the Kalahari. If someone did not occasionally point out the rustling of the leaves to me, or bring to my attention the wind behind the movement of air, I would often not even find the trees. On the other hand, the development of LaTeX, for example, has picked up so much speed after years of standstill that I am constantly panting after changes that affect KOMA-Script. In the early years of KOMA-Script I simply refused to touch LaTeX internals. Even so, I was sometimes run over by changes to LaTeX back then, when things that KOMA-Script itself offered were suddenly adopted into the LaTeX kernel. With KOMA-Script 3 it was unavoidable to touch some LaTeX internals, although even then I feared that this might come back to bite me some day. That is now happening. At the same time, KOMA-Script has grown so huge that it takes considerable effort just to get an idea of what might have an effect where. The nature of the effect, and possibly the solution to the resulting problems, is then still far from found.

Despite cutting back professionally and finally withdrawing completely, all this has for years led to a latent feeling of constant overload. A state I have known before and which I now urgently try to avoid. I have known for a while that neither my time nor my energy suffices to actually accomplish all the things I feel I should do, that are expected of me, and that I genuinely want to do. For that reason I turned my back on various forums quite a while ago. Sometimes that was rather hard; sometimes it was made much easier by ugly incidents such as the accusation of Nazism levelled at more or less all German participants on TeX.SX. An injury that I would like to call a sports injury, but which is really attributable to my vanity and impatience, has moreover unfortunately still been hampering my work at the computer after more than a year. At the end of October and beginning of November I finally had to admit to myself that all attempts at compensation and all previous attempts at reduction were not enough. The unexpectedly urgent work on the seventh edition of the KOMA-Script book was enough to make me stumble. That in this situation I was also confronted with the flaring up of old hostilities, whose deeper cause I suspect lies in the licence discussions around the turn of the millennium, one of the dreadful sideshows that KOMA-Script development unfortunately brought with it, was not exactly helpful.

In any case, from that point on it was clear that there had to be a radical change. I can no longer keep developing KOMA-Script, maintain it, provide support for it and for all kinds of LaTeX questions, get involved in the administration of TeX sites, write DTK articles, keep manuals up to date in two languages, completely revise a book on KOMA-Script every 18–24 months on average, and do a large part of the administration work for komascript.de. And I no longer want to.

As some may know, my real interest lies in development. As explained above, I also feel responsible for helping users. The finest development is of little use if, for lack of documentation and help beyond it, it is never used. The finest development is useless if the bugs it contains prevent its use. The finest development is useless if external developments render it unusable. At the same time, new development always means new bugs and always new need for documentation and assistance. The place where I can change something quite easily with considerable effect is therefore development. Since that is unfortunately my main interest, it took me far too long to accept the logical consequence: the development of new things in KOMA-Script, however nice they may be, however far the ideas for them may already have matured, has to stop!

The recently released KOMA-Script 3.28 therefore marks a fundamental change for me: everything that was planned, started, or already exists in rudimentary form or as a prototype is discontinued as of now. As a rule, there will be no more extensions to KOMA-Script. Cosmetics and improvements of details are possible if they are well justified. Alpha and beta packages and obsolete material will disappear from KOMA-Script. This explicitly includes tocstyle and scrpage2. For komamarks it has already happened. scrlayer-fancyhdr is a shaky candidate. scrhack, as one of the packages for which I constantly have to chase the development of others, is also up for debate. Already with KOMA-Script 3.27 I removed the workaround for titlesec from the classes. With that, a warning that had been spread for years (once again) became reality. Workarounds for other packages may follow. One example worth mentioning is the one for the long obsolete caption2 (of which I do not even know whether it still serves any purpose). Most of these workarounds were only intended as interim solutions, combined with the hope that the package authors in question would eventually support the KOMA-Script classes themselves, which in the case of caption2 did indeed happen in the form of caption3 and caption. Had this happened generally, the use of scrlfile by the classes would incidentally have been superfluous, and I would have been spared one of last year's worst disasters.

Since, thanks to external developments, the compatibility of old documents with new TeX installations can no longer be guaranteed anyway, compatibility settings such as the option version are also under review. Removing them completely would on the one hand reduce the maintenance effort and the risk of overlooking something in future changes. On the other hand, the removal itself carries a certain risk of errors. However, the existing option is increasingly of little use, and would moreover be reduced to absurdity if it were ignored in future changes. A first step could therefore be to remove the documentation of the option and mark it as deprecated. The logical consequence of such a marking, in the next step, is then indeed its removal.

My public involvement in general LaTeX support, which is little known anyway, I will likewise reduce as the mood takes me, at times down to zero, and in any case subject it to constant, critical review. That I often invest more than an hour to list all the errors and inconsistencies I have spotted in a posted, non-minimal piece of code is ultimately wasted time. The majority of the askers concerned ignore these hints anyway, or vanish without a word, and faster than I can deliver my hints.

The goal of all this is to remain able to provide support and maintenance for KOMA-Script. Ideally, some time would even be left over for other things.

So the new year already brings the first changes for me. Further ones lie in the realm of resolutions.

See you soon,
Markus

2019-12-30

Thomas Schramm

Football half-year calendar 2020, January to June, with ics file

The half-year calendar for January to June 2020, A4 landscape, with the fixtures of the 1. Bundesliga, Champions League, Europa League, international matches and the DFB-Pokal, as a PDF with the accompanying LaTeX source. In addition, the ics calendar file for importing into calendar programs, with all fixtures of the 1. Bundesliga, Champions League, Europa League, cup and international matches. For practical reasons the TeX source also includes the fixtures for July to […]

2019-12-15

TUG

G21C conference: Grapholinguistics in the 21st Century

A biennial conference bringing together disciplines concerned with grapholinguistics and, more generally, the study of writing systems and their representation in written communication. Submission deadline: January 13, 2020.

2019-12-11

Typeroom

Print is back! The renowned culture brand to be relaunched in 2020

This December, Print magazine, an online authority on graphic, interactive and brand design and their influences on visual culture, announces new ownership and its continued presence in the graphic arts industry.

Print began publishing in 1940 as Print, A Quarterly Journal of the Graphic Arts led by William Edwin Rudge to demonstrate, in his words, the far-reaching importance of the graphic arts.

Through the decades, Print became an iconic design and visual culture brand, serving as the go-to industry resource for design dialogue and inspiration, design education, profiles of leading design minds and everything in between, via a top-ranking website and social media platforms, The Daily Heller column, and one of the most well-respected design competitions in the industry, the Print Regional Design Awards. Print remains the leading authority on all things design today.

The Print brand and Printmag.com were acquired by Print Holdings LLC, made up of industry veterans Debbie Millman (Design Matters), Steven Heller (The Daily Heller), Andrew Gibbs (Dieline), Jessica Deseo (Dieline) and Laura Des Enfants (D’NA Company).

Together, this team is committed to offering the design industry a hub for inspiration, education, and community, while remaining committed to Print’s original mission of commenting on, critiquing, discussing and documenting the work, thinking and business of design.

The site will continue to publish its legacy content, including up-to-date Daily Heller columns, until 2020, when a new format will launch under the vision and guidance of this ownership team.

Relaunched in 2020, Print aims to continue building a dialogue about design by detailing the intersections of art and culture. “Rather than focusing on the how-to of design, Print’s content covers the why: why the world of design looks the way it does, how it has evolved, and why the way it looks matters. While Print will continue to identify and analyze important trends, history and insights from thought leaders in the industry, the online magazine will begin to include an expanded 21st-century view of design and its contribution to our world,” notes the team.

“While we are thrilled to rescue this piece of design history,” said Debbie Millman, “we are even more excited about the opportunity to reinvent Printmag.com as a reflection of what designers are interested in reading today.”

“The magazine we all call Print has had a half dozen different names since its inception in June of 1940” writes J. J. Sedelmaier. “It was originally a limited-edition periodical that discussed the endless techniques used in the graphic arts industry, and even included original prints and tipped-in features within. From its first edition up to Volume VII, Number 6, in March 1953, it was a 7 1/4-by-10-inch journal-sized publication.”

“William Edwin Rudge was the publication’s original publisher and managing editor. He was the third generation of a family of printers (all named William Edwin Rudge), and he worked out of his publishing business office in New Haven, Connecticut. From the first issue in 1940 through Volume VI, Number 4, the publication was called Print: A Quarterly Journal Of The Graphic Arts. Beginning with the following edition, Volume VII, Number 1, it changed its name to Print after combining its previous title with another magazine called The Print Collector’s Quarterly. Rudge continued as the publisher and managing editor, but the publishing office had moved to Burlington, Vermont, with the editorial offices in Hartsdale, New York. By the spring of 1953, Rudge was the president/editor, and the publishing and editorial offices had moved to 17 West 44th Street in Manhattan.”

Volume 1, Number 1. Cover by Howard Trafton.

An editorial introduction to the journal’s mission (with an interesting mention of television in the footnote)

Explore more of Print's legacy here and reengage with Print here

2019-12-10

Typeroom

Social Poster Exhibition: time for graphic design to teach us kindness

Promoting tolerance among nations through the art of the poster is a mission Typeroom approves of, and Gabriel Benderski is an artivist whose portfolio speaks of the importance of human rights in times of need.

Born into a Jewish home in Montevideo, Uruguay, in 1988, Benderski started his formal training by studying graphic design at ORT University, where he obtained a BA. Prior to going freelance, he worked for five years in design studios, where he discovered his enthusiasm for editorial and brand design; his visual identity for the Campeón del Siglo FIFA Stadium for Club Atlético Peñarol is one of the projects he highlights.

Yet it is not just his talent but his agenda that makes Benderski a creative to follow.

“Recently, I started to focus on the social aspect of graphic design and as a result, the Social Poster Exhibition is part of The Uruguayan Plan for Human Rights Education and it was declared of Educational Interest by the Uruguayan Ministry of Education and Culture” he notes of his latest exhibition.

From New York through Warsaw, Toronto and Ottawa, and in four cities around Uruguay, Benderski demonstrates how graphic design can serve as a tool that promotes tolerance.

Tolerance Poster Show: Mirko Ilić campaigns for humanity with Milton Glaser & more

While the posters are being displayed in embassies and consulates around the world, this man on a mission “analyzes the background of poster design and its social function.”

Human type: Gabriel Benderski's posters are letterpress activism for a better world

“The aim is to transmit to the participants an introduction to graphic design that allows them to use it as a tool to promote tolerance” he adds.

In case you are interested in holding the exhibition, get in touch with Benderski here.

“Thanks to the hand of Max Phillips (Signal Type) an old sketch I made a few years ago is now finished. Infinite Union is to understand that we are all together in this world”

“I was invited by Mirko Ilić to participate in The Tolerance Show, a traveling exhibition that brings together artists from around the world to create posters of Tolerance. To tolerate is to behave as one”

2019-12-06

Typeroom

Graphic Design Festival Scotland 2019: Lamm & Kirch & more winning posters to inspire

Since launching in 2014, Graphic Design Festival Scotland has received more than 38,000 poster submissions from over 100 countries for its International Poster Competition, from established agencies and budding designers alike from all corners of the earth.

This year's International Poster Competition winners are inspiring, as their designs “alter perceptions or ways of thinking, offer creative solutions to problems, contribute to current affairs, open dialogues for debate, provoke discussions and make innovative use of media or medium,” notes GDFS.

The 2019 competition received 8,567 submissions from 86 countries with Lamm & Kirch, Mark Bohle & Nam Huynh and Formes Vives! winning big.

You can visit the International Poster Exhibition at Olympia Gallery, an all-new space dedicated to graphic design, where the exhibition opened on November 22nd, 2019.

The following are the top three winning posters, with the jury's comments as noted on GDFS.

Ruhr Ding: Territorien and Tony Cokes – Mixing Plant by Lamm & Kirch, 3rd place

“The bold factual typography combined with soft, fading colors and blended imagery melts together a unique fusion of industrial communication and creative expression”

The posters were commissioned by Urbane Kuenst Ruhr, an institution for contemporary art, to promote two exhibitions: one, titled Territories, explores territorial definitions and the territorial aspects of establishing an identity; the other presents the work of artist Tony Cokes, which “reflects on capitalism, subjective perceptions, knowledge transfer and (visual) stimuli”. The posters are designed within a wider identity as part of Lamm & Kirch's ongoing work with the institution.

Aiming to reflect the “unstable” identity of the institution which operates in the “decentralised" post-industrial area of Ruhr in West Germany, Lamm & Kirch utilise “a tool kit” of “variable components”; photography, abstract imagery, artist material, formless shapes, graphic symbols and a custom variable font built with Dinamo Typefaces. The elements come together in seemingly haphazard compositions; images layer over each other, lines and shapes intersect and information appears sporadically dotted around in perceptively chaotic compositions.

Bigger and Better: Graphic Design Festival Scotland 2018 in 230 posters!

“The posters are successful on many levels; effective as stand-alone pieces of work, perform seamlessly as part of a wider identity, visually exciting at first glance with various levels of interest up to small details within the typography, an exciting variety of visual elements, all of the elements connect and disconnect with one another in an interesting way, the posters strongly reflect the concept and approach of the institution, exceptionally high quality of finish, typography is solid and the aesthetic is original” notes the jury of this year's competition.

The Big Bang by Mark Bohle and Nam Huynh, 2nd place

The poster was commissioned by community-focused student space ODAS to promote a doors-open day which offered the opportunity to visit the private studios of various creatives.

Through the poster, creative spaces were compared to children’s bedrooms, places “of imagination and pure freedom”. Continuing the idea of child-like imagination, inspiration was drawn from a Joan Miro illustration for the central motif – a star. The area around the star is embellished with a scattering of pop-culture icons, characters, and symbols – comparable to children plastering stickers all over belongings as a way of personalizing or customizing.

Custom 3D typography is paired with a geometric sans at the bottom of the poster to communicate the title and event information.

“The poster is instantly memorable; strange, playful and engaging. The concept comparing the children’s bedrooms to creative studios is original and communicated in a fascinating way. The “stickers”, Joan Miro inspired 3D-rendered toothpaste, thick blobby custom lettering and light grotesk geometric typeface at the bottom are an unforgettable combination of elements and assembled together in a tactile, unexpected way. The poster is highly contemporary but well-grounded and formed. Very enjoyable to look at” comments the jury members of the winning project.

Extra Ball and Golden Ball by Formes Vives!, 1st place

The posters were commissioned by art/craft/DIY design publisher Ultra for public display next to their offices in Brittany, France, and Formes Vives were offered complete creative control.

Formes Vives aimed to "support" the Gilet Jaune, a populist, political movement for economic justice started in France during 2018, and “talk about police repression”. An illustrated pinball machine is combined with typographic cues relating to the "Gilet Jaune” in a tactile frenzy of colors, lines, and forms. Some of the scrawled cues on the posters include “gasoline”, “fire”, “capital gains”, “tax-free”, “power” and “play”. The posters do not explicitly comment but perhaps compare the actions between the Gilet Jaune and the police on the streets of Paris to a game of fatal pinball.

The posters were created using hand-drawn illustrations, layered digital collage and printed digitally before being displayed publicly in Brittany.

“Posters which engage with current affairs or on-going global topics are important, particularly within graphic design where designers have such a strong position to communicate ideas and educate those around them. The posters are totally original, visually engaging, spark curiosity and contribute to on-going political discussion,” notes this year's jury, which included The Rodina from the Czech Republic, Spassky Fischer from France and Warriors Studio from the United Kingdom.

Koos Breen knows how to design a typographic poster for the win

“Formes Vives approach feels human, organic and personal which implies a level of authenticity and expressive emotion which strengthens the message and the connection between the poster and the subject. Posters do not always have to be direct or immediate. A successful poster may not be straight forward and could be open to interpretation. This may be atypical of how we expect “good” design to behave but hopefully, through the competition we can open up new ways of thinking and demonstrate that immediacy and directness are not always an indicator of quality.”

“The posters hit the sweet spot in many ways: they contribute to on-going political discussions, are visually striking, the use of colors and forms in the composition is strong, the tactility of the work is quite unique. The approach of Formes Vives is inimitable with real emotion and craft. The posters demonstrate that the artistic aspect and the craft of poster design are alive and well.”

All 252 selected posters are featured inside the International Poster Book edition. The limited publication of 1,000 copies hits all the right notes, so do make sure to grab your own copy here.

2019-12-05

Typeroom

From the New York Times to Pentagram sky is the limit for multi-awarded Matt Willey

“I was born with severe hearing loss. My parents were told I would never be able to speak properly or go to a normal school. That, more than anything else, shaped my childhood,” says Matt Willey, arguably one of the most influential art directors of the industry and, as announced earlier this week, an official Pentagram partner, in his latest interview with Creative Review.

Born in Bristol, Willey has managed to be the talk of the city that never sleeps as the art director at The New York Times Magazine, a position he has held since 2014, before becoming Pentagram's latest MVP.

Willey studied graphic design at Central St. Martins in London, graduating in 1997. In 2002, he joined Vince Frost at Frost Design London, ultimately rising to the position of creative director and three years later he co-founded the London-based firm Studio 8 Design with Zoë Bather. After the company's closure in 2012 Willey relocated to New York for more adventures in type and design.

“In his editorial design, Willey combines strong typography and photography to create powerful settings for the content at hand” notes Pentagram

Currently the creative director and senior editor of Port, the gentlemen’s quarterly he co-founded with editor Dan Crowe, Willey has art directed the arts magazine Elephant, the travel and culture magazine Avaunt (which he co-founded) and in 2013 he completed the game-changing redesign of UK newspaper The Independent.

A magazine lover since his early days in the industry, Willey felt “for a long time quite ambivalent about the idea of being a graphic designer... I maybe gravitated towards magazines because they offered proximity to other worlds – to writers and editors, to art and photography and film, and illustration, and music and so on,” says Willey to Crowe for WePresent.

“Design is almost always – or should almost always be – just a small part of something else. Usually something bigger and more interesting. I don’t think design is much in and of itself; it’s only interesting when it’s in service to something else. But I felt comfortable with magazines, they made some sort of sense to me” he adds.

Feeling “a small cog in a big and incredibly impressive machine” by the name New York Times, Willey has just swapped one of the industry's most coveted jobs for another.

A type designer as well, Willey has created custom typefaces such as AType, BWord, NewPort, and NSW01, to name a few, for the specific context of the brands, publications, stories and other projects in which they appear.

Willey, obviously a creative with a kind agenda, has made several of the fonts commercially available and proceeds from select fonts are donated to charity. “His MFred and TIMMONS typefaces have helped raise more than £70,000 for Cancer Research UK and Macmillan Cancer Support through the Buy Fonts Save Lives initiative, while all money made from the sale of Blakey Slab goes to the American Civil Liberties Union (ACLU)” notes Pentagram.

Named Designer of the Year by Creative Review in 2014 Willey’s work has been recognized with numerous awards. “During his five-year tenure at The New York Times Magazine, the publication was honored with SPD Magazine of the Year in 2016 (and Finalist for 2017-2019) and SPD Brand of the Year in 2019, and more than 40 gold medals; five D&AD yellow pencils; more than 20 Gold and Silver cubes from the Art Directors Club; and awards from the Creative Review Annual and the Type Directors Club. Avaunt was a finalist for SPD Magazine of the Year and received a D&AD yellow pencil, and Port won SPD Independent Magazine of the Year in 2019.”

Willey’s titles for Phoebe Waller-Bridge’s spy series Killing Eve were nominated for a BAFTA Award and received a Royal Television Society Craft Award, and obviously the sky is the limit for this son of a poet and the latest creative to join Pentagram's New York office.

Explore more of Matt Willey's instinctively visual brilliance here

All images via Matt Willey

PlayStation turns 25: the good, the bad and the glorious logos revealed

On December 3rd, 1994, history happened with a little help from SONY. On that day, the PlayStation launched in Japan and the company entered the video game industry, changing its core almost overnight.

Twenty-five years later, the PlayStation saga continues, and SONY's influential product is undoubtedly one of the world's biggest modern entertainment franchises.

The PlayStation's impact upon our culture is evident, from iconic games such as Final Fantasy to its sartorial influence per The Verge, and that little gray box which started it all came with a logo not to be forgotten.

Conceptualized in 1994 by Manabu Sakamoto, the Japanese designer who had also worked on other SONY logos such as VAIO's, the PlayStation logo was not an easy task to accomplish.

The final logo, which comprises the letter ‘P’ standing over the letter ‘S’, was selected from numerous concepts put forward, as revealed by a compilation of rejected PlayStation logo iterations that recently surfaced on Reddit.

All of the concepts were created by Sakamoto himself who played with variations in color and typefaces in search of the perfect emblem.

Shortened to just two letterforms the logo, strong as the console itself, carried the product to its dominance and is one of the most recognized logos of our times.

Sakamoto also designed the custom typeface used in the PS logo and the font, simple and clean, is part of SONY’s global success story that was launched a quarter of a century ago.

The PlayStation (officially abbreviated as PS and commonly known as the PS1 or by its codename, PSX) was first released on 3 December 1994 in Japan, on 9 September 1995 in North America, on 29 September 1995 in Europe, and on 15 November 1995 in Australia, and was the first of the PlayStation lineup of video game consoles.

Photo by Hello I'm Nik on Unsplash

“Starting from a humble beginning as an upstart within SONY, Ken Kutaragi and team delivered on a vision to elevate video games as a form of entertainment that everyone could enjoy, and to make a platform for game developers to express their creativity,” writes Jim Ryan, President & CEO, SIE.

The original PlayStation sold 100,000 units in Japan on its first day and went on to become the first-ever home console to surpass 100 million units sold globally.

On 19 September 2018, Sony unveiled the PlayStation Classic to mark the 24th anniversary of the original console. This miniature recreation of the original console was released on 3 December 2018, the exact date the console was released in Japan in 1994.

SONY has already confirmed that the PlayStation 5 will launch during the 2020 holiday season because what started with a bang will go volcanic.

Playstation wall with hundreds of signatures at Gamescom 2019, Cologne, Germany, photo by Andreas Heimann on Unsplash

2019-12-04

Typeroom

Remembering Gerard Unger: thanks to Ashler Design every type design of the legend is back online

From Markeur, his first professional type design, made for Joh. Enschedé back in 1972, to Sanserata, his latest type design, from 2016, Gerard Unger’s legacy is alive and kicking in full force again online.

Spain-based Ashler Design, a collaboration between two people, Elena Ramírez, web designer and UX expert, and Octavio Pardo, graphic and type designer, decided to pay a very welcome tribute to Unger, resurrecting his website from oblivion.

“Gerard Unger died on 23 November 2018. His personal website went down a few months later, which I found out while talking about him in a type design course,” explains Ashler of Remembering Gerard Unger.

The website is up and running, almost exactly-ish the way it was. As Ashler puts it, “almost like he was still among us.”

“It was a personal shock, like a sudden realization that he was really gone. Using web archive, we have brought his website back to life as a small homage to a professor that not only inspired several generations of type designers but was also an outstanding human being. Gerry Leonidas describes him beautifully here.”

In Memoriam: Gerard Unger 1942-2018

“To be precise, this is not exactly his original website. We have removed a subtle but charming animation he created in his logo because it was using Flash technology. We also added another slot in his type design collection because Sanserata, his latest release, was missing from the collection. We thought it would be great to have it there as well. The rest is exactly-ish the way it was. Almost like he was still among us.”

A perfect gift to all by Ashler aka one grateful student of a type design legend is live here.

Unger on Markeur, 1972

“My first professional type design was Markeur, for Joh. Enschedé & Zonen, Haarlem. By 1970 Enschedé’s last punchcutter, Henk Drost, had few opportunities to practice his original craft, so the type foundry had looked for a new sort of work: signage, with letters cut into laminated plastic sheets. For this purpose, Sem Hartz had designed Panture (1971), a series of seriffed capitals.”

“Markeur was designed as a sans serif alternative. Usually engraved lettering of this kind is the same thickness all over, like DIN letters. Drost pointed out that if he wanted to produce perfect letters he had to go through the groove twice anyway.”

“If these two tracks were made not to overlap each other exactly but were slightly offset, it was possible to create letters with different thicknesses. The rounded corners are the result of using rotating bits.”

Unger on Sanserata, 2016

“Sanserata's originality does not overtly present itself at text sizes. Rather, at those sizes, it draws upon its enormous x-height, short extenders, and articulated terminals to improve readability, especially on screens.”

“Having articulated terminals means characters flare as they near their end, but readers likely won’t notice. What they would notice is that their ability to take in more content in a line of text is improved because the letter shapes are more defined. Articulation also makes clearer text from digital sources, where rectangular endings tend to get rounded by the emission of light from the screen.”

Gerard Unger's Theory of Type Design is the must-buy book of the season

“Lately there seems a whispered discontent with the lack of progress in the sans serif category. Designs can either stretch too far beyond what is accepted or be too bland to be considered new. Sanserata’s strength is in being vivid and unique without being off-putting. This bodes well for designers of paragraphs and of branding schemes since, with Sanserata’s two flavors, it is well able to capture attention or simply set the tone.”

“Sanserata’s first voice is a generous, friendly, and even cheerful sans serif. But when using the alternate letterforms its voice becomes more businesslike, though still with nice curves, generous proportions, and a pleasant character.”

2019-12-03

Typeroom

FENTY for the win: how Commission Studio branded Rihanna’s urban luxe label

After just six months in business, Rihanna's historic luxury fashion house FENTY has been presented with the Urban Luxe award by the British Fashion Council at this year’s Fashion Awards 2019.

“Contextualising Rihanna isn’t easy – which meant the role of creating the branding for her first fashion house had to be handled by the experts” writes Leanne Cloudsdale. Founded by creative directors David McFarline and Christopher Moorby, Commission Studio’s heavyweight portfolio of luxury clients was enough proof for the synergy to happen.

“The new wordmark was drawn from scratch. The distinctive letter ‘F’ with the strokes overlapping references Rihanna’s own handwriting and the reverse ‘N’ is a legacy nod to the Fenty Beauty logo” said McFarline describing the inspiration behind the design of an entirely new wordmark for the brand.

“We used a special adaptation of the Grilli typeface GT America Compressed Light for the brand typeface with the ‘N’s reversed as part of the standard character set. Everything is clean and modern – specifically designed to work well on small scale devices such as phones and tablets. Across the packaging, the logos were applied with gold foil and a 3D sculpted diamond emboss to give that luxe feel” he added.

“Typically, monograms have always been an integral communication tool for luxury brands and Commission Studio wanted to develop a contemporary version that could be used throughout the Fenty collection. ‘The Maze’ logo was a fresh interpretation of this and includes every letter of the brand name. At first glance, it could be a QR code, electronic circuitry, a Chinese character, or a Greek key – what makes ‘The Maze’ unique is how it manages to be modern and familiar at the same time.”

Alongside the visual identity, London based design and branding consultancy Commission created the complete packaging suite and garment branding for Robyn Rihanna Fenty's awarded fashion house.

Explore more of FENTY’s graphic branding, reflective of the “artistic director's complex style, creative vision, and progressive attitude across many digital and physical applications” here.

All photos via Commission Studio @ Luke Evans and Rihanna's official twitter account

2019-12-02

Typeroom

Nouvelle type: Jean Fouchet’s title design legacy will make you weep

Long before La La Land, there was a French anti-musical which launched Catherine Deneuve’s iconic career and made filmgoers weep for love does not conquer all in Jacques Demy’s movie musical Les parapluies de Cherbourg (The Umbrellas of Cherbourg).

Set in northwest France in the late ’50s, the drama of Geneviève, a lovesick 17-year-old, and Guy, played by Deneuve and the equally charming Nino Castelnuovo, broke boundaries with its aesthetics.

Per Nouvelle Vague pioneer Demy, his Oscar-nominated and Palme d’Or-winning masterpiece “used color like a singing Matisse”, and thanks to restorations pushed forward by Demy’s widow Agnès Varda, The Umbrellas of Cherbourg’s enchanting palette is today just as vivid 55 years after its original release.

The title design of the oh-so-melancholic-it-hurts musical by French New Wave director Jacques Demy, soon to be re-released in the UK and set to Michel Legrand’s poetic and haunting melodies, is a fine example of Jean Fouchet’s typography and talent.

The opening credits, with their bird’s-eye view of pastel bikes and titular umbrellas on a rainy day in Cherbourg, are frequently acclaimed as some of the best of all time. “Brilliantly, title designer Jean Fouchet uses the cobbles of the street as a grid that guides the typography,” notes Yale Daily News of Fouchet’s pop type. Deneuve, Demy, Legrand and Fouchet were reunited in Les Demoiselles de Rochefort, another movie which displays Fouchet’s talent for introducing us to a story through typography and design.

Fouchet was a film titles designer born in France in 1918. One of France’s most well-known title designers during the 1960s and ’70s, Fouchet began his career in film as a set designer, decorator, and assistant operator.

In 1950, he was hired as a designer of special effects and credits for the Lax company and soon he founded his own company, F.L., with which he created title sequences, trailers, and special effects per Art of The Title.

Fouchet designed special effects for films such as The Longest Day (1962), The Train (1964), and Up To His Ears (1965), and his title sequence credits include Last Year at Marienbad (1961), Phaedra (1962), The Umbrellas of Cherbourg (1964), Topkapi (1964), Les Demoiselles de Rochefort (1967) and more.

Following is a tribute to Fouchet’s title design legacy before the brand new tears flood the movie theatres again.

The Umbrellas of Cherbourg is released in the UK on 6 December.

Slider image captions: Phaedra by Jules Dassin (1962), Le Président by Henri Verneuil (1961). All images via Annyas

2019-12-01

Typeroom

World AIDS Day: brutal and humane, the US national AIDS crisis in posters

In 1981, a new disease appeared in the United States. As it spread, fear and confusion pervaded the country. The infectious “rare cancer” bewildered researchers and bred suspicion, but the worry was not the same for everyone. Many feared contact with those who were ill. Others, particularly but not exclusively gay men, feared for their lives and the lives of loved ones.

Reactions to the disease, soon named AIDS (acquired immune deficiency syndrome), were as varied as the uncertainties about it. Early responders cared for the sick, fought homophobia, and promoted new practices to keep people healthy. Scientists and public health officials struggled to understand the disease and how it spread. Politicians remained largely silent until the epidemic became too big to ignore. Activists demanded that people with AIDS be part of the solution.

Taking its title from “Surviving and Thriving,” the 1987 book written by and for people with AIDS which insisted people could live with AIDS, not just die from it, “Surviving and Thriving: AIDS, Politics and Culture” is a traveling exhibit and online adaptation curated by the National Library of Medicine that explores the rise of AIDS in the early 1980s, as well as the medical and social responses to the disease since.

“Posters, comic books, and postcards focused on AIDS are as old as the disease itself,” note the exhibition's curators. “Representing some of the best forms of public health outreach, this ephemera contains images and slogans that communicate a range of ideas about AIDS—from how to prevent it from spreading, to how to care for people with AIDS, to how to talk to children about the disease.”

“Inexpensively printed and distributed, these colorful materials performed a great deal of social and political work over the course of the 1980s and 1990s. Whether originally wheat-pasted on bus stops or the sides of buildings, hung in municipal office spaces, or placed in the waiting rooms of doctors’ offices, these documents provide powerful historical evidence about how people did and did not confront HIV/AIDS.”

Safe sex is hot sex, 1991. Developed in 1990 by the Red Hot Organization, a leading international initiative dedicated to fighting AIDS through popular culture, this campaign featured diverse people locked in intimate poses. The posters combined text and visuals to normalize and eroticize safe sex. Both provocative and instructional, the carefully positioned subjects aimed to encourage viewers to change their sexual practices. The voyeuristic presentation worked in conjunction with the message: sex can be enjoyable and safe for all couples regardless of sexual orientation.

Stop worrying about how you won’t get AIDS and worry about how you can, 1980s. Misinformation ran rampant through much of the 1980s as people struggled to interpret the new information about risk behaviors and safe practices. By providing the beginnings of a list of safe practices, this poster from America Responds to AIDS helped spread accurate information and dispel some of the stigma and fear associated with people with AIDS.

“Please Be Safe” by the Northwest AIDS Foundation. In 1987, with funding from the U.S. Conference of Mayors, the Seattle-based Northwest AIDS Foundation launched the “Please Be Safe” campaign to help gay and bisexual men reimagine their sexual behaviors.  Using a different creative visual strategy than the sexually charged imagery of some contemporaneous public health efforts, this campaign used road signs—a straightforward, familiar set of symbols—to discuss and advertise sexual safety. The “Please be Safe” or “Rules of the Road” campaign used road signs and compelling, straightforward, community-specific language to help gay men engage in safer sex. The campaign sought to establish these practices as the new norm for all. The “Sexual Safety Card” featured on many of the posters provided quick and accessible information on activities at every level of safety.

The more you score, the greater your chances of losing the game, 1980s. This fear-mongering poster offered a simple solution for preventing AIDS: Just say no. Advice like this ignored the complexity of human behavior and, as such, missed the opportunity to educate people on realistic, alternative strategies.

Some people think you can catch AIDS from a glass, you can’t, 1980s by the California Medical Association. Evidence that AIDS was spread by the exchange of “bodily fluids” provided many opportunities for misunderstanding. Even after researchers proved that saliva could not transmit AIDS, the fear of used drinking glasses, shared eating utensils and kissing persisted well into the 1980s.

Perform a death-defying act, 1987 by the Oregon Health Division. This huge headline communicated a simple solution. By depicting condoms as a common, easy-to-use solution, this poster from the Cascade AIDS Project in Portland made protection approachable.

AIDS is a white man’s disease, famous last words, 1980s by People of Color Against AIDS. Directed to the black community, this poster used straightforward language to debunk the all-too-widespread idea that AIDS was a gay, white disease.

You Won’t Get AIDS From. With misinformation too readily available, the You Won’t Get AIDS From campaign from America Responds to AIDS attempted to reach a wide audience with neutral images and straightforward information on ways people could not “catch” AIDS.

From the “Safe Sex is Hot Sex” campaign, dubbed too hot to handle during the Reagan era, to the “America Responds to AIDS” campaign, whose “everyone is at risk” message generalized the epidemic beyond the core communities that suffered the most, to the ominous fear-mongering posters that declared the dangers of “sleeping around” in big bold type while providing little, if any, information on how to prevent the spread of the epidemic, the exhibition presents the US national AIDS crisis in pictures.

Explore more here

Slider images captions:

• ACT UP is watching: Photo from one of thousands of demonstrations nationwide, reminding officials that activists were watching. ACT UP (The AIDS Coalition to Unleash Power) was founded in March 1987. As of 2012, ACT UP chapters in nearly every major city continued to champion rights for people with AIDS.
• Ignorance = fear, silence = death, 1989: In the 1980s and 90s, artist Keith Haring’s widely recognized figures championed AIDS education and compassion, then a new cause. Today, Haring’s foundation continues to support AIDS organizations nationwide, including AIDS Project Los Angeles, the Elizabeth Glaser Pediatric AIDS Foundation, and Gay Men’s Health Crisis.
• Read my lips, 1988: Gran Fury, an artists’ collective within ACT UP (The AIDS Coalition to Unleash Power), created iconic materials, including postcards, to spread information and promote education about HIV and AIDS. The mantra featured here subverts President George H. W. Bush’s notable quip about no new taxes, delivered during the 1988 Republican National Convention, to call attention to his ambivalent support for AIDS outreach, education, research, and support.

2019-11-28

Typeroom

2024 Paris Olympic Games: Graphéine's visual identity is better than yours


Almost a month ago Paris unveiled a gold-medal-shaped emblem for the 2024 Olympic and Paralympic Games. According to the official press release, “the emblem embraces the shape and color of the most beautiful medal of all to express one of the core values of sport: striving for excellence. That same commitment also informs every step that Paris 2024 is taking in organizing the Olympic and Paralympic Games Paris 2024, so that it can fulfill the pledges it has made to stage a different, grounded, sustainable and inclusive Games.”

Eventually, the Twitter-verse had its own take on the Paris 2024 Olympic logo, which was mercilessly mocked, even though the gold design (apart from representing gold medals), the variable custom-designed typeface and the Art Deco styling are a nod to the last time Paris hosted the Games, in 1924.

"Its pure, understated lines and its original typeface take their inspiration from Art Deco, the first complete artistic movement, which reached its height at the 1924 Games in Paris," says a statement announcing the logo which per Creative Bloq's Rosie Hilder “looks like either a) the Tinder app logo, aka a sexy flame b) a condom packet or c) a hair salon logo. There's nothing about it that makes us want to jump off the sofa/out of bed/away from our screen to do the javelin or the long jump or pursue some other Olympic dream.”

After all the buzz, another visual identity emerged, with rave reviews from many, even though it was not selected. Graphéine's team studied the question of the visual identity of the 2024 Paris Olympic Games, and its answer is presented in the agency's blog.

“Imagine an impossible timetable (only 3 weeks to send a response...) and all the conditions that we usually denounce (no financial compensation for submitting a project). Nevertheless... it's the Olympic Games! What's more, in Paris, our capital city! So we offered a creative sprint to our teams: 48 hours to find an idea. Here is the project we have collectively imagined,” writes Graphéine.

“For a branding project to be successful, the logo must be based on an idea that is sufficiently 'fertile' to inspire the multitude of necessary variations. Indeed, it is in the polysemy of a sign that rich variations can sprout. Thus some signs can easily be imagined in movement, volume, space... while others will seem 'static and uninspiring'. In our opinion, this visual polysemy is a guarantee of sustainability. Indeed, the strong repetition of a sign inevitably causes a form of habit, even weariness. A sign that has several levels of reading will be much more resistant to time.”

“Baudelaire used to define modernity as the meeting of fashion ("mode" in French) and eternity. It is precisely in the conjunction of these two temporal notions that a logo must be found. On the one hand, it must be in tune with the times to make itself desirable in the eyes of its contemporaries. On the other hand, it is destined to go down in the annals of sport and of Paris, and it will continue to live for many years after the end of the Games,” notes the agency of its proposal.

Building the logo around the concept of an Eiffel Tower sports track, the proposal is sporty, with “fluid curves, fast lines. A sports track emerges before our eyes, telling the story of movement and speed.”

Solidly placed on the ground, a 3D Eiffel Tower emerges, and “the magic of this sign comes from its possible double reading. In turn sports track and Eiffel Tower. The eye seems not to want to choose.”

This is a logo “that tells a story of encounter and convergence. It is a profoundly humanistic and optimistic sign that offers a perspective of collective progress.”

The agency's open invitation to the world, with five colors for the five continents, echoes “the IOC motto with the symbol of the Eiffel Tower. It affirms the audacity to go further, thus illustrating the idea of surpassing oneself.”

“For a branding project to be successful, the logo must be based on an idea that is sufficiently 'fertile' to inspire the multitude of necessary variations”

“The two logos are from the same shape,” notes the agency of its two logos for a shared vision.

”A simple change of angle of view makes it possible to switch from the logo of the Olympic Games to that of the Paralympic Games. This symbolic rapprochement between the two logos is a strong gesture that can act as a bridge between the world of able-bodied athletes and that of athletes with disabilities.”

“This project was not selected... But, as Pierre de Coubertin said, the most important thing is to participate!”

”A simple change of angle of view makes it possible to switch from the logo of the Olympic Games to that of the Paralympic Games. This symbolic rapprochement between the two logos is a strong gesture that can act as a bridge between the world of able-bodied athletes and that of athletes with disabilities.”

Check more here

All images via Graphéine

The first LaTeX prerelease for 2020-02-02 is available for testing

A few days ago we submitted a new LaTeX development format to CTAN, and by now it should be available to all users of MiKTeX or TeX Live (on any operating system).

This format allows you to test the upcoming LaTeX release scheduled for 2020-02-02 with your documents or packages. Such testing is particularly important for package maintainers to verify that changes to the core LaTeX haven’t introduced incompatibilities with existing code. We try to identify any such problem beforehand, but such an undertaking is necessarily incomplete, which is why we ask for user testing.

Besides developers, we also ask ordinary users to try out the new release candidate: the more people test the new format, the higher the chances that any hidden problems are identified before the final release in February hits the streets.

Processing your documents with the prerelease is straightforward. All you have to do is append -dev to the name of your usual executable, e.g., on the command line you would run

pdflatex-dev myfile    or    lualatex-dev myfile    or    xelatex-dev myfile


instead of using pdflatex, lualatex or xelatex. If you use an integrated editing environment, how to configure it to use an alternative format depends on the system, but in any case the necessary modification should be straightforward.
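To verify which format a run actually used, a minimal test document is enough; \fmtversion expands to the format date, so compiling the sketch below with pdflatex-dev should show the prerelease date (the exact date string depends on the installed prerelease):

```latex
% Minimal check document; \fmtversion is the date of the LaTeX format.
\documentclass{article}
\begin{document}
This document was compiled with the \LaTeX{} format
dated \fmtversion.
\end{document}
```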

Main features of 2020-02-02 prerelease 2

For technical reasons this first prerelease is labeled “prerelease 2”. Prerelease 1 existed only for a few hours on CTAN, as it was missing some files due to a problem in the build process, but because it was already on CTAN we had to increase the prerelease number.

A new LuaTeX engine coming up …

In TeX Live 2020 the LuaLaTeX format will always use the new LuaHBTeX engine, which is LuaTeX with an embedded HarfBuzz library. HarfBuzz can be used by setting a suitable renderer in the font declaration; an interface for that is provided by the fontspec package. This additional font renderer will greatly improve the shaping of various scripts, which are currently handled correctly only by XeTeX.
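As a sketch of how the renderer is selected through fontspec (the font name here is a placeholder for any OpenType font installed on your system, and the Renderer key only takes effect when compiling with lualatex-dev on the LuaHBTeX engine):

```latex
\documentclass{article}
\usepackage{fontspec}
% Route shaping through the embedded HarfBuzz library;
% "FreeSerif" is a placeholder for any installed OpenType font.
\setmainfont{FreeSerif}[Renderer=Harfbuzz]
\begin{document}
Text shaped by HarfBuzz.
\end{document}
```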

To simplify testing of the new engine, the necessary executables have been added to MiKTeX and TeX Live 2019, and both have changed the LuaLaTeX-dev format to use it. This means you can already test the new engine by using the prerelease!

Even if you have no need for the new HarfBuzz functionality, it might be worthwhile running some tests, because from 2020 onwards this will be the only LuaTeX engine for which a LaTeX format is distributed in TeX Live and MiKTeX.

Improved load-times for expl3

The LaTeX3 programming layer, expl3, has over the past decade moved from being largely experimental to broadly stable. It is now used in a significant number of third-party packages, either explicitly or implicitly (e.g., by loading xparse to define user command interfaces), so many documents load the expl3 code at some point in the preamble. Most LaTeX documents compiled with XeTeX or LuaTeX load fontspec, which is written using expl3, so with these engines it is nearly always used.

The expl3 layer contains a non-trivial number of macros, and when used with the XeTeX and LuaTeX engines, it loads a large body of Unicode data. This means that even on a fast computer, there is a relatively large load time penalty whenever expl3 is needed.

For this release, the team have made adjustments in the LaTeX 2e kernel to pre-load a significant portion of expl3 when the format is built. This is transparent at the user level, other than the significant decrease in document processing time: there will be no “pause” for loading Unicode data files. Loading of expl3 in documents and packages can be done as usual; it will, at some future time, become possible to omit

\RequirePackage{expl3}


entirely, but, to support older formats, this explicit loading is at present still recommended.
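For illustration, a small example of the programming layer in action; with expl3 pre-loaded in the format, the \ExplSyntaxOn switch is all that is strictly needed, though the explicit package load keeps the document working on older formats (the command names \my_twice:n and \twice are made up for this example, not kernel commands):

```latex
\documentclass{article}
\usepackage{expl3} % still recommended, to support older formats
\ExplSyntaxOn
% Define an L3-layer macro and a document-level alias for it.
% Within ExplSyntaxOn spaces are ignored, so ~ marks a real space.
\cs_new:Npn \my_twice:n #1 { #1 ~ #1 }
\cs_new_eq:NN \twice \my_twice:n
\ExplSyntaxOff
\begin{document}
\twice{hello}
\end{document}
```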

Please test now — even though there is more to come

We ask you to test now, because (unless we have made some mistakes) the above changes should be transparent to the user, other than speeding up the loading process and allowing LuaLaTeX to handle complicated scripts that previously would not work correctly.

We expect to distribute another prerelease in early January, which will most likely contain improvements to the font support for all engines.

The prereleases will never go stale in the future (we hope)

Starting with this prerelease we will make sure that the prerelease in the distributions either matches the main release code or is newer than the current main release. This was not the case in the last few weeks, when the main release was 2019-10-01 with a few hot fixes added, but the -dev format was still from a date prior to October.

In the future we intend to ensure that this does not happen, so that, to avoid confusion, the distributed development release is always the same as or newer than the current main release.

More details on prereleases please …

More details and some background information about these concepts and the process behind the development formats are available in a TUGboat article:

The LaTeX release workflow and the LaTeX dev formats

• Frank Mittelbach
• TUGboat 40:2, 2019
• Abstract

How do you prevent creating banana software (i.e., software that gets ripe at the customer site)? By proper testing! But this is anything but easy.

The paper will give an overview of the efforts made by the LaTeX Project Team over the years to provide high-quality software and explains the changes that we have made this summer to improve the situation further.

Enjoy — Frank


2019-11-26

Typeroom

Typography as raw material: République Studio on their AJAP 2018 visual fest

Simple is not boring, at least not in the portfolio of one of our favorite design studios made in France, aka République Studio, who infused their modernity into AJAP 2018.

“AJAP” (Albums des Jeunes Architectes et Paysagistes) is a biennial competition organized by the French Ministry of Culture. It distinguishes young European architects and landscape designers under 35 who have realized a project or participated in a competition in France.

The exhibition presents the work of the twenty winners making up the 2018 edition: fifteen teams of architects and five teams of landscape designers.

République Studio, a team of creatives “often inspired by the zeitgeist,” did all the design, catalogs, and signage for AJAP 2018, which will be exhibited in all the French Institutes around the world for the next couple of years.

Typeroom's Loukas Karnis caught up with the award-winning creative direction and graphic design practice based in Paris to discover how type makes the life of creatives a place of wonders and versatility.

How did you approach the design of AJAP 2018?

As the exhibition is traveling all around the world, we discussed with the scenographer which kind of printed matter would best showcase the architects and their works.

It had to be lightweight, easy to build, easy to print, in short, easy to travel. Gaspard came up with the idea of a wooden module, easy to assemble, that could carry two posters on its board.

The traveling exhibit is bilingual: French and the native language of the country where it is shown. We figured that the posters would display only big images and that the explanatory texts would be shown in small booklets on the board, so each visitor could take a booklet with them and bring a small part of the exhibit home.

As for the design, we wanted something that would work well with the wooden modules, unique and light.

Which are the most important elements in AJAP 2018’s visual language?

Type was definitely the most important part. The typeface Past Perfect, used for titles, emphasizes the wooden modules. The letters look like wooden sticks and give the whole project an “architectural and playful vibe”.

Plus, it contrasts with Basis Grotesque, used for body text, which is more shapely and more comfortable for the reader, making it easier to focus on the content.

Which was the most difficult part of this project?

Keeping things simple without being boring.

How long have you been working on it?

It took us around 3 weeks to design all the booklets, catalog, signage and make the layout of the posters.

How important is type in creating a unique voice for the branding of a project?

At République Studio every work begins with a typeface. We think a typeface is a big part of the identity of a project because it can provoke different emotions. That’s why you really have to choose your fonts well.

We always want to find the typefaces that will tell the right story and speak the same language as the project itself.

A typeface, and the way you use it, has real power over the identity of a brand. Looking at a typeface can remind you of a brand, or a movie poster, or an album, etc. That’s why, when you start working with a new client, it’s important to choose a typeface that doesn’t already have a recognizable history.

The typeface has a real impact on your memory whether you want it or not. That’s why we try not to use the same typeface on many projects.

Would you consider bespoke typefaces the most important factor to express attitude though letterforms?

Typography is not just about choosing a typeface; what is most important is how you use it, where, for which audience, at what size, in which color, etc. If you choose a well-known typeface but use it in a unique and intelligent way, then your message can be just as bold.

What is République Studio’s motto to live and work by?

Don’t overuse the same typefaces, because you don’t want the whole world to look the same! Or at least, change the way you are using them.

We don’t want to see the same identity everywhere. Diversity makes our world beautiful!

What are you working on now?

We are working on signage for museums, which is really exciting because we love to see big type in action. Also websites, branding and poster/communication for exhibitions.

How would you describe the French typographic scene? What's your take on it?

There are many good type designers in France! Too many to name all of them. Sure thing is that the type game is global today, and you can find good designers all around the world.

We are living in a liberating era where type design became very accessible and talented people can very easily make their point and live from their art. Typography is our raw material and type designers our best friends :)

“The typeface has a real impact on your memory whether you want it or not.”

All images via République Studio. Set design by Gaspard Pinta / Photos by Julien Lelièvre

2019-11-25

Typeroom

It stops now! Design for social good rules in Piquant's anti-violence campaign

“Design can inform, educate and promote social change,” notes Piquant Media of its “It Stops Now” campaign, which aims to end sexual violence and harassment in third-level institutions.

Piquant Media worked with the National Women’s Council of Ireland (NWCI), the country's leading women’s membership organization seeking equality for women, on the branding, video, web design and development of “It Stops Now”, an international campaign which aims to raise awareness of the prevalence of gender-based harassment and violence in third-level education and to engage both staff and students in combatting all forms of harassment and violence against women students throughout Europe.

NWCI led the campaign with project partners in Cyprus, Lithuania, Scotland and Germany. Each project partner worked with higher education institutions, statutory agencies and NGOs.

Piquant Media worked with the team at NWCI to create a brand identity and project content that would resonate with students and staff at third level institutions around Europe.

“This brand had to have the ability to capture people’s attention and provoke a response. The goal of the project is to generate awareness but also connect with the third-level community to show them that they have the power to make the change” notes Piquant.

“As graphic designers, we should be aware of the responsibility we have in our community, as we don’t only design nice and attractive shapes; we are social communicators, no matter what type of client we’re working for,” writes Victoria Brunetta, the studio's graphic designer, of the task of designing for good.

“How we develop and visualize a communication strategy will eventually impact on people’s perception. For better or worse, graphic designers have become key actors in the consumption chain development.”

“I hold the opinion that our mission when visualizing a social movement is also to bring in to question the perceptions most people have around issues such as feminism, homelessness, immigration, poverty, racism, education, environment… In other words, how can we challenge dominating stereotypes that harm the dignity of the most vulnerable and communicate a clear, powerful message that encourages people to take action? Or, eventually, how can that message give the audience a different perspective on community development and social change? In my eyes, ultimately, the challenge when trying to visualize a social movement or any human rights-related issue, is how can I, as a graphic designer, help people to live in a more equal society?”

“How we develop and visualize a communication strategy will eventually impact on people’s perception. For better or worse, graphic designers have become key actors in the consumption chain development.”

Check out the It Stops Now project here.

Beazley Designs of the Year 2019: Artificial Intelligence, stencil & more winners

Drum-roll, please! The overall winner of Beazley Designs of the Year is “Anatomy of an AI System” and Artificial Intelligence is under the microscope for this groundbreaking “anatomical case study of the Amazon Echo as an artificial intelligence system made of human labour, data and planetary resources.”

“You will never look at your smart home hub the same way again,” said chair of the judges and Royal College of Art vice-chancellor Paul Thompson of the project.

“The consensus among climate scientists is that human activity is the root cause of an ongoing planetary crisis. The way in which everyday decisions and the devices we buy can add to this issue is sometimes difficult to comprehend. Taking a consumer’s conversation with Alexa, Amazon’s voice-activated assistant, as its starting point, designers and researchers Kate Crawford and Vladan Joler created a map and essay to represent the impact of the creation, use and disposal of just one of the many Amazon Echo units that have been purchased to date.”

Whilst “Anatomy of an AI System” charts the birth, life and death of a voice assistant and its impact on our planet, Pentagram's Sascha Lobe and his team won the Graphics award of Beazley Designs of the Year for their “clever way-finding system that uses pictograms,” designed for Amorepacific, one of the world's largest cosmetic companies, operating over thirty health, beauty, and personal care brands including Etude House, Innisfree and Laneige.

Sascha Lobe and his team have designed the architectural branding, environmental graphics, and signage for Amorepacific's new corporate headquarters in Seoul, South Korea.

Lobe's team worked closely with David Chipperfield Architects, who designed the building. Reflecting the headquarters’ unique geometry, the team devised a matrix that uses Hangul-inspired pictograms to orientate visitors, notes Pentagram.

“Taking the form of a grid divided into nine sub-squares, level position is communicated through the middle square, with the framing eight squares providing information relative to position, as defined by the ‘corner number’ and ‘geographic location’ hierarchies.”

“Lobe and his team created a Hangul-inspired Latin typeface specifically for the Amorepacific Headquarters. As with the numerals and pictograms, the pared-back, stencilled font is constructed using the Hangul method, resulting in a balanced and consistent visual language”

“English, Chinese and Korean are all used throughout the building and presented a unique challenge for the designers. These systems each possess different complexities, and while Korean and Chinese scripts have some formal relation to one another, it is difficult to achieve a satisfactory integration of Latin characters with a line thickness and font size of similar appearance.”

“Since no typeface existed that would fulfill these criteria completely, Lobe and his team created a Hangul-inspired Latin typeface specifically for the Amorepacific Headquarters. As with the numerals and pictograms, the pared-back, stencilled font is constructed using the Hangul method, resulting in a balanced and consistent visual language throughout the space.”

A total of 76 innovative designs including clothing, buildings, apps, books, homewares and posters that have made a strong global impact were nominated for this year's award, which is given by London's Design Museum.

Following are all the winners of this year's competition.

Beazley Design of the Year Overall Winner: Anatomy of an AI System by Kate Crawford and Vladan Joler

Beazley Architecture Design of the Year: Maya Somaiya Library

Beazley Graphic Design of the Year: Amorepacific architectural branding by Pentagram and David Chipperfield Architects

Beazley Product Design of the Year: CATCH by Hans Ramzan

Beazley Fashion Design of the Year: Adidas Originals by Ji Won Choi

Beazley Transport Design of the Year: GACHA Self-driving Shuttle Bus by MUJI and Sensible 4

Beazley Digital Design of the Year: Anatomy of an AI System by Kate Crawford and Vladan Joler

Beazley Design of the Year - People's Choice: MySleeve by Marie Van den Broeck

Explore more Beazley Designs of the Year 2019 here.

2019-11-24

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Creating calendars with tikz-kalender

Here is an example of how to create year calendars with tikz-kalender. Calendar events must be stored in an external file; in this example that file is meineevents.events.

The color names for the individual elements can be looked up at https://www.sciencetronics.com/greenphotons/wp-content/uploads/2016/10/xcolor_names.pdf.
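The setup below mixes base xcolor names with names from the x11names list (Purple0, Gray0). As a minimal, self-contained sketch for previewing such color names before using them in the calendar (LavenderBlush3 is just one more name from the same x11names list):

```latex
% Minimal preview of x11names colors with xcolor.
% Purple0 and Gray0 are used in the calendar setup below;
% LavenderBlush3 is simply another name from the x11names list.
\documentclass{article}
\usepackage[x11names]{xcolor}
\begin{document}
\textcolor{Purple0}{Purple0} \quad
\textcolor{Gray0}{Gray0} \quad
\textcolor{LavenderBlush3}{LavenderBlush3}
\end{document}
```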

\documentclass{tikz-kalender}

\setup{%
lang=german,
year=2020,
showweeknumbers=true,
title={Urlaub},
xcoloroptions={x11names},
titleColor=cyan,
eventColor=brown,
periodColor=lime,
monthBGcolor=red,
monthColor=Purple0,
workdayColor=yellow,
saturdayColor=magenta,
sundayColor=orange,
events={meineevents} % include the events file
}

\begin{document}
\makeKalender
\end{document}


Here are the contents of meineevents.events:

\event{\year-10-09}{John Lennon (1940)}
\event{2020-10-03}{Tag d. dt. Einheit}
\event*{2020-04-12}{Ostersonntag}
\period{2020-02-01}{2020-02-06}[color=Gray0,name={Urlaub}];


Uwe

Uwe Ziegenhagen likes LaTeX and Python, sometimes even combined. Do you like my content and would like to thank me for it? Consider making a small donation to my local fablab, the Dingfabrik Köln. Details on how to donate can be found here: Spenden für die Dingfabrik.

2019-11-15

Typeroom

It's all in the A! The Atlantic is stunningly redesigned with a bespoke typeface & a monogram

Led by creative director Peter Mendelsund and senior art director Oliver Munday, the well-respected book-publishing design team that came on board full-time almost a year ago, The Atlantic unveiled a new visual identity with its December issue, complete with a new logo, custom typeface, updated website, and iOS app.

“It is the most dramatic new look for our magazine in its 162-year history, and one that, we hope, reflects boldness, elegance, and urgency,” writes Jeffrey Goldberg, the editor in chief of The Atlantic, author and a recipient of the National Magazine Award for Reporting.

“The interesting thing to me about the first cover in 1857 is how clear the hierarchy of information is. The forthrightness, the omitting of needless information, the seriousness of purpose and mission—I would say those are all components of the design that represent what The Atlantic, as an institution, does well,” notes Mendelsund of his team's effort to redesign the historic title.

“The most notable change in this redesign is the new nameplate, the move to the A as representative of the whole,” adds Goldberg of the wordmark, aka “an emblem—a logo.”

The all-new bespoke typeface, Atlantic Condensed, is based on the type forms that the founders chose for the first issue, so the font speaks of the magazine's legacy as well.

“We started with a condensed capital A that pointed toward an old version of The Atlantic logo drawn by Boston Type Foundry in the mid-19th century,” Munday explains to AdAge. With some adjustments for something “that felt weightier or slightly more bespoke,” the design team added a notch to the top of the A, adjusted the feet and then hired typographer Jeremy Mickel for final refinement.

Eventually, the process led to the creation of a new custom condensed typeface, a full alphabet based on the original Boston Type Foundry sample used for the A.

This serif typeface, also known as a “Scotch” face (a term describing the way the serifs are designed), is “an extremely legible, classical kind of typography, but also transmits a certain kind of vehemence and urgency that works nicely for our contemporary purposes.”

Atlantic Condensed, the bespoke and extremely legible serif typeface, aims to transmit a certain kind of vehemence and urgency that works nicely for The Atlantic's contemporary purposes

Per Mendelsund the new design had to be “readerly” and “to feel confident” without “clamoring for your attention in too many ways.” The task has been accomplished per the designers in a number of different ways.

“One is through good grids, making sure that the page itself has a rigorous, almost Euclidean logic to the way it’s laid out. Another is by ensuring that the type is interrupted as little as possible and that when it is interrupted by imagery, the imagery is contained within its own cordoned-off space.”

With the single-letter logo being the most striking aspect of this year's redesign (the magazine’s last full redesign happened in 2009), the A is set in roman, or normal, type. The upright letter conveys more authority, Mendelsund says. “Italics are typically not meant to work as a main graphical component but rather call attention in a body of text that is upright.”

Accompanied by a whole ecosystem of “engraved” nautical emblems serving as visual elements, the Atlantic's latest redesign is by far one of the most daring and visually pleasing rebranding projects of the year for a title launched in the autumn of 1857 by Boston publisher Moses Dresser Phillips.

The first issue of The Atlantic Monthly was published in November 1857 and quickly gained fame as one of the finest magazines in the English-speaking world. The magazine, which billed itself as a "journal of literature, politics, science, and the arts," was an immediate success and within two years the circulation of the magazine had risen above 30,000.

A hundred and sixty years later, on July 28, 2017, The Atlantic announced that multi-billionaire investor and philanthropist Laurene Powell Jobs (the widow of former Apple Inc. chairman and CEO Steve Jobs) had acquired majority ownership through her Emerson Collective organization. Now the magazine, subscribed to by over 500,000 readers, publishes ten times a year. Obviously the saga continues with an A.

Discover more here.

All images via The Atlantic

2019-11-11

Typeroom

Rüdiger Schlömer will teach us how to type knit our lives for good

Swiss designer Rüdiger Schlömer is the kind of creative you can count on even under the worst and most extreme cold weather conditions. After all, he is a man who knows how to infuse type into knitting.

Schlömer's book, Pixel, Patch und Pattern: Typeknitting (published by Verlag Hermann Schmidt), was recently awarded the «Certificate of Typographic Excellence» by the Type Directors Club, is currently on display in The World’s Best Typography exhibition (TDC65) and, as Pentagram's Eddie Opara notes, is filled with surprises.

Innovative, insightful and playful, the book (which has also been nominated for the Design Prize Switzerland 2019/20 for «Communication Design») challenges readers to learn to knit a variety of typefaces modeled on digital designs by well-known type foundries including Emigre, Lineto, Parachute, and Typotheque, and to emblazon one's hats, scarves, and sweaters with smartly designed monograms, letters, or words.

“At first glance, the cover is extremely unassuming and without further inspection you would walk right past it. On closer examination, its apparently pixelated title, in all caps, wears a distorted, woven-textured effect with its added hybrid brew of Anglo-German: Pixel, Patch und Pattern. As your eyes peer down to the bottom of the cover, innocuously set in Futura, Typeknitting leaves you intrigued and you are overcome with curiosity,” notes TDC Communication Design judge Opara of the publication, released last winter by Verlag Hermann Schmidt.

“When opening the book, behold the incredible surprises that wait! You are taken into an alternative culture of typography that is knitted! The different stitching techniques are endless and transformative. The open, playful, thought-provoking, effortless, enterprising, and dynamic qualities are profound. You relish the fact that every design is made by hand. The end results are awe-inspiring. It brings life to modularity and systems. After viewing this book, you have to ask yourself, why do you sit at your computer, hour after hour? Are you really making something that is tangible?” he adds.

Typeroom's Loukas Karnis asked Schlömer some questions between his own type knitting lessons now that winter marches on in Europe.

So please introduce yourself to us.

I grew up in Paris, France, and Bremen, Germany and studied Visual Communication in Aachen and Art in Context in Berlin. In my studies, I became interested in experimental communicative formats that combined analog and digital principles, like hacking, reverse engineering or programming.

I find that many of these "new" principles can be found in older media or techniques, in music notation or textiles. I have been living in Zurich, Switzerland, since 2011, where I work mostly on exhibition design, books, and self-initiated projects.

How did you come up with the idea of mixing type with knitting?

For the occasion of the soccer World Cup in Germany, I developed a “Fan-Scarf-Remix” web tool together with my friend, designer and programmer Jan Lindenberg.

On the project website, you could remix existing fan scarves into individual messages and then download your remix as a knitting pattern. It was a playful, deconstructive answer (a hack) to the idea of national identity and its instrumentalization.

I started the project without any practical knitting skills, thinking that the knitting part would be just like printing out the final result ... which was very wrong. Suddenly I discovered the multitude of personal and regional knitting techniques and got interested in their practical exploration. Eventually I started a private knitting circle in my flat in Berlin with a couple of friends.

Which was the first knitted typographic product you came up with?

The first machine-knit products I developed were remix fan scarves, which I released in small editions. There are big differences between hand and machine knitting.

Hand knitting is very accessible, you just need two needles and some yarn. Machine knitting allows larger editions, but it needs a technical infrastructure and brings you back to the computer. I like to keep switching between both, analog and digital.

“My Remix Fan Scarf editions are made of low-quality JPEGs found on the Internet. I started with soccer scarves only; now they contain letters of various sources and contexts. Each edition comes with an open-source pattern”

You have called your project “generative typography”. Please elaborate on this genre.

When looking at all the different hand knitting techniques, I see a lot of parallels to digital tools, layout programs or plugins. “Patchwork Knitting” for example, is based on geometric patches as basic elements. Instead of following a pixel-by-pixel knitting pattern, Patchwork Knitting gives you a basic grid, which you can freely combine into color and pattern combinations.

It's basically a modular, generative approach to pattern design, just very slow, because you knit it by hand. And since it's based on patches, it's perfect for knitting modular typography.

“Patchwork knitting is basically a modular, generative approach to pattern design, just very slow, because you knit it by hand. And since it's based on patches, it's perfect for knitting modular typography”

What is your book "Pixel, Patch und Pattern – Typeknitting" all about?

My book shows typographers how to knit letters, and knitters how to include typography. It's a systematic introduction to how different types of letters can be constructed using different hand knitting techniques.

The four main chapters, PIXEL, PATTERN, PATCH, and MODULE go from intarsia knitting to slip-stitch patterns to patchwork knitting. These approaches are shown through knitted prototypes with typefaces from type designers like Andrea Tinnes or Christian Schmalohr, and type foundries like Emigre, Lineto, Nouvelle Noire, Parachute and Typotheque. It also contains instructions for some specific knitting projects, like pullovers, Selbu mittens or cushions.

How did you decide to collaborate with typographers on this project of yours?

When developing the book, I wanted to show "Typeknitting" as an open practice, not a collection of finished results. I saw my role as a mediator and translator between knitting and typography, two fields I know well enough to initiate this dialogue, without being a hardcore specialist in either.

As Typeknitting is a lot about communication through knitting and letters, the very process of it becomes communicative as well. The exchange with the contributing knitters and typographers was and still is essential in this process.

In most cases, I started with the different knitting techniques, and their structural, constructive characteristics. From there I looked for typefaces that would suit these characteristics and then I knitted prototypes to illustrate the different principles. TypeJockey by Andrea Tinnes (Typecuts) turned this process around: it inspired an approach of combining structural and color patterns, which can be applied to many other fonts.

Typefaces are most knittable when they are pixel-based or have a strong constructive character. My book contains a collection of more than 30 typefaces, from bitmap classics like Emigre's Lo-Res and Oblong, to contemporary interpretations like Fidel Peugeot's Walking Chair. Some can be used for a specific technique, for example, dot-matrix fonts like Panos Vassiliou's Online One are perfect for slip-stitch-knitting.

Once you start exploring, the range of knittable typefaces, knitting techniques, and possible combinations is almost endless.

What are the similarities between the craftsmanship of knitting and type design if any?

I think that the knitting structure has many parallels to a piece of text -even if you are not knitting letters. Loops form a texture, just like letters form a text. Roland Barthes described the text as “a tissue [or fabric] of quotations”. Vice versa, you can also look at the knitting fabric as a text or code. But I would rather compare knitting to digital practice in general. The whole Typeknitting approach is somehow about the relationship between digital craft and analog programming, which form a kind of digital craftsmanship.

“When developing the book, I saw my role as a mediator and translator between knitting and typography, two fields I know well enough to initiate this dialogue”

Lots of people compare knitting to the protocols of software and the “Zen” qualities of the loops themselves. What's your take?

It depends whether you look at the process or at the physical result. When looking at finished knitwear you only see the visual characteristics of knitting textures and materials. This is like looking at a finished painting compared to experiencing the single brushstrokes. When you get into the process of knitting, the subjective experience is very repetitive, meditative, almost hypnotic. It's a very logical, algorithmic process, which definitely shows parallels to software. And as knitting is mostly based on two variables (knit and purl) it can be seen as a form of manual programming. I see many similarities and think it's no coincidence that knitting patterns look like code.

Do you consider yourself a fashion or a typography hacker?

“Hacking,” as I originally understand it, describes a subversive intervention in a system from the outside. Today you find so-called “Life Hacks” in every mainstream magazine. Using your leftover coffee grounds for plants has become a hack, or using empty shampoo bottles to store valuables. Most of these tricks my grandmother used a long time ago; they were just basic household knowledge.

Which makes you think: either hacking wasn't that subversive in the first place, or the imaginative use of older things and practices (like knitting) has a lot of potential waiting to be discovered, which is helpful to many. I see my book as a “Plug-In”, a joint between typography and knitting.

If you were a knitted typographic element which one would you be and why?

My favorite element right now it the slip-stitch. Instead of knitting a loop, you simply slip the yarn over from the left to the right needle. This makes a pre-writing, pre-typographic element, which later can become a sign (or part of a sign), although it's basically a pause in the structure. I like the simplicity of this. It also looks to me like it could have evolved as a mistake at first, and was later systematized into a method.

What are you working on right now?

Right now I am preparing a workshop program. The next will be at the Amsterdam knitting academy "De Amsterdamse Steek", and another at the Swiss Yarn Festival. I am really excited about working with experienced knitters, to get practical feedback on my methods and to exchange ideas for further methods and patterns.

Parallel to that, I am constantly researching typefaces and graphic applications, which I plan to explore in workshops on the graphic side. Typeknitting is made for dialogue; it's a process of constant translation between the two fields, knitting and typography. And besides the physical projects you make with it, the evolving interdisciplinary communication is one of its most interesting side effects. I am very curious about where this will lead.

Learn how to loop your creativity with type here (German edition) and here (English edition).