TeX & Friends (Planet by DANTE e.V.)

2020-10-01

LaTeX Project

Fall 2020 LaTeX release available

The fall 2020 LaTeX release is available

This release introduces a number of important enhancements.

A general hook management system for LaTeX

Most LaTeX users and package writers will know the handful of hooks that LaTeX has been offering until now, the most important one perhaps being \AtBeginDocument. These are important hooks, but they are far too few, so in many cases package developers had to patch the internals of LaTeX directly. This resulted in many problems.

With the new hook management system, LaTeX will get many more hooks that package writers (and authors) can use to add code in a controlled and reliable way. New hooks have been added in a number of places by using the new system and more will follow over time. Available now are:

  • Hooks to add code before and after environments (formerly offered through the etoolbox package);
  • Hooks used when loading files, packages, or classes (similar to what the filehook package now provides);
  • Hooks in the page-building process (e.g., functionality previously available through packages such as atbegshi or atveryend and a few others).

The important point here is not so much that the functionality of these packages has been integrated into the LaTeX kernel, but that the hook management system provides a single structured way for different packages to reliably add and order code. This will resolve many of the inter-package interoperability issues which formerly could be resolved (if at all) only by loading the packages in a specific order, or by the use of complex and fragile code inside the packages to account for various scenarios in user documents.
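
As a small illustration, here is a hedged sketch of the user-level interface (the hook names follow the env/.../before pattern described in lthooks; the label mypkg is made up for this example):

\documentclass{article}

% Run code before every quote environment; the label lets other
% packages order their hook code relative to ours.
\AddToHook{env/quote/before}[mypkg]{\small}

% The classic \AtBeginDocument hook is also reachable by name:
\AddToHook{begindocument}{\typeout{Document body starts here.}}

\begin{document}
\begin{quote}
This quotation is typeset with the hook applied.
\end{quote}
\end{document}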

The hook management system is currently described in these three documents:

  • texdoc lthooks — The description of the interfaces and the core hooks already added to the kernel.
  • texdoc ltshipout — The documentation of the hooks available during the page production process.
  • texdoc ltfilehook — The description of the hooks that can be used before or after a file gets loaded.

Providing xparse as part of the format

In the previous release we added the LaTeX3 programming layer to the LaTeX format to improve the loading speed when packages using expl3 are used. In this release we are now extending this support by integrating xparse so that the extended interface for defining document-level commands becomes available out of the box.

This enables users and, most importantly, package developers to define LaTeX commands with multiple optional arguments or other extended syntax with ease. For details, check out the xparse documentation, e.g., via texdoc xparse.
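
For instance, here is a hedged sketch (the command \versionnote and its defaults are invented for this example) of a star plus two optional arguments, a syntax \newcommand cannot express directly:

% Starred form prints in bold; both optional arguments have defaults.
\NewDocumentCommand\versionnote{s O{v1.0} O{draft} m}{%
  \IfBooleanTF{#1}
    {\textbf{#4 (#2, #3)}}
    {#4 (#2, #3)}%
}
% \versionnote{Updated}        -> Updated (v1.0, draft)
% \versionnote*[v2.1]{Updated} -> bold: Updated (v2.1, draft)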

Improving the font series handling

In the previous release we extended NFSS (the new font selection scheme) to better support modern fonts that offer different font faces, e.g., condensed, semi-bold, etc., and make them work seamlessly with each other. Experiences with the extended interface showed that for some use cases adequate support was still missing or that in special setups the algorithms sometimes selected a wrong font series value. These cases have now been resolved and additional support commands have been added. For example, with

\IfFontSeriesContextTF{〈context〉} {〈true code〉}{〈false code〉}

you can now define commands that behave differently depending on the current font series context. The 〈context〉 to check has to be specified as either bf or md. The command then chooses the 〈true code〉 or the 〈false code〉 based on where it is used (e.g., inside \textbf (or \bfseries) or not).
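
As a hedged sketch (the command name \alert is invented here), one could define a command that emboldens its argument in normal text but drops to medium weight when the surrounding material is already bold:

\NewDocumentCommand\alert{m}{%
  \IfFontSeriesContextTF{bf}
    {\textmd{#1}}% already in a bold context: switch to medium
    {\textbf{#1}}% otherwise: embolden
}
% \alert{word} in running text  -> bold
% \textbf{...\alert{word}...}   -> medium, so it still stands out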

A large number of other enhancements and corrections

There are nearly fifty other enhancements and corrections that we documented in the ltnews article for this release (and a few very minor ones that only made it into the changes.txt file). The most important ones from a user perspective are:

  • Support for length expressions in coordinates of picture commands as an alternative to decimal numbers denoting a multiple of \unitlength, e.g., \put(0,.2\textheight){...} (see the short sketch after this list)
  • New commands to make copies of robust commands (\let wouldn’t work for them)
  • A new \ShowCommand command to display the full definition of a robust command (works also on other commands)
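
Here is a hedged sketch combining these three additions (the name \myemph and the coordinates are invented for the example):

\NewCommandCopy\myemph\emph % copy the robust \emph; \let would fail
\ShowCommand\emph           % display the full definition, like \show

\setlength{\unitlength}{1pt}
\begin{picture}(100,20)
  % Coordinates may now be length expressions, not only
  % multiples of \unitlength:
  \put(0,.5\baselineskip){\myemph{shifted by half a baseline}}
\end{picture}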

But read the whole ltnews article because there may be other gems that are useful for you.

Where to learn more …

The new features and the most important bug fixes made in this release are documented in “LaTeX2e News Issue 32”. This document can be found on the LaTeX2e news page where you will also find release information for earlier LaTeX releases.

Happy LaTeXing — Frank

Thursday, 2020-10-01 00:00

2020-09-27

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Uploading CTAN packages via the REST API

It is probably not widely known that LaTeX packages can also be uploaded to CTAN via a REST API. This is particularly handy if you have to update packages frequently, as I do several times a year with the DTK bibliography, for example.

Manfred Lotz of the CTAN team has written a Python script (https://gitlab.com/Lotz/pkgcheck/blob/master/ctan_upload.py) that feeds this API.

I have adapted his script a little (I also use it to create the ZIP file and to copy the right files into place); you can find my script on GitHub at https://github.com/dante-ev/dtk-bibliography/blob/master/pack_and_upload_to_ctan.py. For the upload itself you also need a TOML file containing the upload information as key-value pairs. An example of such a TOML file can be found in Manfred's repository at https://gitlab.com/Lotz/pkgcheck/-/blob/master/pkgcheck.toml.

Uwe

Uwe Ziegenhagen likes LaTeX and Python, sometimes even combined. Do you like my content and would like to thank me for it? Consider making a small donation to my local fablab, the Dingfabrik Köln. Details on how to donate can be found here Spenden für die Dingfabrik.



by Uwe at Sunday, 2020-09-27 15:55

2020-09-22

TUG

TUGboat 41:2 published

TUGboat volume 41, number 2, the TUG 2020 (online) proceedings, has been mailed to TUG members. It is also available online and from the TUG store. In addition, prior TUGboat issue 41:1 is now publicly available. Submissions for the next issue are welcome; the deadline is October 15. Please consider joining or renewing your TUG membership if you haven't already (you'll get this issue immediately), and thanks.

Tuesday, 2020-09-22 17:52

2020-09-15

LaTeX Project

Talks from the online TUG Conference 2020

Talks from the online TUG Conference 2020 are now available on YouTube

The first TUG online conference in July was quite a success, with many interesting talks on many aspects of TeX usage and related topics. The talks are now appearing one after another on YouTube, thanks to the work of a few individuals who have prepared them (and the question/answer sessions) for publication.

If you have missed the conference (or if you want to listen to one of the talks again) you will find them on the TeX Users Group Channel on YouTube. So far only a subset of the talks has been added; the remaining ones will probably appear pretty soon.

There are also several talks by members of the LaTeX Project team.

Enjoy — Frank

Tuesday, 2020-09-15 00:00

2020-09-01

LaTeX Project

Final pre-release of LaTeX 2020-10-01 is available for testing

The third and final LaTeX pre-release for 2020-10-01 is available for testing

A few days ago we submitted a new LaTeX development format¹ to CTAN, and by now this should be available to all users of MiKTeX or TeX Live (on any operating system).

Main features of the final pre-release for 2020-10-01

In the previous pre-releases, distributed in May and July, the following main features (besides bug fixes) were made available:

  • The functionality of the xparse package was added directly to the LaTeX kernel.
  • LaTeX’s font series handling was improved.
  • A general hook management system was added to improve package interoperability and enable easier customization.

In this final pre-release we have fully integrated the hook management system (previously it was added at the end, overwriting earlier definitions in the kernel). In addition, the following features were added:

  • The picture environment now supports dimensions (and not only numbers multiplied by \unitlength) in places where coordinates have to be specified. This essentially incorporates into the kernel the functionality of the picture.sty package by Heiko Oberdiek.
  • A \ShowCommand was added to show the definition of robust commands in an easy manner.
  • A \NewCommandCopy was added to allow making copies of robust commands (which can’t be done using \let).
  • docstrip and l3docstrip were merged so that one can now always use docstrip even with expl3-based sources.

Other fixes and improvements

There have been a large number of smaller bug fixes and enhancements. A full list of all fixes and additions is given in a draft version of ltnews32, which you should be able to read by running

texdoc ltnews32

on the command line (or by any other means available with your operating system—somewhere there should be a file called ltnews32.pdf that you can open with a PDF reader). The draft version of this file is also available from our website as LaTeX2e News Issue 32 draft.

A general hook management system for LaTeX (the main feature of the 2020 fall release)

Most LaTeX users and package writers will know the handful of hooks that LaTeX has been offering until now, the most important one perhaps being \AtBeginDocument. These are important hooks, but they are far too few, so in many cases package developers had to patch the internals of LaTeX directly. This resulted in many problems.

With the new hook management system, LaTeX will get many more hooks that package writers (and authors) can use to add code in a controlled and reliable way. New hooks have been added in a number of places by using the new system and more will follow over time. Available now are:

  • Hooks to add code before and after environments (formerly offered through the etoolbox package);
  • Hooks used when loading files, packages, or classes (similar to what the filehook package now provides);
  • Hooks in the page-building process (e.g., functionality previously available through packages such as atbegshi or atveryend and a few others).

The important point here is not so much that the functionality of these packages has been integrated into the LaTeX kernel, but that the hook management system provides a single structured way for different packages to reliably add and order code. This will resolve many of the inter-package interoperability issues which formerly could be resolved (if at all) only by loading the packages in a specific order, or by the use of complex and fragile code inside the packages to account for various scenarios in user documents.

The hook management system is currently described in these three documents:

  • texdoc lthooks — The description of the interfaces and the core hooks already added to the kernel.
  • texdoc ltshipout — The documentation of the hooks available during the page production process.
  • texdoc ltfilehook — The description of the hooks that can be used before or after a file gets loaded.

Outlook

We expect this to be the final pre-release matching the code that goes into the 2020-10-01 release, except for any bug fixes that are found between now and the release date.

Please help with the testing

We are issuing this final pre-release now in the hope that you will help us by making sure that all the enhancements and fixes we have provided are safe and that they do not have any undesired side effects, so please help with the testing if you can.

This development format allows you to test the upcoming LaTeX release scheduled for 2020-10-01 with your documents or packages. Such testing is particularly important for package maintainers to verify that changes to the LaTeX core haven’t introduced incompatibilities with existing code. We try to identify any such problems beforehand, but such an undertaking is necessarily incomplete, which is why we are asking for user testing.

Besides developers, we also ask ordinary users to try out the new release, because the more people that test the new format, the higher the chances that any hidden problems are identified before the final release hits the streets.

Processing your documents with the pre-release is straightforward. All you have to do is replace the invocation command by appending -dev to the executable, e.g., on the command line you would run

pdflatex-dev myfile    or    lualatex-dev myfile    or    xelatex-dev myfile

instead of using pdflatex, lualatex or xelatex. If you use an integrated editing environment, then how to configure it to use an alternative format depends on the system; but in any case the necessary modification should be straightforward.

Enjoy — Frank

  1. The internal version number for this pre-release is LaTeX2e <2020-10-01> pre-release-8; the first five pre-releases simply mirrored the patch releases we made for 2020-02-02. 

Tuesday, 2020-09-01 00:00

2020-08-30

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Blocking Mailman spammers with Python

For Dante e.V. I maintain several Mailman-based mailing lists, which have been flooded by spammers for a few days now. Every day there are dozens to hundreds of subscription requests in the queue that I would have to discard manually. After doing this by hand once, an automated solution had to be found.

The solution was to install a driver for Firefox (“geckodriver”) that allows the browser to be controlled remotely. The selenium module can then drive it from Python. Below is the essential source code as a basis for your own work; I have left out the part that detects legitimate requests.

# -*- coding: utf-8 -*-
"""
https://www.edureka.co/community/47679/is-it-possible-to-run-headless-browser-using-python-selenium
"""

from selenium.webdriver import Firefox
from selenium.webdriver.firefox.options import Options

opts = Options()
#opts.set_headless() # I want to watch selenium at work
#assert opts.headless  # Operating in headless mode
browser = Firefox(executable_path=r"C:\Users\Uwe\Downloads\geckodriver-v0.27.0-win64\geckodriver.exe", options=opts)
browser.implicitly_wait(3)

# log in
browser.get('<URL of the Mailman admin panel>')
search_form = browser.find_element_by_name('<password_field_ID>')
search_form.send_keys('<admin_password>')
search_form.submit()

# 'discard' button in each row
fields = browser.find_elements_by_xpath("//input[@value='3']")
# spammer's email address
emails = browser.find_elements_by_xpath('//td[contains(text(),"@")]')

if len(fields) == len(emails):
    zipped_list = list(zip(emails, fields))
    
    for i in zipped_list:
        email, field = i
        field.click()


Uwe

Uwe Ziegenhagen likes LaTeX and Python, sometimes even combined. Do you like my content and would like to thank me for it? Consider making a small donation to my local fablab, the Dingfabrik Köln. Details on how to donate can be found here Spenden für die Dingfabrik.



by Uwe at Sunday, 2020-08-30 18:08

2020-08-23

Thomas Schramm

Football half-year calendar for the 2020/21 season, September to December 2020, as a PDF

The fixtures of the 2020/21 season through December 2020 as an A4 half-year calendar. For reference, the matches of the previous season, January to June/August, are included. Given the compressed schedule (eight international matches alone are still planned for this year) a certain weariness may set in; it feels as if some match or other is being played every day. Typeset, as always, with Xe(La)TeX and the tikz package, source code […]

by Thomas Schramm at Sunday, 2020-08-23 22:08

2020-08-20

Some TeX Developments (by Joseph Wright)

\NewDocumentCommand versus \newcommand versus …

Creating new document commands in LaTeX has traditionally been the job of \newcommand. This lets you create commands with mandatory arguments, and can also add a first optional argument. However, it can’t create more complex commands: LaTeX itself uses, for example, stars, multiple optional arguments, etc. To create these, the kernel itself uses lower-level TeX programming. But this is opaque to many users, and a variety of packages have been created to ease the burden.

Over the last decade, the LaTeX team have developed xparse, a generic document command parser, as a way to unify many ideas and provide a single consistent way to create document commands. Most of that code will soon be moving to the LaTeX kernel itself, and I’ve provided some ideas about how best to exploit that in a recent post.

Here, I want to look at a related issue: why use the xparse approach, and how does it compare to existing solutions, both in the LaTeX kernel and the wider package sphere? I’m going to avoid talking about ‘simple’ shortcuts (things like \newcommand\myname{Joseph Wright}): these are best left to \newcommand. Instead, I want to deal with commands which take arguments and have some element of ‘programming’ to them.

What I’ll seek to highlight here is that using \NewDocumentCommand, we get a single consistent and reliable way to create a variety of commands. There’s no need to worry about clashes between approaches, and it all ‘just works’.

Preliminaries: protected commands and optional arguments

Before we start, a couple of things are worth mentioning. First, there is the idea of ‘protected’ commands. In some places, we need commands not to ‘expand’ (turn into their definition). With a modern TeX system, that can be arranged by the engine itself (pdfTeX or similar), using ‘e-TeX’ and the \protected primitive (built-in). The LaTeX kernel doesn’t use that mechanism in \newcommand, but lots of other tools do. I’m going to assume that we want to make protected commands unless I mention otherwise. Almost always, unless you are creating a ‘shortcut’ for some text, you want your commands to be protected.

The second thing to note is that TeX itself has no concept of optional arguments, so they are arranged using some clever look-ahead code. In xparse, nested optional arguments are handled automatically, but again, \newcommand and similar do not do that.
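
A hedged illustration of that difference (the command names are invented): with \newcommand, the optional-argument scanner stops at the first closing bracket, whereas a \NewDocumentCommand-defined command parses balanced bracket pairs:

\newcommand\noteA[1][none]{(#1)}
% \noteA[see [1] above]   % breaks: scanning stops after "see [1"
% \noteA[{see [1] above}] % classic workaround: hide the brackets

\NewDocumentCommand\noteB{O{none}}{(#1)}
% \noteB[see [1] above]   % works: balanced brackets are parsed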

The kernel: versus \newcommand

The kernel’s \newcommand can, as I’ve said, create commands with multiple mandatory arguments but only with one optional one. As a simple example, we might have

\newcommand\foo[3][default]{%
    Code perhaps using #1 and definitely using #2 and #3%
}

We can of course do the same using \NewDocumentCommand

\NewDocumentCommand\foo{+O{default} +m +m}{%
    Code perhaps using #1 and definitely using #2 and #3%
}

You’ll notice that I’ve used +m for the mandatory arguments, as that matches \newcommand: the arguments can accept paragraphs. With \newcommand, all arguments either accept \par or do not: with \NewDocumentCommand we can select on a per-argument level what happens.
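
For example (a hedged sketch with an invented command name), we can allow paragraph breaks in one argument but not in another:

\NewDocumentCommand\entry{m +m}{%
  % #1: short label, may not contain \par
  % #2: long description, may contain paragraph breaks
  \textbf{#1}: #2%
}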

The optional argument with a default works using O{default}, and the result will be the same functionality. We gain the idea that nested optional arguments are parsed properly, some better error messages if we use \foo incorrectly, and an engine-robust definition of \foo.

We can’t do a lot more with \newcommand, so rather than try to show off other \NewDocumentCommand features here, we’ll first consider how we might make more complex syntaxes using just the classical LaTeX kernel.

The kernel: versus \def

Using the TeX primitive \def, plus the kernel-internal commands \@ifstar and \@ifnextchar, we can construct more complex syntaxes. For example, let’s create the syntax of \section: a star, an optional argument and a mandatory one. I’ll assume we have @ as a letter here. I’m also going to pass the presence of a star as the text true or false, as it makes things clearer.

\newcommand\section{%
    \@ifstar
      {\section@auxi{true}}
      {\section@auxi{false}}%
}
\def\section@auxi#1{%
    \@ifnextchar[%]
      {\section@auxii{#1}}
      {\section@auxii{#1}[]}%
}
\long\def\section@auxii#1[#2]#3{%
    % Here:
    % #1 is "true"/"false" for a star
    % #2 is the optional argument
    % #3 is the mandatory argument
}

As you’ll see, this is a bit tricky already, and it doesn’t cover the case where we want to have the optional argument default to the mandatory one, when it’s not given. It also doesn’t allow for nested optional arguments, and it’s not engine-robust. We might of course use more complex paths for the star: we could have independent routes.

Using \NewDocumentCommand, things are a lot easier

\NewDocumentCommand\section{s +O{#3} +m}{%
  % Here:
  % #1 is "true"/"false" for a star
  % #2 is the optional argument
  % #3 is the mandatory argument
}

The minor difference now is that #1 is a special token that we can test for truth using \IfBooleanTF. I’ve also allowed for the optional argument picking up the mandatory one, when it’s not given.

We could make more complex examples, but the bottom line remains the same: using \NewDocumentCommand, we are always going to have simple one-line interface descriptions, and the behind-the-scenes TeX argument parsing is all hidden away.

etoolbox: versus \newrobustcmd

The etoolbox package offers \newrobustcmd as a complement to \newcommand. It does exactly the same as \newcommand, except it uses e-TeX to make engine-protected commands. So from an interface point-of-view, there’s nothing new here.

twoopt: versus \newcommandtwoopt

The twoopt package allows a syntax similar to \newcommand but for creating two optional arguments. We’ll take an example from its documentation:

\newcommandtwoopt\bsp[3][AA][BB]{%
    \typeout{\string\bsp: #1,#2,#3}%
}

This is reasonably clear: we have an optional argument #1, an optional argument #2 and a mandatory argument #3. The two optional arguments each have a default here.

How does that look with \NewDocumentCommand?

\NewDocumentCommand\bsp{+O{AA} +O{BB} +m}{%
    \typeout{\string\bsp: #1,#2,#3}%
}

You’ll see that we stay consistent here: the same syntax is used to create one, two or even more optional arguments. I wouldn’t recommend using multiple optional arguments in most cases, but when we do, it’s a lot easier using \NewDocumentCommand.

What twoopt cannot do, but \NewDocumentCommand can, is create optional arguments that are not in the first/second positions. That would require either the TeX coding we’ve already seen, or using a different tool again, if we were using twoopt.
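
A hedged sketch of such a signature (the command name is invented): an optional argument sitting between two mandatory ones, which twoopt cannot express:

\NewDocumentCommand\pair{m O{and} m}{#1 #2 #3}
% \pair{salt}{pepper}     -> salt and pepper
% \pair{salt}[or]{pepper} -> salt or pepper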

suffix: versus \WithSuffix

The suffix package allows one to extend an existing command to look for an optional token (‘suffix’) immediately after the command name. Taking a simple example from StackExchange, we start with

\newcommand\foo{blah}
\WithSuffix\newcommand\foo*{blahblah}

which translates to

\NewDocumentCommand\foo{s}{%
   \IfBooleanTF{#1}
     {blahblah}% starred: \foo*
     {blah}%    unstarred: \foo
}

This means we only need one line for the interface set-up, and don’t need to split up grabbing optional arguments into two different places (as we saw above with \section).

xargs: versus \newcommandx

The xargs package is perhaps the most complete approach to extending \newcommand as far as optional arguments are concerned. It introduces \newcommandx, which has the same syntax as \newcommand but where the second optional argument is a keyval list. That then describes which arguments are optional, and what their defaults are. Taking an example from the documentation

\newcommandx*\coord[3][2=1,3=n]{(#2_{#1},\ldots,#2_{#3})}

would create a command with two optional arguments, #2 and #3, leaving #1 mandatory. Translating into \NewDocumentCommand syntax might make that clearer!

\NewDocumentCommand\coord{m O{1} O{n}}{%
    (#2_{#1},\ldots,#2_{#3})%
}

xargs has the idea of usedefault, which allows [] to be the same as [default]. That’s not something xparse does, as it is pretty confusing: what happens when you want an empty optional argument? This links to something I’ve said before: avoid consecutive optional arguments unless the second is dependent on the first.

newcommand: versus newcommand.py

Stepping outside of TeX itself, Scott Pakin’s newcommand.py Python script provides a description language somewhat like xparse, and converts this into a ‘template’ of TeX code, allowing a ‘fill in the blanks’ approach to creating commands. It can cover several of the ideas that xparse can, including a few that will not be migrated to the LaTeX kernel. It can also set up a command taking more than 9 arguments, but that’s always going to be tricky as a user.

What is important is that using a script means we have to work in two steps, and it’s hard to see what’s happening from the TeX source. It also doesn’t offer anything that the kernel doesn’t already do: no protected commands, no nested optional arguments, no improved error messages. So in many ways this is simply techniques we’ve already seen, just made a little more accessible, at least if you have Python installed.

environ: versus \NewEnviron

As well as document commands, the xparse syntax can be used to create document environments: the same relationship we have between \newcommand and \newenvironment. What people sometimes want to do is grab an entire document environment body and use it like a command argument. Classically, one does that using the environ package. Again, taking an example from the documentation

\NewEnviron{test}{%
  \fbox{\parbox{1.5cm}{\BODY}}\color{red}
  \fbox{\parbox{1.5cm}{\BODY}}%
}

would grab all of the body of the environment test and set it twice: the body is saved as \BODY.

Using \NewDocumentEnvironment, we have a syntax similar to \newenvironment

\NewDocumentEnvironment{test}{+b}{%
  \fbox{\parbox{1.5cm}{#1}}\color{red}
  \fbox{\parbox{1.5cm}{#1}}%
}{}

with the argument grabbed in the normal way as (here) #1. We can therefore have ‘real’ arguments first, then grab the body.
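
For instance (a hedged variant of the example above), an optional width argument can come first, with the body still grabbed last:

\NewDocumentEnvironment{test}{O{1.5cm} +b}{%
  \fbox{\parbox{#1}{#2}}\color{red}
  \fbox{\parbox{#1}{#2}}%
}{}
% \begin{test}[2cm] ... \end{test} sets the body in 2cm boxes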

Summary

Using the tools set up in \NewDocumentCommand, we can have a consistent way of creating a wide range of document commands. Rather than use a mixture of tools, from the kernel, TeX itself and the package sphere, it is far preferable to use the single interface of \NewDocumentCommand for all commands today.

Thursday, 2020-08-20 00:00

2020-08-19

Some TeX Developments (by Joseph Wright)

The good, the bad and the ugly: creating document commands

Creating document commands in LaTeX has traditionally involved a mix of \newcommand, semi-internal kernel commands (like \@ifnextchar and \@ifstar) and low-level TeX programming using \def. As part of wider efforts to improve LaTeX, the team have over the past few years improved xparse to the point where it is capable of creating a vast array of document commands.

The aims of xparse have always been two-fold: to provide a clear way to create new commands, and to provide a language to describe existing ones. It is also intended to be as flexible as possible, so it doesn’t impose artificial restrictions on syntax. That comes at a cost, however: it can be (ab)used to create commands that really don’t fit into the standard LaTeX pattern.

The upcoming LaTeX kernel release (autumn 2020) will integrate most, though not all, of xparse into the kernel. That means the core ideas will be available out-of-the-box. This seems like a good time, therefore, to look at the best ways to use the abilities of xparse in making document commands. I won’t look at the full detail, rather pick out how to, and how not to, create good document commands. (I’m going to assume some familiarity with the xparse description syntax.)

The Good

The LaTeX kernel is very careful to have consistent syntax for document commands. It uses a very small number of argument types, which I’ll describe in xparse terms

  • Mandatory (m) arguments in braces
  • Optional (o/O{<default>}) arguments in [], which may have a default; in xparse terms we can tell the difference between a missing optional argument and one given with an empty [] pair
  • A star (s)
  • Picture co-ordinates (r()), which are split into x and y, so in xparse terms subject to \SplitArgument

Most of the time, the LaTeX kernel makes arguments long, which is shown as + in xparse syntax.

A star is always used as the first argument after a command, so in some ways it looks like part of the command name itself. Optional arguments are almost always given before mandatory ones, and most of the time there is only one. Where two are used, for example with \makebox, it’s because the second is strictly dependent on the presence of the first.

Following the kernel, signatures (argument descriptions) such as s o m, s O{<default>} m m and o m m are ‘good’. You can use something like s +m O{0} +o +m (the syntax of \newcommand!) if you are careful, but think very carefully.

There’s one syntax that’s not from the kernel but is recommended where it applies: beamer’s overlay syntax, which is d<> in xparse terms. This always comes first (other than a star), and is best reserved for the ‘on X slides of Y’ idea in presentations (it doesn’t have to be using beamer).

xparse lets us create arguments using _ and ^, similar to TeX’s core math mode syntax. Most of the time, this should be reserved for math mode where you need to emulate the TeX syntax but need for some reason to grab the arguments yourself. That’s done using e{^_}: I’d restrict this to math mode commands.
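
A hedged sketch of that use case (the command \Tensor is invented): e{^_} yields one argument per embellishment token, each being -NoValue- when absent, which we can test with \IfValueT:

\NewDocumentCommand\Tensor{m e{^_}}{%
  #1%
  \IfValueT{#2}{^{#2}}% superscript, if one was given
  \IfValueT{#3}{_{#3}}% subscript, if one was given
}
% In math mode: $\Tensor{T}^{ab}_{cd}$ or just $\Tensor{T}_{cd}$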

The Bad

The above already shows we have quite a few combinations available. Things go bad when too many combinations are used. That includes things like

  • Multiple optional arguments where the second or subsequent ones don’t strictly depend on the earlier ones
  • Optional arguments using tokens other than [] (or <> for overlays)
  • Testing for tokens other than * as ‘a special case’ (think things like +)

Almost always, complex set ups using these types of combination mean you need to rethink the syntax. In particular, multiple optional arguments tend to be much better replaced by using a keyval approach.

The Ugly

Some ideas in xparse won’t be making it to the kernel: these are definitely the Ugly. They’ll stay in a stub xparse for historical reasons, since they do describe some syntax choices people have made, but really they should be avoided:

  • Optional groups (g) in braces; breaks the LaTeX conventions badly
  • Arguments up to a left brace (l); useful at a low level, but not in a document command
  • Arguments up to a token (u); widely used in programming, but again not in document commands

You might wonder why they are all there in the first place: these were part of the more experimental work in xparse, and those particular experiments have shown we don’t want to enable such syntaxes even for emulating existing commands.

A Fistful of Tokens

There are of course places where you need to go outside of the xparse structures, particularly when parsing specialist data. The popular TikZ graphics system is one example, linguistic glosses are another. But these are always restricted contexts: normally within a dedicated environment where it is clear that the ‘usual’ rules do not apply. Basically, if you do this, you are on your own, so be sure to check the balance of consistency versus compactness.

For a Few Tokens More

Using xparse syntax makes it much easier to have a clear break between interface and implementation. As such, the fact that there’s more going on ‘beneath the hood’ is worth it: it’s a lot easier to track what’s happening. The move into the kernel will make xparse descriptions even easier to exploit, so it’s important users give a little thought to the syntax they choose.

Wednesday, 2020-08-19 00:00

2020-08-17

Typeset In The Future

Star Trek: The Motion Picture

Star Trek: The Motion Picture, a movie adaptation of the cult sixties TV show, very nearly didn’t get made. If it weren’t for the runaway success of Star Wars and Close Encounters of the Third Kind (both released in 1977), fickle Paramount executives would have canceled their faltering TV project Star Trek: Phase II, instead of adapting it into a big-screen production. We must give thanks, then, to producer Gene Roddenberry for pushing the project through ten years of development hell. In doing so, he named a space shuttle, created a custom font pack, and relaunched the swashbuckling futurism of the greatest of all space franchises: Star Trek.

Before any buckles get swashed, however, we must first sit through one minute and forty seconds of this:

This is not a positioning shot of deepest, darkest space. Nor is it a close-up of a black hole—although it could be a close-up of The Black Hole, Disney’s sci-fi adventure. That’s because The Motion Picture and The Black Hole, both released in 1979, were the last two big Hollywood movies to feature a musical overture before the start of the movie proper. If you begin either movie and see nothing but beautifully-scored blackness for a couple of minutes, don’t worry—just hum along to the theme tune, and wait for the typography to begin.


If you’re a fan of Star Trek: The Original Series, you might be expecting to see the font from its opening titles in Star Trek: The Motion Picture too. This font was (perhaps unsurprisingly) called Star Trek, though its modern-day digital version is known as Horizon, and is available only in non-italic form:

Opening titles to Star Trek: The Original Series
Star Trek, aka Horizon

In addition, I’m delighted to report that during season one, Star Trek was accompanied by a variant of sci-fi stalwart Eurostile in its closing credits:

End credits from Star Trek: The Original Series, season one
Eurostile Bold

The Star Trek font also appeared in a non-italic version, to introduce William Shatner and Leonard Nimoy to 1960s TV audiences:

Sadly, this is where the good news ends. When The Original Series returned for a second season, it added DeForest Kelley (Dr. “Bones” McCoy) as a second “ALSO STARRING”:

The problem here is obvious, isn’t it? Unlike the Es in “SHATNER” and “LEONARD,” the ones in “DEFOREST KELLEY” have straight corners, not curved ones:

Alas, The Original Series’s inconsistent typography did not survive the stylistic leap into the 1970s. To make up for it, The Motion Picture’s title card introduces a new font, with some of the curviest Es known to sci-fi. It also follows an emerging seventies trend: Movie names beginning with STAR must have long trailing lines on the opening S:

The opening title from 1979’s Star Trek: The Motion Picture. Note the elongated leading and trailing lines on the S in “STAR” and the K in “TREK.”
The opening title from another popular late-seventies sci-fi movie. Note the elongated leading and trailing lines on the S and R in “STAR” and on the S in “WARS.”

The font seen in The Motion Picture’s titles is a custom typeface created by Richard A. Foy, known at the time as Star Trek Film (and now known in digital form as Galaxy):

Star Trek Film, aka Galaxy

Star Trek Film also shows up on the movie’s US one-sheet poster, with bonus Technicolor beveling to make it even more futuristic:

Detail of the US theatrical one-sheet poster for Star Trek: The Motion Picture

Illustrated by Bob Peak, the poster for The Motion Picture has become something of a classic. Its striking rainbow motif was reprised in a limited-edition poster for 2016’s Star Trek Beyond, presented to fans who attended a Star Trek fiftieth-anniversary event.

US theatrical one-sheet poster for Star Trek: The Motion Picture
Limited-edition poster for Star Trek Beyond

On which theme: If you’ve ever doubted the power of type to aid recognition, note that the teaser poster for Star Trek Beyond features the word “BEYOND” in metallic, beveled, extruded Star Trek, without feeling the need to add the actual words “STAR” or “TREK.” The presence of the Enterprise and the use of an iconic font were deemed more than sufficient to identify the franchise. (Although it would have worked far more effectively if they’d remembered to curve the E.)

US teaser poster for Star Trek Beyond

If you like the style of Star Trek or Star Trek Film, and want to use them to spice up your corporate communications, I have excellent news. In 1992, the creators of the Star Trek franchise partnered with Bitstream to release an officially licensed “Star Trek” Font Pack. The pack contains full versions of Star Trek and Star Trek Film, plus Star Trek Pi (a collection of insignias and Klingon glyphs) and Starfleet Bold Extended (a Eurostile look-alike that appears on the outside of many Starfleet craft). It also, of course, uses Eurostile Bold Extended liberally on its front cover:

Let’s take a look at Bitstream’s examples of how the fonts should be used, from the back of the font pack’s box:

If you’re looking to bring character to all you do in Microsoft Windows 3.1, the “Star Trek” Font Pack is for you.


But back to the movie. Its opening scene starts with a menacing close-up of the movie’s central antagonist, which just happens to be a gigantic glowing space cloud:

Clouds are not generally known for their evil, murdering tendencies. To work around this potential dramatic limitation, the movie’s producers cleverly employ a scary sound effect whenever we’re meant to be intimidated by this galactic floaty miasma. Despite aural evidence to the contrary, the cloud’s twangling menace was not made by throwing a ham into a sack of pianos. Instead, it’s the sound of a Blaster Beam, a twelve-to-eighteen-foot aluminum guitar-like device invented in the early 1970s by musician John Lazelle.

Los Angeles–based Beam player Francesco Lupica, who has performed as the Cosmic Beam Experience since the early seventies, is mentioned in The Motion Picture’s credits for his work creating sound effects for the movie. However, the primary credit for the cloud’s eerie twang goes to Craig Huxley, a child actor and musician who originally appeared in The Original Series as Captain Kirk’s nephew, before going on to create The Motion Picture’s scary soundscape.

Craig Huxley (billed under his birth name, Craig Hundley) as the unconscious Peter Kirk in the “Operation—Annihilate!” episode of The Original Series

As part of his collaboration with composer Jerry Goldsmith, Huxley (by now a professional musician) composed the Blaster Beam cadenza for The Motion Picture’s climax. He subsequently obtained a patent for the Blaster Beam’s design, and went on to bash its eerie strings in Back to the Future Part II and Part III, Tron, 2010, Who Framed Roger Rabbit, and 10 Cloverfield Lane, plus several more Star Trek movies.

If you’re not sufficiently freaked out by reading about how the Blaster Beam sounds, here’s Craig playing all 18 feet of it in a variety of movie scores:

Following its starring role in The Motion Picture, the Blaster Beam became so synonymous with sci-fi that it was featured in Seth MacFarlane’s The Orville as an aural homage to the original Trek movies. It’s basically become the audible equivalent of Eurostile.


As the cloud twangles ominously, three Klingon ships float into view. Clearly intimidated by the cloud’s noncorporeality, the Klingons fire some photon torpedoes into its inky midst. (It’s not entirely clear what they hope this will achieve.) As they optimistically launch their missiles, we see their native tongue translated into English in a slightly fuzzy Pump Demi:

Pump Demi

The Motion Picture is notable for being the first time in Star Trek history that Klingons speak Klingon, rather than conversing in English. An initial Klingon dialect was created for the movie by UCLA linguist Hartmut Scharfe, but wasn’t felt to be alien enough, so James Doohan (aka Scotty) volunteered to work on an alternative. Doohan’s skills with accents were already well known—after all, he played a Scotsman, despite being a Canadian with Irish parents—and he helped come up with a number of nonsense phrases that formed the basis of the movie’s Klingon language. Associate producer Jon Povill described the birth of Klingon this way:

After Hartmut had done his thing and worked it all out logically, Jimmy and I just sat down one day and made up stuff. We created the Klingonese by using some of what Hartmut had done and then combining it with our own: We strung together nonsense syllables, basically—totally made up sounds with clicks, and grunts, and hisses.

The Klingon typography seen in the movie is just as nonsensical as the spoken language—the limited set of Klingon glyphs seen here don’t actually mean anything. They were nonetheless adapted (along with various Starfleet and Klingon insignia) into a “Star Trek” Font Pack typeface known as Star Trek Pi:

Star Trek Pi, from the “Star Trek” Font Pack. (“Pi” is a common typographic name for a symbol font.)

The Motion Picture’s Klingon might be nonsense, but the language has had a long and active development since the movie’s release. Doohan’s guttural phrases were adapted by linguist Marc Okrand into a full language for 1984’s Star Trek III: The Search for Spock, which was followed in 1985 by an official Klingon Dictionary:

The Klingon Dictionary, by Marc Okrand (Pocket Books, 1985)

This dictionary describes the language’s grammar in detail, and provides two-way translations for common English and Klingon phrases. The dictionary has since been translated into Portuguese (Dicionário da língua Klingon), German (Das offizielle Wörterbuch: Klingonisch/Deutsch; Deutsch/Klingonisch), Italian (Il dizionario Klingon), Czech (Klingonský slovník), and Swedish (Klingonsk Ordbok).

Star Trek’s producers were not the first world builders to create an entire language, however. Lord of the Rings author J. R. R. Tolkien, inventor of two complete Elven languages, was a philologist long before he was a novelist. From his early days at the Oxford English Dictionary (where he was responsible for words starting with w), through his multiple professorships at Oxford University, Tolkien loved the study of language more than anything else. Indeed, he didn’t construct his Elven languages to add color to his novels; rather, he wrote novels to provide a world in which his created languages could live and breathe. Tolkien believed a language was truly alive only when it had a mythology to support it, and his books provided a world in which that mythology could exist.

Had Tolkien lived to see 1991’s Star Trek VI: The Undiscovered Country, I am therefore sure he would have approved of Klingon chancellor Gorkon’s tongue-in-cheek statement: “You have not experienced Shakespeare until you have read him in the original Klingon.” This comment inspired the Klingon Language Institute to produce The Klingon “Hamlet,” an equally tongue-in-cheek 219-page restoration of Shakespeare’s famous work to its original Klingon:

The Klingon “Hamlet,” originally published in hardcover by the Klingon Language Institute in 1996. The paperback edition shown above was published by Pocket Books in 2000.

Despite the Klingons’ use of their native tongue, things are not going well for them in their battle against the evil space cloud. A distress message from the Klingon command ship is picked up by a sensor drone from space station Epsilon 9 and translated into English for the benefit of the station crew. This time, the Klingon does not translate into Pump. Instead, it’s a customized version of Futura Display, which is almost certainly the Letraset Instant Lettering version with bits cut out of it to make it more futuristic:

Composite image of the translated Klingon message, as detected by Epsilon 9
The Letraset Instant Lettering version of Futura Display

Its final sentence is completed by a voice reading out the English translation: “IMPERIAL KLINGON CRUISER AMAR… CONTINUING TO ATTACK.” (Given that these events take place during the opening ten minutes of the movie, you can probably guess how “CONTINUING TO ATTACK” is going to pan out.)

As we cut back from Epsilon 9 to the cloud, we see that only two of the three Klingon ships remain. According to the movie’s shooting script, this is because a “FRIGHTENING WHIPLASH OF ENERGY” has been fired from the cloud, creating evil lightning that makes Klingon ships disappear. A second frightening whiplash takes out ship number two, leaving only the command craft. As a third bolt appears on the Klingons’ tactical display, the command ship poops a red torpedo, but it is all to no avail. The command ship is whiplashed into nothingness.

As they express their shock, the watching crew of Epsilon 9 observes that the sinister gassy blanket is heading directly for Earth. The cloud puts on its best evil face, and makes its destructive intentions clear via some particularly dramatic twangs.


With twangling in our ears, we quickly switch scenes to Vulcan, which is easily recognizable from its lava-strewn matte paintings and suspicious geological similarity to filming locations in Yellowstone Park. Look—it’s Spock! He’s talking in Vulcan, which also translates into English as fuzzy Pump Demi:

The lady Vulcan here is talking about the many years Spock has spent striving to attain Kolinahr, a state of pure logic in which Vulcans shed all of their emotion. She offers Spock a symbol of total logic to commemorate his imminent Kolinahr-ization—but just as she does so, Spock hears an 18ft cloud-guitar twangling in space. Lady Vulcan realizes that Spock’s goal lies elsewhere, and despite several years of exhortations, he fails to Kolinahr.


After our brief trip to Vulcan, we switch straight to San Francisco, specifically to the Golden Gate Bridge, as we follow an air tram on its way to Starfleet Command. The most notable change between now and the 2270s is that the bridge’s traffic lanes have been covered, thereby hiding the vehicles and making for a much cheaper effects shot:

Spoiler alert: That flying air tram contains Admiral James T. Kirk, who is heading to Starfleet Headquarters in San Francisco’s Presidio Park. There’s a nice close-up of the United Federation of Planets logo during a positioning shot as he arrives. This looks to be Eurostile Bold, but not quite—there’s something wrong with the capital Ss:

Starfleet is the military and exploratory arm of the Federation, which explains the use of the Federation’s stars-and-thistles logo as part of the floor decoration. We see the same logo a few moments later, on the side of Kirk’s air tram:

…and again when Kirk briefs the crew of the Enterprise about their upcoming mission:

Confusingly, each of these Federation logos has a different thistle design and a different constellation of stars. Perhaps more importantly, they bear a remarkable similarity to the official flag of the United Nations:

Comparison between a re-creation of the Federation logo on the floor of Starfleet Headquarters, and the official flag of the United Nations

This is clever branding by the movie’s design team. A viewer’s subconscious recollection of the United Nations via this extrapolated logo and color scheme gives the Federation (and therefore Starfleet) an immediate association with peacekeeping and ethical behavior, eliminating the need for an extended explanation of their role in the movie’s universe. (All of which goes to show: Design doesn’t have to include typography to provide a shortcut for exposition.)

Just in case you have any doubt about the coincidence of this similarity: The “Star Trek” Star Fleet Technical Manual, published in 1975, includes the charter of the United Federation of Planets. It’s a direct copy of the United Nations charter, but with life forms instead of humans, and planets instead of nations. Differences are indicated in yellow, additions in green:

Charter of the United Nations
Charter of the United Federation of Planets, reproduced from the “Star Trek” Star Fleet Technical Manual (Ballantine Books, 1975)

Shortly after arriving at Starfleet, Kirk transports to a space station near the USS Enterprise, which has been in dry dock while undergoing a post–Original Series refit. On arrival, he meets up with Scotty, and they take a shuttle to go and see the ship:

As they depart for the Enterprise, we see a mix of fonts on the side of the shuttle and station. “OFFICE LEVEL,” seen on the right, is, of course, perennial sci-fi favorite Eurostile Bold Extended:

Eurostile Bold Extended

…although, according to the Star Fleet Technical Manual, the “Earth font name” for Starfleet’s official type style is Microgramma Ext, suggesting that we may be looking at Eurostile’s predecessor Microgramma instead:

Detail from the “Official Type Style (Star Fleet Specification),” reproduced from the “Star Trek” Star Fleet Technical Manual

The “05” on the side of the shuttle is not Eurostile Bold Extended, however. This is Starfleet Bold Extended, the fourth and final font from the “Star Trek” Font Pack:

Starfleet Bold Extended

Unlike Eurostile, it has an outline, giving it a more curved silhouette overall. It also has higher bars on the P and R, a squared-off corner for the Q, and a very different number 1. (Eurostile’s 1 is perhaps its least practical glyph, which may explain why the Star Fleet Technical Manual extract above states that it should not be used in fleet operations.)

Eurostile Bold Extended digits, including an impractical elongated 1
Starfleet Bold Extended digits, including a much more balanced 1

The R and the 1 make it easy to tell that Starfleet Bold Extended is used for the Enterprise’s hull classification symbol (“NCC”) and number (“1701”):

USS Enterprise NCC-1701 in 1979’s Star Trek: The Motion Picture

Indeed, Starfleet Bold Extended is used on the front of USS Enterprise NCC-1701 (and 1701-A, B, C, D, and E) in all subsequent Star Trek movies and TV shows, in essentially the same style as in The Motion Picture. This gives a consistent, instant recognizability to each iteration of the craft, despite the crew’s tendency to crash or otherwise destroy the ships for dramatic effect.

USS Enterprise NCC-1701-A in 1986’s Star Trek: The Voyage Home
USS Enterprise NCC-1701-B in 1994’s Star Trek: Generations
USS Enterprise NCC-1701-C in the 1990 Star Trek: The Next Generation episode “Yesterday’s Enterprise”
The saucer section of USS Enterprise NCC-1701-D after a crash landing on Veridian III in 1994’s Star Trek: Generations
USS Enterprise NCC-1701-E in 2002’s Star Trek: Nemesis

A different style was introduced for 2009’s Star Trek reboot, and continued through 2016’s Star Trek Beyond. The reboot movies use an outlined version of plain old Eurostile Extended for “U.S.S. ENTERPRISE,” though they continue to heed the advice from the Star Fleet Technical Manual about Eurostile’s 1, opting for flat lines for the 1s in “1701”:

USS Enterprise NCC-1701 in 2009’s Star Trek reboot. The lower bar of the R makes it clear that this is Eurostile Extended, not Starfleet Bold Extended.
USS Enterprise NCC-1701-A in dry dock during its construction at the end of 2016’s Star Trek Beyond. Note the straight 1s in “1701.”
Eurostile Extended

In 2017, the Star Trek: Discovery TV series rebooted the typography once more, opting for Eurostile Bold Extended for the USS Discovery’s name and designation. Crucially, the Discovery becomes the first central Star Trek craft to ignore the Starfleet official type style, shamelessly using a Eurostile 1 character without modification:

The USS Discovery’s name and designation in Eurostile Bold Extended, seen in the Star Trek: Discovery season-one episode “Context Is for Kings.”

The Enterprise’s complex shape could easily make it a nightmare to navigate. Thankfully, it has a fancy horizontal-and-vertical elevator system known as a turbolift, which whizzes crew members from A to B via handily placed turboshafts. We see a two-part map of the craft’s turboshaft network as Kirk heads up to the bridge. Judging from this map, it looks like the turbolift can move in all three dimensions, including along a curved route:

The white dot on the map indicates the lift’s current position. We see it move from right to left as the lift departs, and it’s clearly visible at the top of the map after the lift arrives at the bridge:

The turbolift’s multidirectional nature is also represented in its iconography, seen when Decker boards a turbolift on level five later in the movie:

Despite the turbolift’s obvious futurism, Star Trek was not the first to propose a multidirectional elevator. That honor goes to Roald Dahl, whose 1964 novel Charlie and the Chocolate Factory features a glass elevator that goes “sideways and longways and slantways and any other way you can think of.” (Unlike the turbolift, it can also propel itself out of its containing building and into the air via sugar power.)

The turbolift’s secret is being able to switch cars between shafts dynamically, in order to route travelers around the craft in the most efficient manner. This may have been a futuristic concept in 1979 when The Motion Picture was released, but I am delighted to report that it recently became a reality. In 2017, German elevator company thyssenkrupp performed the first real-world test of its MULTI elevator system.

Detail of a theoretical MULTI installation, showing elevator cars following a nontraditional path, giving greater flexibility in building design

According to thyssenkrupp, MULTI increases elevator capacity by 50 percent, while halving the elevator footprint within a building. It allows ninety-degree turning of the system’s linear drive and guiding equipment, enabling cars to quickly move between horizontal and vertical shafts during a single journey.

Schematic of the MULTI system’s transfer mechanism between vertical and horizontal movement

MULTI operates in two dimensions, not three, so it’s not quite a full-fledged turbolift. Nonetheless, its motors are based on technology from a magnetic levitating train system, so it’s still pretty damned futuristic.


Kirk gathers his crew and explains that they are the only ship near enough to intercept the evil space cloud, and so they’re going to have to save the day. A message comes through on the big screen from Epsilon 9, informing everyone that the cloud is “over eighty-two AUs in diameter.” This seems somewhat incredible, given that one AU (astronomical unit) is the average distance from Earth to the sun, which is about ninety-three million miles. (Indeed, it’s so incredible, that the director’s edition of the movie used some sneaky dialogue editing to make the cloud “over two AUs in diameter” instead.)

Technical schematic of one astronomical unit (not to scale)

To give some context to vastness on this scale: On August 25, 2012, NASA’s Voyager 1, humankind’s farthest-traveling spacecraft, entered interstellar space at a distance of around 125 AUs from the sun, after traveling nonstop for nearly thirty-five years. Even this distance is still technically within our solar system, however, and it will be another forty thousand years before Voyager 1 approaches a planetary system outside our own. (Unless, that is, a Voyager craft finds a way to sidestep such vast distances by traveling through a black hole—not that this is likely to be relevant to the plot of The Motion Picture, of course.)

Regardless of the true size of the cloud, it makes short work of dispatching Epsilon 9 (and a random man, with really big hands, in an orange spacesuit) while the Enterprise crew looks on. Kirk tells the shocked crew to get ready to leave, and they set about their preparations.

A random man, with really big hands, in an orange spacesuit shortly before Epsilon 9 is destroyed

As the Enterprise heads cloud-wards, Kirk notes that he must risk engaging warp drive while still within the solar system. In a turn of events that will surprise no one, engaging a not-properly-tested warp drive while still in the solar system turns out to be a bad idea, resulting in the Enterprise being dragged into a wormhole. Everything goooeeesss a biiiiiit sloooooow moootionnnnnn as the crew tries to destroy an asteroid that has been dragged into the wormhole with them. This scene’s highlight is an excellent use of some spare Letraset Instant Lettering, in which a combination of symbols from the bottom right corner of a Letraset sheet is used upside-down in a weapons system’s “TRACKING SEQUENCE”:

Detail from the Enterprise’s weapons system tracking sequence, showing a cluster of punctuation glyphs just below “TRACKING”
Upside-down detail of a sheet of Letraset Instant Lettering, showing the same cluster of punctuation glyphs

With the wormhole successfully navigated, the Enterprise continues on its way toward the space cloud. Not much happens. Indeed, there are ten whole minutes that basically consist just of people looking at a cloud, without any typography. So let’s skip ahead to the cloud’s approximate center, and the mild peril therein.


Intruder alert! A strange alien light-beam probe intrudes into the bridge. It heads over to the science station and starts calling up blueprints of the Enterprise:

These blueprints are taken directly from 1973’s “Star Trek” Blueprints by Franz Joseph, who also wrote and designed the Star Fleet Technical Manual. Specifically, these blueprints show the inboard profile and crew’s quarters of the Enterprise. (Strictly speaking, this means the probe is scanning blueprints from before the Enterprise underwent its Motion Picture refit, but let’s not worry too much.)

A selection of the blueprints scanned by the light-beam probe during its bridge intrusion
Detail of the Inboard Profile from “Star Trek” Blueprints, showing the location of three blueprints queried by the probe
Detail of the Deck 6 Plan—Crew’s Quarters from “Star Trek” Blueprints, showing the location of two blueprints queried by the probe

Joseph’s Blueprints and Technical Manual were spectacularly successful publications for Ballantine, with the Technical Manual reaching number one on the New York Times trade paperback list. (Indeed, it is entirely possible that the books’ success contributed to Paramount’s decision to revive Star Trek in the first place.)

“Star Trek” Blueprints, by Franz Joseph (Ballantine Books, 1973)

After scanning the blueprints, the probe zaps Deltan crew member Ilia, and she disappears. She’s not gone long, though. A few minutes later, she returns as IliaBot, a synthetic replacement created by the mysterious life force at the space cloud’s center in order to communicate with the “carbon units” aboard the Enterprise.

The newly appeared IliaBot notes that she has been “programmed by V’Ger to observe and record.” In doing so, she provides a name for the mysterious cloud-based entity that’s been causing everyone so much trouble. (Although why there would be a flying entity called V’Ger this many AUs from the sun still remains unclear.)

Spock advocates a thorough medical analysis, and IliaBot is whisked off to the Enterprise’s sick bay, where she is scanned on a futuristic medical table:

Alien, also released in 1979, features a remarkably similar scanner in the Nostromo’s onboard “autodoc.” Alien’s version didn’t make it into the original theatrical cut of the movie, but you can see it in the 2003 director’s cut, in which it is used to scan the body of a recently face-hugged Kane:

Both of these devices look to be inspired by the just-invented science of X-ray computed tomography (better known today as a CT scan), for which Allan M. Cormack and Godfrey N. Hounsfield shared the 1979 Nobel Prize in Physiology or Medicine.

These two scanning devices may have seemed futuristic to 1979 viewers, but as CT scanners became more common, later sci-fi outings had to up their game to keep one step ahead of reality. Lost in Space (1998) and the “Ariel” episode of Firefly (2002) both moved to holographic scanners, projecting a virtual image of a patient’s innards above their actual body instead of on a screen.

Dr. Judy Robinson’s body is holographically scanned for a heartbeat in 1998’s Lost in Space.
River Tam’s body is scanned by a holo-imager device in 2002’s “Ariel” episode of Firefly.

More recent movies have adopted even more advanced approaches. Prometheus (2012), Elysium (2013), and Passengers (2016) all feature devices that perform actual surgery without the need for a doctor.

An ailing Dr. Elizabeth Shaw staggers toward a Pauling MedPod in 2012’s Alien prequel, Prometheus.
Frey Santiago places her daughter, Matilda, in a Med-Bay device in 2013’s Elysium. The Med-Bay scans her, detects leukemia, and cures her via the magical process of “re-atomizing.”
Aurora Lane resuscitates a just-dead Jim Preston in 2016’s Passengers. As on the Nostromo in Alien, the medical gadget on the starship Avalon is known as an “autodoc.”

These devices are definitely one step ahead of present-day robotic surgery, which focuses on improving the precision of human surgeons rather than removing the need for them altogether. Their nearest real-world equivalent is the da Vinci Surgical System from Intuitive Surgical, which was approved by the FDA in 2000 but is still controlled by a specially trained surgeon from a nearby console.

Intuitive Surgical’s da Vinci Surgical System: a surgeon’s control console (left) and a patient cart with instruments tray (right)

The most impressive movie medical gadget, however, is surely the reconstruction device seen in 1997’s The Fifth Element, which can re-create an entire living being from a single bone fragment. Despite recent real-world advancements in growing human replacement organs, we can be confident that this device will remain futuristic for some time yet.

The Fifth Element’s body-reconstruction machine attaches new bone to an existing fragment.

As these medical devices show, the threat of technology catching up with the future is a perennial problem for science fiction, especially in a franchise as long-running as Star Trek. In the time between The Original Series (1966–69) and The Motion Picture (1979), hand-held communicators of the type used by the original Enterprise crew had gone from a futuristic possibility to a technical reality. As a result, Gene Roddenberry felt a new style of communicator was needed to make The Motion Picture feel like it was set in the future. In the place of hand-held devices, The Motion Picture introduced wrist-based communicators, worn at all times by Enterprise crew members.

Captain Kirk shows off his fancy wrist-based communicator while making a firm point.

Despite their always-available convenience, these wrist-based communicators are used to advance the plot in only two scenes—when Decker tells the crew to return to their stations, and when Kirk speaks to Uhura from beside the V’Ger craft. Indeed, it’s notable that earlier in the movie, Kirk uses a desktop comms panel in the Enterprise’s sick bay rather than speaking into the device on his wrist:

Real-world wrist-based devices such as Apple’s cellular Apple Watch don’t require the device’s built-in microphone to be held anywhere near the face in order to be effective. However, in a movie, it’s necessary to move the device close to the actor’s mouth to make it clear that it is being used to communicate. (The alternative—keeping the actor’s wrist by his or her side—makes it look like the actors are simply talking to themselves.) This practical storytelling disadvantage, with the potential to mask the actor’s face, may be why the otherwise futuristic wrist communicators were used sparingly in The Motion Picture—and were quietly replaced by more traditional handheld devices in later Star Trek movies.

Nearly two hours into the movie, Decker becomes the first crew member to use a wrist-based communicator to advance the plot, telling all personnel on the Enterprise to resume their stations. He inadvertently obscures his face while doing so.
As part of the movie’s climax, Kirk uses his wrist-based communicator to speak to the Enterprise-bound Uhura from beside the V’Ger craft.

The medical scanner’s analysis tells Bones that IliaBot is made from “micro-miniature hydraulics, sensors, and molecule-size multiprocessor chips.” However, despite the spectacular level of detail in this mechanical marvel, V’Ger’s creation is sadly missing any sign of emotion. An optimistic Decker takes it on a tour of the recreation deck, one of Ilia’s favorite hangouts pre-robotification, to try to connect with its non-robotic side. Here, Decker introduces IliaBot to a set of illustrations of five historical ships called Enterprise:

From left to right, they are:

1) The USS Enterprise (1799–1823), in 1812 when rigged as a brigantine
2) The USS Enterprise (CV-6), an aircraft carrier that became the most decorated ship in World War II
3) The space shuttle Enterprise (OV-101), NASA’s prototype orbiter
4) The USS Enterprise (XCV-330), an early Star Trek craft that never actually appeared on screen. The same illustration does, however, appear on a wall in “First Flight,” a 2003 Star Trek: Enterprise episode. A model of the XCV-330 also appears briefly on Admiral Marcus’s desk in 2013’s Star Trek Into Darkness, as part of another collection of significant flying machines.
5) The original USS Enterprise (NCC-1701), as it looked in The Original Series before its Motion Picture refit. (I haven’t been able to track down the exact image used here, but it is representative of nearly every Original Series planet flyby, albeit from right to left rather than the usual left to right.)

The USS Enterprise (CV-6) is not the only aircraft carrier to have a Star Trek connection. Its successor, the nuclear-powered CVN-65, appears in Star Trek IV: The Voyage Home, when Chekov and Uhura attempt to steal some nuclear power to recrystallize their dilithium. We can be confident that they steal it from the USS Enterprise (CVN-65), because the US Navy has conveniently left out some large signs that say “USS ENTERPRISE CVN-65”:

The USS Enterprise (CVN-65) in 1986’s Star Trek IV: The Voyage Home

UPDATE: Actually, it turns out we can’t be confident that this is the USS Enterprise (CVN-65), because it was out at sea when this scene was filmed. The non-nuclear USS Ranger (CV-61) stood in for it instead.

The inclusion of the space shuttle Enterprise (OV-101) in this gallery is also of note. NASA’s prototype space shuttle was originally going to be called Constitution, to celebrate its unveiling on the anniversary of the signing of the United States Constitution. However, Star Trek superfan Bjo Trimble, who had previously led a successful letter-writing campaign to bring back The Original Series for a third season, launched a new campaign to encourage fans to ask for the shuttle’s name to be changed to Enterprise. Tens of thousands of fans responded. This led President Gerald Ford’s senior economic adviser, William F. Gorog, to write a memo to the president, advocating for the name change:

Memo from William F. Gorog to President Gerald Ford, September 3, 1976. The “public interest… the CB radio provided” refers to Betty Ford, First Lady of the United States, who jumped on the 1970s craze for CB radio with the handle First Mama. This turned out to be great PR.

Four days later, Ford received a follow-up memo from James Connor, secretary to the cabinet. Ford was convinced, and the Constitution became the Enterprise.

Memo from James Connor to President Gerald Ford, September 7, 1976. Feedback on the naming was provided by Philip Buchen, legal adviser; Brent Scowcroft, national security adviser; James Cannon, domestic policy adviser; Guy Stever, science adviser; Robert Hartmann, counselor to the president; and Jack Marsh, counselor to the president.

When the shuttle was unveiled on September 17, 1976, many of the Star Trek cast were in attendance as special guests, along with creator Gene Roddenberry. To complete the occasion, the Air Force band struck up a surprise rendition of the Star Trek theme in their honor.

(left to right) Dr. James C. Fletcher (NASA administrator); DeForest Kelley (Bones); George Takei (Sulu); James Doohan (Scotty); Nichelle Nichols (Uhura); Leonard Nimoy (Spock); Gene Roddenberry; Don Fuqua (chairman of the House Space Committee); Walter Koenig (Chekov).

Paramount wisely made the most of the announcement’s PR value, taking out a full-page ad in the New York Times a few days later to announce the movie to the world:

As a prototype, the space shuttle Enterprise was constructed without engines or a functional heat shield, and was therefore incapable of independent spaceflight. The photo seen in The Motion Picture’s recreation room looks to be from the Enterprise’s final test flight in 1977’s approach and landing tests, during which the Enterprise was flown atop a Shuttle Carrier Aircraft (a heavily modified Boeing 747), then jettisoned from its perch using explosive bolts. After its release, it glided to a landing on the runway at Edwards Air Force Base. (The engines seen on the back of the craft in the photo are dummy engines, for aerodynamic test purposes only.) Despite initial plans to retrofit the Enterprise as a spaceflight-capable vehicle, this ended up being financially impractical, and the Enterprise never made it into orbit. Nonetheless, its 1977 test landing happened just in time for a photo of OV-101 to appear aboard NCC-1701.

The space shuttle Enterprise might not have made it into space, but some of its campaigners did. As a thank-you for the letter-writing efforts, Star Trek creator Gene Roddenberry asked Trimble to organize a cattle call of fans to appear in the recreation room scene. The Enterprise crew members shown listening to Kirk’s impassioned speech in The Motion Picture are all Trekkers, in a range of costumes and latex alien masks:

UPDATE: According to none other than Bjo Trimble herself in the comments below, the crew in this scene is actually a mix of Trek fans and professional extras (at the insistence of the Screen Extras Guild). The good news is that fans were paid the same as the professional extras for their time; the bad news is that both fans and extras were on set from 5am until midnight, as the scene was filmed in one day rather than the expected two. Bjo herself is in a white uniform, halfway back, behind an extremely tall extra.

The space-faring hopes of real-world Enterprise craft didn’t end with the space shuttle Enterprise, however. In December 2009, Virgin Galactic unveiled its SpaceShipTwo spacecraft, as part of an attempt to become the world’s first commercial space carrier:

VSS Enterprise during its first supersonic powered flight.

From a Virgin Galactic press release of the time:

In honour of a long tradition of using the word Enterprise in the naming of Royal Navy, US Navy, NASA vehicles and even science fiction spacecraft, Governor Schwarzenegger of California and Governor Richardson of New Mexico will today christen SS2 with the name Virgin Space Ship (VSS) ENTERPRISE. This represents not only an acknowledgement to that name’s honorable past but also looks to the future of the role of private enterprise in the development of the exploration, industrialisation and human habitation of space.

Sadly, this Enterprise didn’t make it into space either. During a powered test flight in October 2014, the VSS Enterprise was destroyed after a premature deployment of its descent device.

In a twist of hope for The Motion Picture fans, the VSS Enterprise’s successor was nearly named VSS Voyager. Its final name, however, was chosen by physicist Stephen Hawking, who christened the new craft VSS Unity instead.


While Decker gives Ilia a tour, Spock sneaks outside in a spacesuit and flies through V’Ger’s pulsating orifice with pinpoint timing. He encounters a giant pseudo-Ilia, with whom he attempts a mind-meld. A dazzling montage of images flashes up on his space helmet during the meld, including one of particular interest:

This illustration is from the six-by-nine-inch gold-anodized aluminum plaque mounted aboard NASA’s Pioneer 10 and Pioneer 11 spacecraft, which traveled to Jupiter (and in the case of Pioneer 11, to Saturn) in the late 1970s:

Pioneer 10’s engraved aluminum plaque

These plaques were created by author Carl Sagan and SETI (Search for Extraterrestrial Intelligence) Institute founder Frank Drake, and illustrated by Linda Salzman Sagan, as a message from mankind to any passing extraterrestrials that might chance upon the flights.

The purpose of these plaques was to educate inhabitants of other planetary systems about Pioneer’s home planet. Pioneer 10 is shown leaving the solar system after passing Jupiter, though the arrow symbol indicating its trajectory is unlikely to make sense to extraterrestrial species unless they, too, evolved from a hunter-gatherer society.

Pioneer 10’s aluminum plaque, mounted facing inward on the craft’s antenna support struts to shield it from erosion by interstellar dust

This Pioneer plaque also appears briefly in Star Trek V: The Final Frontier, though it is erroneously facing outward for shot-framing convenience. It doesn’t really make a difference, as the Pioneer craft is unceremoniously blown up by a passing extraterrestrial, without so much as a glance at the engraving. (The moral: Let’s hope mankind’s actual first contact does not involve a Klingon.)

The aluminum plaque, shown facing outward on a Pioneer craft…
…shortly before its destruction by a bored Klingon captain in 1989’s Star Trek V: The Final Frontier.

Pioneer’s successor, the Voyager program, also contained a message to extraterrestrial life. For Voyager, however, Sagan went one better than a plaque, sending an entire album of Earth facts into the cosmos. And when I say “album,” I really do mean it in the old-school vinyl LP sense. Voyager 1 and 2 each carried a golden record of earthly sights and sounds, along with engraved instructions as to how to play the record.

Engraved cover of the Voyager golden record
Side one of the Voyager golden record, The Sounds of Earth

The Voyager golden records include music, spoken phrases, a recording of human brain-waves, and a series of encoded images including food, anatomy, biology, animals, and architecture. The presence of these images showing Earth’s amazingness raised concerns around the launch of Voyager 1 and Voyager 2 that if an intelligent species ever did encounter one of the records, it might decide to invade or attack Earth to control that amazingness itself. Still, I’m sure there’s no chance of that happening in practice, right?

After a tense standoff during which a mysterious being named “V’Ger” threatens to attack Earth to control its amazingness, the Enterprise crew manage to gain an audience with V’Ger itself. Much to everyone’s surprise, the all-powerful creature at the center of the evil space cloud turns out to be an old-school NASA spacecraft. As the crew inspects the craft, Kirk rubs some dirt off a sign on the side and reads out, “V-G-E-R. V’Ger.” Oh my goodness—it’s a hypothetical Voyager 6!

The timing of The Motion Picture’s release is important when considering the cultural significance of this scene. Voyager 2 launched on August 20, 1977, followed by Voyager 1 a couple of weeks later. (Voyager 1 overtook its sister craft in December 1977, hence the unusual ordering.) Both craft completed their observations of Jupiter during 1979, shortly before The Motion Picture was released. In the same way that 1968’s 2001: A Space Odyssey leapfrogged the Apollo moon missions and aimed for Jupiter, 1979’s The Motion Picture leapfrogged Jupiter and imagined humanity sending a probe far beyond its 1970s counterparts.

The idea of a rogue Voyager would not have been alien to seventies viewers, either. Shortly after the launch of Voyager 2, the probe started triggering its onboard systems independently, ignoring the post-launch instructions of its creators. The extreme shaking of Voyager 2’s launch rocket had made the probe think it was failing, causing it to switch to its backup systems and try to figure out what was going on. Indeed, for a few days NASA engineers on Earth had no idea whether the Voyager 2 mission had been lost altogether. Newspapers ran headlines such as “‘Mutiny’ in Space” and “Voyager Going to HAL?”, with 2001’s antagonist HAL 9000 clearly still a reference point for space-based computers turning rogue. Thankfully, Voyager 2’s trajectory stabilized and conditions returned to normal operating thresholds, ending its temporary rebellion.

Front page of the Pasadena Star-News, Monday, August 29, 1977. The headline “Voyager Going To HAL?” compares Voyager 2 with the evil space-based computer from 2001: A Space Odyssey.

The real-world Voyager 1 and 2 were sadly not succeeded by Voyagers 3 through 6, leaving The Motion Picture to imagine what the program’s future might have been. Its ambitions are certainly grand—during the movie’s finale, Decker suggests that V’Ger traveled to the far side of the galaxy after falling into “what they used to call a ‘black hole’”. This is impressive, given that the nearest black hole to Earth is HR 6819, which is about 1,120 light years away from our solar system. Even traveling at its current speed of 38,000 mph, it would take Voyager 1 nearly 20 million years to reach this black hole. (No amount of Eurostile Bold Extended can set The Motion Picture that far in the future.)
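
A rough back-of-the-envelope check of that twenty-million-year figure, taking one light year as roughly 5.88 trillion miles:

\[
\frac{1{,}120\ \text{ly} \times 5.88\times10^{12}\ \text{mi/ly}}{38{,}000\ \text{mph} \times 8{,}766\ \text{h/yr}} \approx 2\times10^{7}\ \text{years}
\]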


Back in the movie’s finale, the Enterprise crew deduces a remarkable amount of V’Ger’s life story from zero evidence, thereby accelerating the exposition nicely. Without any assisting typography, they determine that V’Ger wants to physically join with a human to become one with its creator. Decker bravely volunteers to join with IliaBot. He turns all sparkly, and his hair blows crazily in the breeze. Ilia’s does not. As their join completes, the entirety of V’Ger turns sparkly and disappears to leave nothing but the Enterprise, flying pointedly toward the camera.


As we wave goodbye to the Enterprise, we should note that unlike many sci-fi movies, The Motion Picture has not once shown a present-day company in a futuristic setting. To make up for it, Italian car manufacturer Fiat created a movie tie-in ad for the launch of its own spacey craft, the Fiat Panda:

The text on the poster can be translated as: “The human adventure is just beginning. [This was The Motion Picture’s tagline.] Fiat Panda. The conquest of space.”

To find out if this is a fair comparison on Fiat’s behalf, let’s pitch the USS Enterprise refit against the Panda 30:

                                  USS Enterprise (refit)   Fiat Panda 30
Launched                          Jan 2244                 Feb 1980
Length (m)                        305.1                    3.38
Shape                             Curvy                    Boxy
Gross Weight (tons)               235,200                  1.05
Top Speed (mph)                   522,201,121,422 (1)      71
Max Acceleration                  45 g                     0.08 g (2)
Officer Complement                76                       1
Crew Complement                   426                      0
Passenger Capacity                150                      One in the front, two in the back
7-Position Adjustable Rear Seat   No                       Yes

Here’s that seven-position adjustable rear seat in action, converting from fold-up storage to a comfortable bed:

Impressive as the Enterprise undoubtedly is, I think we can all agree that the Panda is the clear winner here.

Dave Addey


The article above is an expanded chapter from the Typeset in the Future book, which you should absolutely go and buy while you’re all excited about sci-fi typography. There’s even a Japanese edition! The book also contains an in-depth interview with Mike Okuda, designer of many Star Trek movies and TV shows, and creator of the LCARS computer interface first seen in Star Trek: The Next Generation.

If you’re not yet sold by just how much you need the TITF book in your life, you can always check out my other articles, including 2001: A Space Odyssey, Alien, Blade Runner, WALL·E, and Moon. And if you’d prefer even more Star Trek font trivia, I highly recommend The Fonts of Star Trek by Yves Peters.


(1) USS Enterprise (refit) max speed = warp factor 9.2
Warp speed calculation: speed = (warp factor)³ × c
speed = (9.2 × 9.2 × 9.2) × 299,792,458 m/s
speed = 778.69 × 299,792,458 m/s
speed = 233,444,789,535 m/s
speed = 522,201,121,422 mph

(2) 0–60 mph in 34 sec ≈ 0.08 g

by Dave Addey at Monday, 2020-08-17 15:51

2020-08-02

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Loose-leaf collections with LaTeX

For a special project I was looking for a way to create a loose-leaf collection with LaTeX. In this context, “loose-leaf collection” means:

1) Each (logical) page lives in a separate subdocument.

2) The file name of each subdocument contains its page number.

3) Individual pages can therefore be swapped out easily without the page numbers of the other pages changing.

4) Each subdocument normally consists of just one physical text page, but can also be longer. Each subdocument therefore carries page numbers of the form xx-yy, where xx is the logical page number in the main document (taken from the file name) and yy is the running (physical) page number within the subdocument itself. The running page number of the overall document is not used.

5) All documents are, of course, under version control; here I use Subversion.

How did I implement it?

The main document looks as follows. Besides the svn-multi package, which provides the Subversion data, I use varsfromjobname to extract the individual components of the file names (xx-yy.tex). The individual parts of the header and footer are modified with scrlayer-scrpage.

%!TEX TS-program = Arara
% arara: pdflatex: {shell: yes}
% arara: pdflatex: {shell: yes}

\documentclass[12pt,ngerman,headsepline,footsepline]{scrartcl}
\usepackage[utf8]{inputenc}
\usepackage[T1]{fontenc}
\usepackage{babel}
\usepackage{csquotes}
\usepackage{xcolor}
\usepackage{svn-multi}
\usepackage{currfile}
\usepackage{varsfromjobname}

\usepackage{blindtext}
\addtokomafont{pagenumber}{\textit}
\addtokomafont{sectionentrypagenumber}{\color{white}}
\usepackage[left=3cm,right=1.5cm,top=2cm,bottom=3cm]{geometry}

\usepackage[headsepline=0.25pt,footsepline=0.25pt]{scrlayer-scrpage}
\pagestyle{scrheadings}

\svnidlong{$HeadURL: svn://192.168.0.80/Dokumente/Uwe/Main.tex $}
{$LastChangedDate: 2020-07-16 10:42:03 +0200 (Do, 16 Jul 2020) $}
{$LastChangedRevision: 62 $}
{$LastChangedBy: uwe $}

\title{Main Document}
\author{Max Mustermann}

\begin{document}
	
\tableofcontents

\include{sub-01}
\include{sub-02}

\end{document}

Each subdocument looks as follows. The *foot and *head commands fill the page headers and footers with the components from the file name; \addsec adds the documents to the table of contents.

%!TeX root=main.tex

\svnidlong
{$HeadURL: svn://192.168.0.42/Dokumente/Uwe/Sub-01.tex $}
{$LastChangedDate: 2020-07-16 10:10:19 +0200 (Do, 16 Jul 2020) $}
{$LastChangedRevision: 60 $}
{$LastChangedBy: uwe $}

\setcounter{page}{1}

\ohead{V\svnfilerev\ vom \svnfileday.\svnfilemonth.\svnfileyear} % outer head: revision and date
\ofoot[\gettwofromcurrfilename-\pagemark]{\gettwofromcurrfilename-\pagemark} % outer foot: xx-yy page number
\ifoot[\currfilename]{\currfilename} % inner foot
\ihead[]{} % inner head
\cfoot[]{} % center foot
\chead[]{} % center head


\addsec{\gettwofromcurrfilename~\getonefromcurrfilename}


\blindtext[6]

Example document: Main.pdf

Uwe

Uwe Ziegenhagen likes LaTeX and Python, sometimes even combined. If you like my content and would like to thank me for it, consider making a small donation to my local fablab, the Dingfabrik Köln. Details on how to donate can be found here: Spenden für die Dingfabrik.


by Uwe at Sunday, 2020-08-02 12:50

2020-07-25

TUG

Glisterings: The Book

TUG has published a third book, Glisterings: LaTeX and Other Oddments, by Peter Wilson, a collection of 27 columns on LaTeX and other topics written for TUGboat. It is softcover, 130 pages, 8.5x11, ISBN 098246262X, available from TUG for $20 including shipping, or from Amazon (more links on the book's page). The original columns as published in TUGboat are available to all in the journal archives. For this book, they were corrected and augmented where needed. The author also wrote a brief introduction and prepared an extensive index and bibliography. This book joins the two others published by TUG some years ago: TeX People: Interviews from the world of TeX, and TeX's 2^5 Anniversary: A Commemorative Collection. Although the book is not specifically related to the TUG'20 conference going on as we write this, it seemed a good time to announce it! Please join us at the conference (link for Zoom clients, meeting ID 95051714640, YouTube link on conference pages). Hope everyone enjoys!

Saturday, 2020-07-25 15:23

2020-07-23

TUG

TUG 2020 conference starts July 24

The regular talks for the TUG'20 conference will start tomorrow (as we write this), at 9am US/Pacific (16:00 UTC) with a keynote talk by Steve Matteson, creative type director at Monotype. The conference's link for Zoom clients (meeting ID 95051714640), as well as the link for streaming via YouTube are posted on the conference web pages, along with the full program and abstracts, and more. The YouTube link will change occasionally during the conference, unfortunately, so if it doesn't work, check back. Hope everyone enjoys!

Thursday, 2020-07-23 21:32

2020-07-22

LaTeX Project

Second pre-release of LaTeX 2020-10-01 is available for testing

The second LaTeX pre-release for 2020-10-01 is available for testing

A few days ago we submitted a new LaTeX development format [1] to CTAN, and by now it should be available to all users of MiKTeX or TeX Live (on any operating system).

Main features of the second pre-release for 2020-10-01

The first pre-release, distributed at the end of May, had the following two main features (besides bug fixes):

  • the functionality of the xparse package was added directly to the LaTeX kernel
  • LaTeX’s font series handling was improved

This second pre-release adds one major new component to LaTeX: a general hook management system to improve package interoperability and enable easier customization and extension of LaTeX.

A general hook management system for LaTeX

Most LaTeX users and package writers will know the handful of hooks that LaTeX has been offering until now, the most important one perhaps being \AtBeginDocument. These are important hooks, but they are far too few, so in many cases package developers had to directly patch the internals of LaTeX. This resulted in many problems.

With the new hook management system, LaTeX will get many more hooks that package writers (and authors) can use to add code in a controlled and reliable way. New hooks have been added in a number of places by using the new system and more will follow over time. Available now are:

  • Hooks to add code before and after environments (formerly offered through the etoolbox package);
  • Hooks used when loading files, packages, or classes (similar to what the filehook package now provides);
  • Hooks in the page-building process (e.g., functionality previously available through packages such as atbegshi or atveryend and a few others).

The important point here is not so much that the functionality of these packages has been integrated into the LaTeX kernel, but that the hook management system provides a single structured way for different packages to reliably add and order code. This will resolve many of the inter-package interoperability issues which formerly could be resolved (if at all) only by loading the packages in a specific order, or by the use of complex and fragile code inside the packages to account for various scenarios in user documents.
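
As a minimal sketch of what this looks like in practice (the generic interface is \AddToHook; the hook names below follow the lthooks documentation listed next, which remains the authoritative reference):

\documentclass{article}
% Requires a LaTeX format with the hook management system
% (this pre-release or the final 2020-10-01 release).
\AddToHook{env/quote/begin}{\itshape}% runs inside the group, right after \begin{quote}
\AddToHook{enddocument}{\typeout{All done!}}% kernel hook, cf. \AtEndDocument
\begin{document}
Some text.
\begin{quote}
This quotation is italic, courtesy of the env/quote/begin hook.
\end{quote}
\end{document}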

The hook management system is currently described in these three documents (for the final release they will be consolidated):

  • texdoc lthooks — The description of the interfaces and the core hooks already added to the kernel.
  • texdoc ltshipout — The documentation of the hooks available during the page production process.
  • texdoc ltfilehook — hooks that can be used before or after a file gets loaded.

Other fixes and improvements

A full list of all fixes and additions is given in a draft version of ltnews32, which you should be able to read by running

texdoc ltnews32

on the command line (or by any other means available on your operating system—somewhere there should be a file called ltnews32.pdf that you can open with a PDF reader). The draft version of this file is also available from our website as LaTeX2e News Issue 32 draft.

Outlook

We expect to produce a third and final pre-release incorporating the user feedback we receive and consolidating some of the documentation. A few additional outstanding issues are expected to get fixed as well, but nothing major — the main functionality planned for the fall release is available already now with the second pre-release.

Please help with the testing

We are issuing this second pre-release now in the hope that you will help us by making sure that all the enhancements and fixes we have provided are safe and that they do not have any undesired side effects, so please help with the testing if you can.

This development format allows you to test the upcoming LaTeX release scheduled for 2020-10-01 with your documents or packages. Such testing is particularly important for package maintainers to verify that changes to the LaTeX core haven’t introduced incompatibilities with existing code. We try to identify any such problems beforehand, but such an undertaking is necessarily incomplete, which is why we are asking for user testing.

Besides developers, we also ask ordinary users to try out the new release, because the more people that test the new format, the higher the chances that any hidden problems are identified before the final release in October hits the streets.

Processing your documents with the pre-release is straightforward. All you have to do is replace the invocation command by appending -dev to the executable name, e.g., on the command line you would run

pdflatex-dev myfile    or    lualatex-dev myfile    or    xelatex-dev myfile

instead of using pdflatex, lualatex or xelatex. If you use an integrated editing environment, then how to configure it to use an alternative format depends on the system; but in any case the necessary modification should be straightforward.

Enjoy — Frank

  1. The internal version number for the pre-release is LaTeX2e <2020-10-01> pre-release-7; the first five pre-releases just mirrored the patch releases we did for 2020-02-02. 

Wednesday, 2020-07-22 00:00

2020-07-21

LaTeX Project

New articles by project members

New articles by project members

We have added five articles recently published in TUGboat to the site.

  • Enrico discusses aspects of mathematical typesetting,
  • Frank’s article discusses a new package that implements an improved algorithm for float handling,
  • Joseph looks at case changing in the Unicode world and the LaTeX3 tools available,
  • Ulrike’s article gives a short overview of the ongoing work on producing tagged PDF documents, and
  • Ulrike and Marcel discuss the use of LuaLaTeX with HarfBuzz in a case study.

You will find all of them on the Publications from 2020 page.

Tuesday, 2020-07-21 00:00

2020-07-19

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Appendix notes for Beamer presentations

Remark: An English version of this article has been published on latex.net.

Imagine that we have to defend a thesis in a presentation. Follow-up questions may come up that then need to be answered in greater depth.

We could put the relevant material directly into the presentation, but then we might have to skip slides during the talk, which is not good presentation style either. Simply placing the additional content at the end is also bad; in the worst case we would have to jump back and forth during the presentation.

An elegant solution is provided by the beamerappendixnote package. It uses two commands, \appxnote and \printappxnotes, to integrate material for the appendix.

The first command creates the link to the appendix and the link text to be displayed; the second command typesets all appendix notes.

\documentclass[12pt,ngerman]{beamer}
\usepackage[T1]{fontenc}
\usepackage{babel}
\usepackage{beamerappendixnote}

\begin{document}

\begin{frame}
\frametitle{Some Beamer Slide}

\begin{itemize}
	\item Some stuff
	\item that requires
	\item more background 
\end{itemize}

\appxnote{Proof}{We can easily prove this.}

\end{frame}

\printappxnotes
	
\end{document}

Uwe

Uwe Ziegenhagen likes LaTeX and Python, sometimes even combined. If you like my content and would like to thank me for it, consider making a small donation to my local fablab, the Dingfabrik Köln. Details on how to donate can be found here: Spenden für die Dingfabrik.


by Uwe at Sunday, 2020-07-19 19:31

LaTeX.net

Sorting Glossaries with bib2gls

In previous posts I discussed the glossaries package (which can be extended with the glossaries-extra package) and also using makeindex or xindy to sort the terms that had been indexed in the document. This post describes another tool, bib2gls, which can be used instead of makeindex or xindy.

Makeindex and xindy are both general purpose indexing applications, but bib2gls was designed specifically for use with glossaries-extra and works in a very different way. With makeindex and xindy, commands like \gls add indexing information to an external file. This information consists of the sort value, hierarchical information (if the term has a parent entry), the associated formatting code to display the entry in the glossary and the location (normally the page number) where the indexing occurred. The indexing application sorts and collates this information and writes all the code needed to format the glossary in another file, which is input by \printglossary.

By contrast, bib2gls doesn’t create a file containing the code to typeset the glossary. Instead, it parses bib files containing definitions of terms, symbols and abbreviations, then selects the entries that are required by the document and writes the LaTeX code (using commands provided by glossaries-extra.sty) that defines these terms in a file that’s input by the preamble-only command \GlsXtrLoadResources. All the indexing information (such as the locations) is stored in internal fields associated with each entry. If a glossary is required, it can be displayed with \printunsrtglossary (or \printunsrtglossaries, which does \printunsrtglossary for each defined glossary).

This means that you can have a large database of all entries defined in a bib file (or across multiple bib files) that can be managed in an application such as jabref. If you have a document that only requires, say, 10 from a database of 1000 entries, then LaTeX only needs to use the resources required to define those 10 entries, which can improve the document build time.

\printunsrtglossary

The \printunsrtglossaries command was briefly mentioned in an earlier post. When using bib2gls it helps to understand how \printunsrtglossary works. The “unsrt” part of the name stands for “unsorted” because this command simply iterates over all defined entries in the order in which they were defined. There is no attempt to sort entries or gather child entries, although letter group headings are inserted whenever a change in group is detected. As was previously illustrated, this can cause strange results. Consider the following example:

\documentclass{article}
 
\usepackage[style=indexgroup]{glossaries-extra}
 
\newglossaryentry{parrot}{name={parrot},
  description={mainly tropical bird with bright plumage}
}
 
\newglossaryentry{armadillo}{name={armadillo},
  description={nocturnal insectivore with large claws}
}
 
\newglossaryentry{zebra}{name={zebra},
  description={wild African horse with black-and-white stripes}
}
 
\newglossaryentry{duck}{name={duck},
  description={a waterbird with webbed feet}
}
 
\newglossaryentry{aardvark}{name={aardvark},
  description={nocturnal African burrowing mammal}
}
 
\newglossaryentry{macaw}{name={macaw},
 parent={parrot},
 description={long-tailed, colourful parrot}
}
 
\newglossaryentry{mallard}{name={mallard},
 parent={duck},
 description={dabbling duck with long body and broad bill}
}
 
\newglossaryentry{ara}{name={Ara},
 parent={macaw},
 description={neotropical genus of macaw}
}
 
\newglossaryentry{northernpintail}{name={northern pintail},
 parent={duck},
 description={long-necked, small-headed duck with curved back}
}
 
\newglossaryentry{anodorhynchus}{name={Anodorhynchus},
 parent={macaw},
 description={genus of large blue macaws}
}
 
\begin{document}
\printunsrtglossary
\end{document}

This produces a strange result.

Image of glossary listing entries in order of definition.

The list is in the order of definition. The parent key simply provides the hierarchical level for the benefit of the glossary style. The level starts at 0 for top-level (parentless) entries, then 1 for an entry that has a parent but no grandparent, 2 for an entry with a grandparent but no great-grandparent, etc. The glossary style may (or may not) apply a different format for entries according to their hierarchical level. In the case of the indexgroup style used here, different indenting is applied. In this example, this has led to a rather strange visual appearance that makes it look as though “macaw”, “mallard” and “northern pintail” are sub-items of “aardvark”, and the macaw genera appear to be sub-items of “mallard” and “northern pintail”.

The default behaviour of bib2gls is to hierarchically sort all the required entries before writing them to the file that’s input by \GlsXtrLoadResources. This means that \printunsrtglossary should then list them in an appropriate order.

While \printunsrtglossary iterates over all the entries for the given glossary, it inserts the group headers as follows:

  1. If the current entry doesn’t have a parent, then:
    1. find the group label associated with the current entry
    2. if the group label is different to the previous group label then:
      1. if there’s no previous group label just insert \glsgroupheading{label}
      2. if there is a previous group label insert \glsgroupskip\glsgroupheading{label}

The way the group label is obtained depends on the glossaries-extra package options and on bib2gls settings. Whether or not the group heading is actually displayed depends on the glossary style. Some styles simply redefine \glsgroupheading to do nothing (but \glsgroupskip may still create a vertical space).
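
Schematically, the code this produces for a glossary whose top-level entries fall into letter groups “A” and “B” looks something like the following (an illustrative sketch, not the actual internal implementation of \printunsrtglossary):

\glsgroupheading{A}% first group: heading only
% ... top-level “A” entries (and their children) ...
\glsgroupskip
\glsgroupheading{B}% subsequent group change: skip, then heading
% ... top-level “B” entries ...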

Switching to bib2gls

Let’s rewrite the above example so that it uses bib2gls. The first thing that needs to be done is to create a bib file that contains all the entry definitions. If you already have an existing .tex file that contains all your entries then bib2gls comes with a convenient command-line application called convertgls2bib, which can parse a .tex file for commands like \newglossaryentry and \newabbreviation and create a bib file suitable for use with bib2gls. For example, if the above document is contained in the file myDoc.tex then run the following from a command prompt:

convertgls2bib -p myDoc.tex entries.bib

(The -p switch indicates that convertgls2bib should only parse the document preamble.) This will create the file entries.bib that contains the following:

% Encoding: UTF-8
@entry{parrot,
  name = {parrot},
  description = {mainly tropical bird with bright plumage}
}
 
@entry{armadillo,
  name = {armadillo},
  description = {nocturnal insectivore with large claws}
}
 
@entry{zebra,
  name = {zebra},
  description = {wild African horse with black-and-white stripes}
}
 
@entry{duck,
  name = {duck},
  description = {a waterbird with webbed feet}
}
 
@entry{aardvark,
  name = {aardvark},
  description = {nocturnal African burrowing mammal}
}
 
@entry{macaw,
  parent = {parrot},
  name = {macaw},
  description = {long-tailed, colourful parrot}
}
 
@entry{mallard,
  parent = {duck},
  name = {mallard},
  description = {dabbling duck with long body and broad bill}
}
 
@entry{ara,
  parent = {macaw},
  name = {Ara},
  description = {neotropical genus of macaw}
}
 
@entry{northernpintail,
  parent = {duck},
  name = {northern pintail},
  description = {long-necked, small-headed duck with curved back}
}
 
@entry{anodorhynchus,
  parent = {macaw},
  name = {Anodorhynchus},
  description = {genus of large blue macaws}
}

The definitions now need to be removed from myDoc.tex and the record package option is required. The stylemods option is also useful (but not essential). This will load the glossaries-extra-stylemods package, which modifies the predefined styles provided by glossaries.sty to make them easier to adjust and also to make them integrate better with bib2gls.

\usepackage[record,stylemods,style=indexgroup]{glossaries-extra}

The \GlsXtrLoadResources command is also required. This writes information to the aux file for the benefit of bib2gls as well as inputting the file created by bib2gls (if it exists). By default, bib2gls will only select entries that have been indexed in the document (using commands like \gls) and any dependent entries. In this example, I want to select all entries that have been defined in my entries.bib file, which means I need to use the selection=all option.

\GlsXtrLoadResources[src=entries,selection=all]

The src option indicates the name or names of the bib files where the entry data is defined. In this case, there’s only one file. If I have additional files, they need to be specified in a comma-separated list. Note that this means that braces will be required to prevent the comma from being parsed as part of the key=value list. For example, if I also have entries in the file abbreviations.bib, then I would need:

\GlsXtrLoadResources[src={entries,abbreviations},selection=all]

The complete document is now:

\documentclass{article}
 
\usepackage[record,stylemods,style=indexgroup]{glossaries-extra}
 
\GlsXtrLoadResources[src=entries,selection=all]
 
\begin{document}
\printunsrtglossary
\end{document}

The document build process is similar to using makeglossaries, but bib2gls is used instead:

pdflatex myDoc
bib2gls myDoc
pdflatex myDoc

(Alternatively, use xelatex etc instead of pdflatex.) This produces the glossary shown below:

Image of glossary with entries in alphabetical hierarchical order.

The ordering is now correct: aardvark, armadillo, duck (with sub-entries mallard and northern pintail), parrot (with sub-entry macaw that has sub-entries Anodorhynchus and Ara) and zebra. However, there are no letter group headings, even though the indexgroup style has been used. This is because bib2gls doesn’t set the group labels by default (not all glossaries require them). If you need the glossary separated into letter groups then you should instruct bib2gls to set the group labels using the --group (or -g) switch:

pdflatex myDoc
bib2gls --group myDoc
pdflatex myDoc

If you are using arara you need to add the following lines to your document:

% arara: pdflatex
% arara: bib2gls: { group: on }
% arara: pdflatex

(For other ways of integrating bib2gls into your document build see Incorporating makeglossaries or makeglossaries-lite or bib2gls into the document build.) With the --group switch, the glossary now looks like:

Image of hierarchically sorted glossary with letter groups.

Naturally you also need to use a glossary style that supports groups.

If the document source is in the file myDoc.tex then bib2gls will create a file called myDoc.glstex that corresponds to the first \GlsXtrLoadResources command. If there are multiple instances of this command then myDoc-1.glstex will correspond to the second \GlsXtrLoadResources, myDoc-2.glstex will correspond to the third \GlsXtrLoadResources etc. These files simply contain LaTeX code and so can be inspected in a text editor.

In order to make it easier to perform minor adjustments, the glstex files provide wrapper commands. For example, terms that are defined with @entry in the bib file are defined in the glstex file using \bibglsnewentry, which has the definition provided before the first instance of its use:

\providecommand{\bibglsnewentry}[4]{%
\longnewglossaryentry*{#1}{name={#3},#2}{#4}%
}

This uses \longnewglossaryentry* instead of \newglossaryentry to allow for paragraph breaks in the description. If you try the above example document and look at the glstex file, you will see the entry definitions. These include the sort key for informational purposes. If you’re not sure why bib2gls ordered the entries a certain way, checking the sort key in the glstex file can show you the value used by the selected bib2gls sort method. For example, the “northern pintail” entry is defined in the glstex file as follows:

\bibglsnewentry{northernpintail}%
{parent={duck},
sort={northern|pintail|}}%
{northern pintail}%
{long-necked, small-headed duck with curved back}

Note that the sort value is northern|pintail| because the default sort method is a locale-sensitive word sort that discards spaces and (some) punctuation and marks up the end of each word with a marker (the pipe character | by default). Any remaining non-letter characters may be ignored by the comparison function (the comparator). This is appropriate for entries where the name is a word or phrase but may not be appropriate for other types of entries, such as symbols.
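
If a word sort isn’t appropriate for a particular set of entries, the resource command can request a different method via the sort option. A minimal sketch, assuming the letter-case method listed in the bib2gls manual:

\GlsXtrLoadResources[
 src=entries,
 sort=letter-case,% letter-by-letter, case-sensitive comparison
 selection=all]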

Sort Fallbacks

In general, you typically shouldn’t set the sort field in the bib file, but instead use bib2gls’s fallback system to choose the most appropriate field. If the sort field is explicitly set then the fallback isn’t used. The fallback depends on the entry type used in the bib file. For example, if a term was defined in the bib file using @entry then the sort fallback is obtained from the name field. However, if the term was defined using @symbol (or @number) then the sort fallback is obtained from the label, and if the term was defined using @abbreviation (or @acronym) then the sort fallback is obtained from the short field.

Suppose now that in addition to my entries.bib file described above, I also have a file called abbreviations.bib that contains:

% Encoding: UTF-8
 
@abbreviation{xml,
 short={XML},
 long={extensible markup language},
 description={a markup language that defines a set of rules for
    encoding documents}
}
 
@abbreviation{html,
  short={HTML},
  long={hypertext markup language},
  description={the standard markup language for creating web pages}
}

and a file called constants.bib that contains:

% Encoding: UTF-8
 
@number{pi,
  description={ratio of circumference of a circle to its diameter},
  name={\ensuremath{\pi}},
  user1={3.14159}
}
 
@number{e,
  description={Euler's number},
  name={\ensuremath{e}},
  user1={2.71828}
}
 
@number{root2,
  description={Pythagoras' constant},
  name={\ensuremath{\surd2}},
  user1={1.41421}
}
 
@number{gelfondscons,
  description={Gelfond's constant},
  name={\ensuremath{e^\pi}},
  user1={23.140692}
}
 
@number{zero,
  description={zero},
  name={\ensuremath{0}}
}
 
@number{one,
  description={one},
  name={\ensuremath{1}}
}

I can now modify my document as follows:

\documentclass{article}
 
\usepackage[record,stylemods,style=indexgroup]{glossaries-extra}
 
\setabbreviationstyle{long-short-desc}
\GlsXtrLoadResources[src={entries,abbreviations,constants},selection=all]
 
\begin{document}
\printunsrtglossaries
\end{document}

Note that I have to set the abbreviation style before \GlsXtrLoadResources. This produces the following glossary:

Image of glossary containing terms, mathematical constants and abbreviations.

The sort value for the mathematical constants has been obtained from the label. For example, Pythagoras’ constant has the label root2 and so ends up in the “R” letter group, whereas π (which has the label pi) is in the “P” letter group. The sort value for the extensible markup language (XML) term has been obtained from the short field (XML) so it ends up in the “X” letter group. The fallback values can be changed. For example:

\documentclass{article}
 
\usepackage[record,stylemods,style=indexgroup]{glossaries-extra}
 
\setabbreviationstyle{long-short-desc}
\GlsXtrLoadResources[src={entries,abbreviations,constants},
 abbreviation-sort-fallback=long,
 symbol-sort-fallback=description,
 selection=all]
 
\begin{document}
\printunsrtglossaries
\end{document}

This will now use the long field as the fallback for abbreviations (@abbreviation or @acronym) and the description field as the fallback for symbols (@symbol or @number).

Image of glossary with abbreviations ordered by long form and symbols ordered by description.

The “extensible markup language (XML)” entry is now in the “E” letter group and Pythagoras’ constant is now in the “P” letter group, but π (pi) is now in the “R” letter group.

Sub-Blocks

I’m now going to make a strange modification to the above document:

\documentclass{article}
 
\usepackage[record,stylemods,style=indexgroup]{glossaries-extra}
 
\setabbreviationstyle{long-short-desc}
\GlsXtrLoadResources[src={entries},selection=all]
 
\GlsXtrLoadResources[src={constants},
 sort={en-reverse},
 symbol-sort-fallback=description,
 selection=all]
 
\GlsXtrLoadResources[src={abbreviations},
 abbreviation-sort-fallback=long,
 selection=all]
 
\begin{document}
\printunsrtglossary
\end{document}

This splits the data in the three bib files into three separate resource commands (\GlsXtrLoadResources).

The first \GlsXtrLoadResources selects all the terms defined in entries.bib and sorts them according to my locale (since there’s no language set in the document) which happens to be en-GB (British English). These terms are written to the myDoc.glstex file, which is input by the first resource command on the next LaTeX run.

The second command selects all the terms defined in constants.bib and sorts them according to en-reverse (reverse English, that is, Z–A) and writes them to myDoc-1.glstex, which is input by the second resource command.

The third command selects all the terms defined in abbreviations.bib and sorts them according to my locale and writes them to myDoc-2.glstex, which is input by the third resource command.

Remember that \printunsrtglossary simply iterates over all entries in the order in which they are defined (via commands like \longnewglossaryentry or \newabbreviation, not the order in which they are defined in the bib files). This results in the order: (first glstex file) aardvark and armadillo (both assigned to the “A” letter group), duck (assigned to the “D” letter group) followed by its child entries, parrot (assigned to the “P” letter group) followed by its descendants, zebra (assigned to the “Z” letter group) then (second glstex file) zero (assigned to the “Z” letter group), π (assigned to the “R” letter group), Pythagoras’ constant (assigned to the “P” letter group), one (assigned to the “O” letter group), Gelfond’s constant (assigned to the “G” letter group), and Euler’s number (assigned to the “E” letter group) then (third glstex file) extensible markup language (assigned to the “E” letter group) and hypertext markup language (assigned to the “H” letter group).

Image of glossary made up of three sub-blocks.

Note that there’s no visual indication between the sub-blocks. The “Z” letter group contains zebra from the end of the first sub-block and zero from the start of the second sub-block. The “E” letter group contains Euler’s number from the end of the second sub-block and extensible markup language from the start of the third sub-block. Note also that there are two “P” letter groups: one from the first sub-block and the other from the second.

This is a contrived example. It’s more typical to use multiple resource commands to process multiple glossaries that require different settings or to gather together all entries in a sub-block into a custom group. (See also Logical Glossary Divisions (type vs group vs parent).)

The following modification demonstrates custom groups:

\documentclass{article}
 
\usepackage[record,stylemods,style=indexgroup]{glossaries-extra}
 
\setabbreviationstyle{long-short-desc}
\GlsXtrLoadResources[src={entries},selection=all]
 
\glsxtrsetgrouptitle{cons}{Mathematical Constants}
 
\GlsXtrLoadResources[src={constants},
 group=cons,% group label
 symbol-sort-fallback=description,
 selection=all]
 
\glsxtrsetgrouptitle{abbrvs}{Abbreviations}
 
\GlsXtrLoadResources[src={abbreviations},
 group=abbrvs,% group label
 abbreviation-sort-fallback=long,
 selection=all]
 
\begin{document}
\printunsrtglossary
\end{document}

This assigns the constants to a group labelled cons with the title “Mathematical Constants” and assigns the abbreviations to a group labelled abbrvs with the title “Abbreviations”.
Image of glossary with custom groups.

Multiple Glossaries

The following demonstrates multiple glossaries:

\documentclass{article}
 
\usepackage[abbreviations,symbols,record,stylemods=mcols]{glossaries-extra}
 
\setabbreviationstyle{long-short-desc}
\GlsXtrLoadResources[src={entries},selection=all]
 
\GlsXtrLoadResources[src={constants},
 type=symbols,
 symbol-sort-fallback=description,
 selection=all]
 
\GlsXtrLoadResources[src={abbreviations},
 type=abbreviations,
 abbreviation-sort-fallback=long,
 selection=all]
 
\begin{document}
\printunsrtglossary[style=indexgroup]
\printunsrtglossary[type=symbols,style=mcolindex,nogroupskip]
\printunsrtglossary[type=abbreviations,style=index]
\end{document}

This uses the abbreviations and symbols package options to create the additional glossaries. Note that using multiple glossaries means that I can apply a different style to each glossary. (The stylemods=mcols option causes glossary-mcols.sty to be loaded in addition to glossaries-extra-stylemods.sty.) The mcolindex style doesn’t display the group headings but it does still create a vertical gap with \glsgroupskip, so I’ve added the nogroupskip option to suppress it. The index style likewise doesn’t show the group headings but does create a vertical gap; I haven’t used the nogroupskip option for the abbreviations glossary, so there the inter-group gap is visible.

Image of main glossary, list of symbols and list of abbreviations.

The type=abbreviations resource option wasn’t actually necessary in the above example. Entries defined in the bib file with @abbreviation are written to the glstex file using the provided command:

\providecommand{\bibglsnewabbreviation}[4]{%
  \newabbreviation[#2]{#1}{#3}{#4}%
}

Note that this uses \newabbreviation which automatically assigns the entry to the “abbreviations” glossary if the abbreviations package option is used. This means that both the main glossary and the abbreviations can be processed in one resource command:

\GlsXtrLoadResources[
 src={entries,abbreviations},
 abbreviation-sort-fallback=long,
 selection=all]

This only needs to be split into two resource commands if there are conflicting settings. For example, suppose I want bib2gls to ignore the description fields in the abbreviations.bib file but not the entries.bib file. Then I would need:

\GlsXtrLoadResources[src={entries},selection=all]
 
\setabbreviationstyle{long-short}
\GlsXtrLoadResources[src={abbreviations},
 ignore-fields={description},
 selection=all]

Categories

In an earlier post I mentioned categories, which can be assigned using the category key. It’s possible to assign this key using the resource option with the same name. For example, I could make all my entries in the constants.bib file have the category “symbol” like this:

\GlsXtrLoadResources[src={constants},
 type=symbols,
 category=symbol,
 symbol-sort-fallback=description,
 selection=all]

Alternatively I could use category={same as entry}, which would make the category the same as the bib entry type without the leading @ (in this case number), or category={same as type}, which would make the category the same as the type (in this case symbols), or category={same as base}, which would make the category the same as the basename of the bib file without the .bib extension (in this case constants).
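For example, a variant of the resource command above using the base-name form:

\GlsXtrLoadResources[src={constants},
 type=symbols,
 category={same as base},% category set to "constants"
 symbol-sort-fallback=description,
 selection=all]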

The other post described how to define a post-link category hook using \glsdefpostlink. This allows additional content to be added after commands like \gls for a particular category. There’s a similar hook that’s implemented after the description is displayed with the standard glossary styles.

\glsdefpostdesc{category}{definition}

This is something you can also use with makeindex and xindy, but the bib2gls resource options make it easier to change the category for specific documents without having to alter the entry definition.
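For instance, a post-link hook analogous to the post-description hook used below might look like this (a sketch, assuming, as described just below, that the approximate value is stored in the user1 field):

\glsdefpostlink{number}{%
 % on first use only, append the approximate value after the link text
 \glsxtrifwasfirstuse
 {\glsxtrifhasfield{useri}{\glslabel}%
  {\space($\approx\glscurrentfieldvalue$)}{}}%
 {}%
}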

Some of the terms defined in constants.bib have the user1 field set to the approximate numerical value. Assigning a category and post-description hook makes it easy to display this value in the glossary. Note that the post-description hook comes before the punctuation inserted with the postdot or postpunc package options.

\documentclass{article}
 
\usepackage[postdot,abbreviations,symbols,record,stylemods={mcols}]{glossaries-extra}
 
\GlsXtrLoadResources[src={entries},selection=all]
 
\glsdefpostdesc{number}{%
 \glsxtrifhasfield{useri}{\glscurrententrylabel}%
 {\space ($\approx\glscurrentfieldvalue$)}{}%
}
 
\GlsXtrLoadResources[src={constants},
 type=symbols,
 category={same as entry},
 symbol-sort-fallback=description,
 selection=all]
 
\setabbreviationstyle{long-short}
\GlsXtrLoadResources[src={abbreviations},
 ignore-fields={description},
 selection=all]
 
\begin{document}
\printunsrtglossary[style=indexgroup]
\printunsrtglossary[type=symbols,style=mcolindex,nogroupskip]
\printunsrtglossary[type=abbreviations,style=index]
\end{document}

Image of main glossary, list of symbols and list of abbreviations. The approximate values are shown after the description in the symbols list.

For example, Euler’s number is now followed by “ (≈2.71828)”. The closing full stop (automatically inserted by the postdot package option) comes after the parenthetical material. Note that zero and one don’t have the user1 field set as the value given in the name field is the exact numerical value.

Numeric Sorting

There are some numerical sorting methods available to use with bib2gls, such as sort=integer for integers or sort=double for double-precision floating point values. For locale-sensitive numbers (that include thousand separators) you need to use sort=numeric or sort=numberformat instead. Let’s suppose that I now want the mathematical constants ordered according to their approximate value. As before I can use symbol-sort-fallback to change the fallback field to user1, but that will leave a blank sort value for zero and one, which don’t have that field set.

With bib2gls version 2.7 or above, I can use the field concatenation operator symbol-sort-fallback=user1+name. This will construct the sort fallback value from a combination of the user1 value and the name. This means that, for example, Euler’s number will have the sort value 2.71828 \ensuremath{e} and “one” will have the sort value \ensuremath{1}. The non-numeric content can then be stripped off using:

sort-replace={{[^0-9\string\.\string\-]+}{}}

The following is a modification of the previous example document. For illustrative purposes, I’ve decided to provide my own custom glossary for the mathematical constants instead of using the one provided by the symbols package option.

\documentclass{article}
 
\usepackage[postdot,abbreviations,record,stylemods={mcols}]{glossaries-extra}
 
\GlsXtrLoadResources[src={entries},selection=all]
 
\glsdefpostdesc{number}{%
 \glsxtrifhasfield{useri}{\glscurrententrylabel}%
 {\space ($\approx\glscurrentfieldvalue$)}{}%
}
 
\newglossary*{constants}{Mathematical Constants}
 
\GlsXtrLoadResources[src={constants},
 type=constants,
 category={same as entry},
 sort=double,
 symbol-sort-fallback=user1+name,
 sort-replace={{[^0-9\string\.\string\-]+}{}},
 selection=all]
 
\setabbreviationstyle{long-short}
\GlsXtrLoadResources[src={abbreviations},
 ignore-fields={description},
 selection=all]
 
\begin{document}
\printunsrtglossary[style=indexgroup]
\printunsrtglossary[type=constants,style=mcolindex,nogroupskip]
\printunsrtglossary[type=abbreviations,style=index]
\end{document}

The list of mathematical constants is now ordered according to the associated numeric value: 0, 1, Pythagoras’ constant (1.41421), Euler’s number (2.71828), π (3.14159) and Gelfond’s constant (23.140692). Note that the main glossary and list of abbreviations are still ordered according to the locale-sensitive word sort method.

Image of main glossary, list of mathematical constants ordered by numeric value, and list of abbreviations.

(If you decide to change the glossary label, you may need to delete the glstex files before you can rebuild the document.)

The TeX Parser Library

When determining the sort value, bib2gls uses the TeX Parser Library if the value contains any of the following symbols: \ { } or $. This library isn’t a proper TeX engine and has to interpret TeX fragments outside of the complete document code, but it does understand a limited number of commands and is able to convert some common kernel symbol commands into the closest matching Unicode character. It also has some limited understanding of a few packages, such as siunitx, pifont and upgreek. To find the complete list run bib2gls from the command line with the --list-known-packages switch:

bib2gls --list-known-packages

If a command is unknown bib2gls will ignore it. This means that the sort value can end up empty if it originally consisted solely of unknown commands.

Suppose now I want the mathematical constants from the previous example to be ordered according to the name (symbol-sort-fallback=name). When sorting symbols, it’s better to use a character code comparator rather than the locale-sensitive word method (which can discard non-letters). In the example document below I’ve used sort=letter-case, which is a case-sensitive character code sort method.

\documentclass{article}
 
\usepackage[postdot,abbreviations,record,stylemods={mcols}]{glossaries-extra}
 
\GlsXtrLoadResources[src={entries},selection=all]
 
\glsdefpostdesc{number}{%
 \glsxtrifhasfield{useri}{\glscurrententrylabel}%
 {\space ($\approx\glscurrentfieldvalue$)}{}%
}
 
\newglossary*{constants}{Mathematical Constants}
 
\GlsXtrLoadResources[src={constants},
 type=constants,
 category={same as entry},
 sort=letter-case,
 symbol-sort-fallback=name,
 selection=all]
 
\setabbreviationstyle{long-short}
\GlsXtrLoadResources[src={abbreviations},
 ignore-fields={description},
 selection=all]
 
\begin{document}
\printunsrtglossary[style=indexgroup]
\printunsrtglossary[type=constants,style=mcolindex,nogroupskip]
\printunsrtglossary[type=abbreviations,style=index]
\end{document}

The main glossary and list of abbreviations are still ordered according to the locale-sensitive word sort but the list of mathematical constants is now ordered according to the character code of the interpreted sort value.

Image of list of mathematical constants.

The order is now: 0, 1, e (Euler’s number), e𝜋 (Gelfond’s constant), √2 (Pythagoras’ constant) and 𝜋 (pi). Note that 𝜋 is the Unicode character Mathematical italic small pi U+1D70B (rather than the Greek lowercase π U+03C0). The square root symbol √ is the Unicode character U+221A, so √ comes before 𝜋.

@preamble

As with bibtex, @preamble may be used in the bib file to provide commands that are used by one or more of the entries. The contents of @preamble are copied to the glstex file to ensure that the commands are defined in the document, but they are also supplied to the TeX parser library so that it can recognise those commands in the sort value, if required. The use of \providecommand to define the commands means that you can have an alternative definition in the document, but it must be defined before the glstex file is input.

For example, suppose I have a file called books.bib that contains the following:

@preamble{"\providecommand{\nameart}[2]{#2}"}
 
@index{thehobbit,
  name={\nameart{The}{Hobbit}}
}
 
@index{ataleoftwocities,
  name={\nameart{A}{Tale of Two Cities}}
}

This provides the command \nameart, which ignores its first argument and just typesets the second. This means that the TeX parser library will interpret \nameart{The}{Hobbit} as simply “Hobbit”. A different definition can be provided in the document:

\documentclass{article}
 
\usepackage[record]{glossaries-extra}
 
\newcommand{\nameart}[2]{#1 #2}
 
\GlsXtrLoadResources[src=books,selection=all]
 
\begin{document}
\printunsrtglossary
\end{document}

This means that the articles “A” and “The” will be ignored when sorting. This results in the order: The Hobbit, A Tale of Two Cities, because bib2gls only picks up the definition in @preamble, not the one in the document.

Image of glossary containing two items: The Hobbit and A Tale of Two Cities.

(@index is like @entry but doesn’t require a description. This can look a little odd when there’s no extra information, such as the location list.)
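For comparison, a hypothetical @entry version of the first term would need a description field, e.g.:

@entry{thehobbit,
  name={\nameart{The}{Hobbit}},
  description={a novel}% illustrative description, not in the original bib file
}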

The @preamble can also be used to supply alternative definitions for commands provided by packages in the document that bib2gls doesn’t recognise. Suppose, for example, that I need to use commands provided by marvosym, such as \Email. These commands aren’t recognised by the TeX parser library and so will be ignored if they occur in the sort value. The @preamble can be used to provide definitions that expand to the closest matching Unicode character. If \providecommand is used it won’t override the definition provided by marvosym.

For example, I could create a file called pictographs.bib that contains:

@preamble{"\providecommand{\Email}{\symbol{"1F584}}
\providecommand{\Letter}{\symbol{"1F582}}
\providecommand{\Mobilefone}{\symbol{"1F581}}
\providecommand{\Telefon}{\symbol{"1F57F}}"
}
 
@symbol{email,
  name={\Email},
  description={email}
}
 
@symbol{envelope,
  name={\Letter},
  description={letter}
}
 
@symbol{phone,
  name={\Mobilefone},
  description={mobile phone}
}
 
@symbol{landline,
  name={\Telefon},
  description={telephone}
}

The document can then sort the name by character code:

\documentclass{article}
 
\usepackage{marvosym}
\usepackage[record]{glossaries-extra}
 
\GlsXtrLoadResources[src=pictographs,
  sort=letter-case,
  symbol-sort-fallback=name,
  selection=all
]
 
\begin{document}
\printunsrtglossary
\end{document}

Image of glossary listing pictographs.

The list is now in the order 🕿 (U+1F57F) telephone, 🖁 (U+1F581) mobile phone, 🖂 (U+1F582) letter, 🖄 (U+1F584) email.

If there’s a possibility that the contents of @preamble may cause a problem when copied into the glstex file, you can prevent the contents from being written using the write-preamble=false resource option. The contents will still be provided to the TeX parser library. If you want to prevent the contents from being interpreted, use the interpret-preamble=false option.
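For example, a variant of the resource command from the pictographs document above, with the preamble kept out of the glstex file:

\GlsXtrLoadResources[src=pictographs,
  write-preamble=false,% don't copy @preamble to the glstex file
  sort=letter-case,
  symbol-sort-fallback=name,
  selection=all
]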

by Nicola Talbot at Sunday, 2020-07-19 14:42

2020-07-17

LaTeX Project

Online TUG Conference 24-26 July 2020

The first online TUG conference will happen 24-26 July 2020

You probably heard this already from other sources and maybe already registered, but if not, now is the time.

Next week the annual TeX Users Group Conference will be broadcast live: to watch the presentations and otherwise participate, all you need to do is register. This year registration is free of charge, because the costs for an online event are much lower.

A number of people from the LaTeX Project Team are giving presentations: Javier, Joseph, Marcel and myself.

You will find many interesting and exciting talks in the program schedule, and while some may be at inconvenient hours depending on your location, most are probably at reasonable times. If not, there are plans to make the talks available afterwards (no details yet), but live or partly live events are usually more fun in my opinion.

So go ahead and register yourself if you are interested and let’s make this the biggest TeX conference in years!

Enjoy — Frank

Friday, 2020-07-17 00:00

2020-07-14

LaTeX.net

Managing beamer background materials with beamerappendixnote

Imagine you are in a situation where you need to defend your thesis. Besides the actual presentation slides you have prepared lots of additional slides, just in case someone asks for the details of your proofs, etc.

You could put them at the end of your presentation and manually jump there and back, which I clearly do not recommend, as it destroys the “natural flow” of your presentation. It is much better to use hyperlinks to get directly to the corresponding note and back. Until today I created these links manually; now I have discovered a package named beamerappendixnote which automates these steps.

Let’s have a look at the following example, which loads the beamerappendixnote package. It uses two commands, \appxnote and \printappxnotes. The first creates the named link to the appendix and the text to be displayed; the latter outputs all the appendix notes.

\documentclass[12pt,ngerman]{beamer}
\usepackage[T1]{fontenc}
\usepackage{babel}
\usepackage{beamerappendixnote}
 
\begin{document}
 
\begin{frame}
\frametitle{Some Beamer Slide}
 
\begin{itemize}
	\item Some stuff
	\item that requires
	\item more background 
\end{itemize}
 
\appxnote{Proof}{We can easily prove this.}
 
\end{frame}
 
\printappxnotes
 
\end{document}

by Uwe Ziegenhagen at Tuesday, 2020-07-14 04:55

2020-07-10

LaTeX.net

TUG 2020 program schedule online

Great progress in a few days! Now there are 224 registered participants.

And the program schedule with times and abstracts is online now: tug.org/tug2020/sched.html

by Stefan Kottwitz at Friday, 2020-07-10 21:28

2020-07-03

LaTeX.net

TeX Users Group meeting 2020 online and for free

This year’s annual TeX users conference will be held online via Zoom (an online meeting solution). This 41st TUG meeting will take place July 24-26, 2020, and registration and attendance are free.

No flight to pay, no hotel costs, no risk – check out the program!

As of today, 166 participants have registered, and 36 presentations are listed.

Topics include

    • LaTeX online services (learnlatex.org, LaTeX-on-HTTP, Overleaf, texlive.info)
    • Tools (Make, Pandoc, Python)
    • Teaching LaTeX
    • Typography (conveying emotional charge in text, church material)
    • Fonts (Monotype, HarfBuzz, Malayalam, history)
    • Accessibility
    • Interviews (biber author, babel maintainer)

and more, during sessions, workshops, keynotes, and connecting with users worldwide.

The event offers video streams, either live or prerecorded, plus live discussion and separate online rooms for “coffee talks”.

For more information and attending,

  • read the program
  • visit the event site
  • write to tug-conferences@tug.org if you have a question

The easiest way to attend a TUG meeting – see you there?

by Stefan Kottwitz at Friday, 2020-07-03 15:14

2020-06-28

LaTeX.net

Creating bilingual Beamer slides

For a lecture on Python I have been looking for a way to create bilingual slides in a single document. Based on https://tex.stackexchange.com/questions/443714/bilingual-slides-beamer-comment-package-and-non-ascii-characters-umlauts-dia and https://tex.stackexchange.com/questions/203412/using-a-generic-way-to-evaluate-global-document-options I have compiled the following example.

One simply needs to set the babel language in the document options to switch between German and English.

\documentclass[ngerman]{beamer}
 
\usepackage{comment}
\makeatletter
\newif\if@ngerman
\newif\if@option@ngerman
\DeclareOption{ngerman}{%
	\@ngermantrue
	\@option@ngermantrue
}
\ProcessOptions*\relax
\newcommand*{\ifngerman}{%
	\if@ngerman
	\expandafter\@firstoftwo
	\else
	\expandafter\@secondoftwo
	\fi
}
\makeatother
 
\ifngerman{
\usepackage[main=ngerman]{babel}
\includecomment{DE}
\excludecomment{EN}
}{
\usepackage[main=english]{babel}
\includecomment{EN}
\excludecomment{DE}
}
 
\begin{document}
	\begin{DE}
		\begin{frame}[fragile]{Hallo Welt}
		Hallo Welt
	\end{frame}
\end{DE}
 
\begin{EN}
	\begin{frame}[fragile]{Hello World}
	Hello World
\end{frame}
\end{EN}
 
\end{document}

Photo by Alex Litvin on Unsplash

by Uwe Ziegenhagen at Sunday, 2020-06-28 14:15

Typesetting todo lists with fontawesome and the tasks package

The following example shows how todo lists and similar documents can be typeset with the tasks package, a spin-off of the exSheets package for generating task-like environments.

In this example we define three tasks environments (for TODO, PROGRESSING, DONE); for each environment we use a different symbol from the fontawesome package, a collection of more than 600 logos, many of them related to the internet and the WWW.

One could also use these packages to, for example, pull entries from a Trello board; see the article in my blog (in German): https://www.uweziegenhagen.de/?p=4228

\documentclass[12pt,english]{article}
 
\usepackage[T1]{fontenc}
\usepackage{babel}
\usepackage{tasks}
\usepackage{fontawesome}
 
\NewTasksEnvironment[label=\faHandORight,label-width=15pt]{todo}[*](1)
\NewTasksEnvironment[label=\faHandRockO,label-width=15pt]{progress}[*](1)
\NewTasksEnvironment[label=\faThumbsOUp,label-width=15pt]{done}[*](1)
 
\begin{document}
 
\section*{TODO}
 
\begin{todo}
* Water the plants
* Shop for groceries
* Sort newspapers
\end{todo}
 
\section*{PROGRESSING}
 
\begin{progress}
* Clean the basement
* Sort \LaTeX\ books
* Reinstall Linux server
\end{progress}
 
\section*{DONE}
 
\begin{done}
* Finish tax declaration
* Reinstall Windows server
* Reconfigure SSH access
\end{done}
 
\end{document}

tasks example

by Uwe Ziegenhagen at Sunday, 2020-06-28 09:55

2020-06-27

LaTeX.net

A Beamer Template for MINT Lectures

As a lecturer for MINT lectures I wanted to create a Beamer template where individual lectures could be compiled separately as well as in a collection of all slides.

After polishing the template for a few semesters, an initial version has been uploaded to GitHub (https://github.com/UweZiegenhagen/MINT-Lecture-Slide-Template).

Screenshot of listings slide

Feel free to send pull requests, comments and proposals.

by Uwe Ziegenhagen at Saturday, 2020-06-27 12:33

2020-06-22

Reden ist Silber ¶ Druckerey Blog (Martin Z. Schröder, Drucker)

Pedal platen press with flywheel for sale

It has served me faithfully since 2004. Stationery, marriage certificates, invitations, small posters and the second of the bibliophile booklets by Max Goldt, as well as the covers of volumes 2 to 4, were printed on this machine.

But now it has stood idle for years, subjected to the maintenance routine only every few months: dusted, lightly oiled and patted.

It is time for the nutcracker to find a new coachman.

This machine was built around 1900 by the Maschinenfabrik »Emil Kahle« in Leipzig-Paunsdorf.

I bought it in an unusable condition from a private museum, which it had to leave for lack of space. The three of us struggled to heave it onto a van. In Berlin we put a thick rope around the lamp post standing in front of the house at Schonensche Straße 38 and slowly let the base down a ramp of boards into the basement. I had only just moved into this workshop in 2003.

But we could not get it set up. The machine lay heavily on the floor and the three of us could not raise it upright. So I called a friend and asked her to lend me her son along with a few of his friends; that son was, at the time, a young police officer and martial artist.

He came by quickly with his friends after football training, and three of these extra-broad gentlemen playfully set the machine on its feet.

Afterwards, some parts had to be turned anew, in those days still without CNC milling. And first I had to make a drawing of the roller spindles and runner wheels. A metalworker took on the job. And the master carpenter Horst Wrede from Emmen built the missing delivery board, whose approximate dimensions I was able to take from an old book about similar machines.

A second metalworker, at that time still in a courtyard on Kastanienallee, where today there are only constantly changing knick-knack shops, built me the fastening for the packing and a sleeve to hold the flywheel in place.

Then the four rollers were re-covered with rubber.

And then this machine, together with a hand platen press, earned not only my living but also the money for my first Heidelberg platen press, which then took over its work. In 2013 the machine moved with me to Weißensee, into the current workshop. But here it has had hardly anything to do, even though, and this is one of its advantages, it has a somewhat larger printing format than the Heidelberg and can also print on materials that protrude well beyond that format. So now it is time for a new home, and for 3500 euros gross the machine can be picked up.

by Martin Z. Schröder at Monday, 2020-06-22 06:00

2020-06-21

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Creating multilingual Beamer slides – Part 2

Based on my previous blog article on this topic and a question I asked on TSX in 2014, I have adapted the code for multilingual Beamer slides.

\documentclass[ngerman]{beamer}

\usepackage{comment}
\makeatletter
\newif\if@ngerman
\newif\if@option@ngerman
\DeclareOption{ngerman}{%
	\@ngermantrue
	\@option@ngermantrue
}
\ProcessOptions*\relax
\newcommand*{\ifngerman}{%
	\if@ngerman
	\expandafter\@firstoftwo
	\else
	\expandafter\@secondoftwo
	\fi
}
\makeatother

\ifngerman{
\usepackage[main=ngerman]{babel}
\includecomment{DE}
\excludecomment{EN}
}{
\usepackage[main=english]{babel}
\includecomment{EN}
\excludecomment{DE}
}
	
\begin{document}
	\begin{DE}
		\begin{frame}[fragile]{Hallo Welt}
		Hallo Welt
	\end{frame}
\end{DE}

\begin{EN}
	\begin{frame}[fragile]{Hello World}
	Hello World
\end{frame}
\end{EN}

\end{document}

Uwe

Uwe Ziegenhagen likes LaTeX and Python, sometimes even combined. Do you like my content and would like to thank me for it? Consider making a small donation to my local fablab, the Dingfabrik Köln. Details on how to donate can be found here: Spenden für die Dingfabrik.


by Uwe at Sunday, 2020-06-21 17:28

2020-06-08

LaTeX.net

About LaTeX.net

On LaTeX-Community.org, founded January 2007, many authors published articles about LaTeX and related tools. The site has been renamed to LaTeX.net. The article database has been changed from Joomla to a modern WordPress CMS. The accompanying forum has been renamed to LaTeX.org.

The LaTeX network is actually more than just this. Here is a list.

All of the servers and web sites here are supported by DANTE e.V., the German-speaking TeX user group. For several years now, DANTE has covered the provider and hardware costs for two professional servers that have enough resources to run various virtual machines for different purposes.

This is what runs on this DANTE supported hardware:

Forums

LaTeX.org – International forum with 96112 posts in 24287 topics by 18367 users
TeXwelt.de – German Q&A forum with 3428 questions and 3954 answers
goLaTeX.de – German forum with 102352 posts in 19965 topics by 8170 users
Plus German wiki with command reference: golatex.de/wiki
TeXnique.fr – French Q&A forum with 991 questions and 1302 answers
LaTeX.net.br – Brazilian Q&A forum, demo site
TeX.qa – Demo Q&A forum

Galleries

TeXample.net – TikZ gallery with community blog aggregator, with 407 examples by 190 authors
pgfplots.net – PGFPlots gallery
LaTeX-Cookbook.net – A gallery of chosen examples from the LaTeX Cookbook, re-using the same framework as the above sites
LaTeX.net – This article database
TeX.world – A showcase of TeX sites

Software Archives and Mirrors

CTAN.net – Official CTAN mirror
TeXlive.net – TeX Live mirror (in maintenance, pointing to CTAN.net)
TeXlive.de – Another TeX Live mirror (in maintenance, pointing to CTAN.net)

FAQs

TeXfragen.de – German TeX FAQ
texfaq.org – International TeX FAQ, moved to Github now
www.tex.ac.uk – Original UK TeX FAQ, moved to Github now
LaTeX.net.br/faq – Brazilian Portuguese TeX FAQ
TeX.org.uk – Just an experimental wiki version of the UK TeX FAQ

Blogs

tex-talk.net – StackExchange TeX blog with a lot of interviews of TeX friends
texblog.net – Stefan’s TeX blog since 2008
tex.tips – Quick mini TeX tips blog
mychemistry.eu – Clemens’ chemistry TeX blog
tikz.de – German TikZ friends blog
tex.my – Malaysian TeX Users Group, with Lian Tze
TeX.co – German TeX blog

Tools

TeXdoc.net – Paulo-powered web interface for TeXdoc

Photo by Taylor Vick on Unsplash

by Stefan Kottwitz at Monday, 2020-06-08 15:18

2020-06-06

LaTeX.net

Semi-automatic plotting

Producing data plots is an important part of doing research work. Making
good looking plots is not easy, and getting them right as well is a real
challenge. Perhaps the best way of producing plots, whether for use with LaTeX
or otherwise, is to use the pgfplots
package. For a general overview of using pgfplots effectively, see my TUGBoat article.

Using a programmatic approach to plotting has several advantages, as the plots
you get are easy to keep consistent. That’s particularly useful if several
people are preparing plots: using a GUI-based approach, it’s hard for multiple
workers to stick exactly to the same look. So it is worth putting some effort
into setting up templates for pgfplots: basic .tex files which can
be modified easily and reused multiple times. Putting a bit of effort into
developing templates also makes it easier to use data directly from a lot of
specialist systems. Many data file formats can be read as either comma or space
separated files, but it can take a little effort to get this right. So by
working on the basics, you can save yourself time later.

There is another advantage to setting up well documented templates. Not
everyone is a LaTeX expert: in my area, most people are not even day to day
LaTeX users. So making clear templates which can be used by altering a few key
settings is a great way to make LaTeX results accessible to more people.

Setting up

Before you can start developing a template, you need of course to have some
data and produce a one-off plot. That process is covered in the TUGBoat
article I mentioned earlier. There are then two big things to worry about:
generalising the .tex, and adding enough comments to let other
people use the template. That second point very important, as is talking to
other people to get things right: there’s no use in creating a template that
no-one can use!

As an example, I’ll use a plot of an infra-red spectrum: this is similar
to one in the TUGBoat article. The original version looks like this:

\documentclass{standalone}
 
\usepackage[T1]{fontenc}
\usepackage{helvet}
\usepackage[EULERGREEK]{sansmath}
\sansmath
\renewcommand{\rmfamily}{\sffamily}
 
\usepackage{pgfplots}
\pgfplotsset
  {
    compat                   = newest,
    every tick/.append style = thin
  }
\pgfkeys{/pgf/number format/set thousands separator = }
 
\usepackage{siunitx}
\sisetup{mode = text}
 
\begin{document}
 
\begin{tikzpicture}[font = \sffamily]
  \begin{axis}
    [
      x dir               = reverse,
      xlabel              = Wavenumber/\si{\per\centi\metre},
      xmax                = 2100,
      xmin                = 1800,
      ylabel              = Milliabsorbance,
      ymax                = 34
    ]
    \addplot[color = black, mark = none] table {example.txt};
 
    \node[coordinate,  pin = {[rotate=90]right:1884}] at 
      (axis cs:1884,1.3) { };
    \node[coordinate,  pin = {[rotate=90]right:1922}] at 
      (axis cs:1922,1.3) { };
    \node[coordinate,  pin = {[rotate=90]right:1965}] at 
      (axis cs:1965,1.3) { };
    \node[coordinate,  pin = {[rotate=90]right:2076}] at 
      (axis cs:2076,1.3) { };
  \end{axis}
\end{tikzpicture}
 
\end{document}

From one plot to many plots

There are several things to notice about the example. First, there are no
comments: that’s fine for me (provided I remember how it works), but what about
my coworkers? Second, everything is hard coded, for example the file containing
the raw data, which is pretty hard to find. Third, I had to pre-modify the data
file to get it working: the .txt file is based on an instrument
file which is in a text format but which contains lines I needed to remove and
scale. Finally, there is a lot of repetition in the pin part,
which would be better handled using a loop.

The most important change to make is probably adding comments: that’s true of
any form of programming. In this case, that means labelling up the lines which
should be changed, and saying what should go in them. So for example I would
alter the settings for the axes to read

\begin{axis}
  [
    x dir               = reverse,
    xlabel              = Wavenumber/\si{\per\centi\metre},
    xmax                = 2100, % Alter "2100" to change x-max
    xmin                = 1800, % Alter "1800" to change x-min
    ylabel              = Milliabsorbance,
% Set ymax value to allow space for labels
% Alter "34" to set y-max, or comment out for autoscale
    ymax                = 34 
  ]

Making the template more flexible means moving some parts to macros which stand
out. In this template, the most important thing is where the data comes from.
So I would make that a macro right at the start of the file

% The file name for the raw data goes here
\newcommand*{\datafile}{example.txt}

Later in the file, I then use

\addplot[color = black, mark = none] table {\datafile};

Of course, I could have simply added a comment, but that does not work so well
for this type of ‘hidden’ setting. I find that the key-value lists used a lot
by pgfplots work fine with comments, but for other settings using a well-named
macro works better.

Making templates that work directly from instrument data rather than having to
post-process in a spreadsheet can require a bit of effort. Provided you can
save data in a text format (space-, tab- or comma-delimited), the pay-off is
that you only have to do the job once, and can then forget about the problem:
if you have to post-process every time, it’s easy to make mistakes. In the
example, I had to remove some lines at the start of the instrument file to make
it usable: easy to set up using

\addplot[color = black, mark = none] table[skip first n = 2] {\datafile};

Dealing with comma-separated files is also easy

\pgfplotsset{table/col sep = comma}

(the standard setting for pgfplots is whitespace delimited).

Scaling or shifting data points is sometimes necessary, and again pgfplots can
help as it will work with expressions for x and y, not just
values. We can therefore have something like

\addplot[color = black, mark = none]
  table
    [
      skip first n = 4,
      x expr = \thisrowno{0} + 10,
      y expr = 1000000 * \thisrowno{1}
    ]

where the column numbers for a table start at 0 (usually the x value)
and work up. Of course, if the values you need to shift or multiply by are
variable at all, you can store them as commands.
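For instance, with hypothetical macros \xshift and \yscale defined near the top
of the file (the names are just illustrative, not part of pgfplots):

% values that may need adjusting, kept visible at the top of the file
\newcommand*{\xshift}{10}
\newcommand*{\yscale}{1000000}
 
\addplot[color = black, mark = none]
  table
    [
      skip first n = 4,
      x expr = \thisrowno{0} + \xshift,
      y expr = \yscale * \thisrowno{1}
    ] {\datafile};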

Finally, we can use loops to deal with repetition. I pointed out
that where I added some text markers in the original, things were
repetitive and a loop would work

\pgfplotsinvokeforeach{1884,1922,1965,2076}{% Alter numbers as needed
  \node[coordinate,  pin = {[rotate=90]right:#1}] at 
    (axis cs:#1,23) { }; % Alter "23" to set height of labels
  }

TikZ experts might wonder why I haven’t used \pgfforeach
here: it doesn’t work!

Putting it together

So what does the completed template look like?

% Template for plotting a single IR spectrum
 
% The file name for the raw data goes here
\newcommand*{\datafile}{example.asc}
 
\documentclass{standalone}
 
\usepackage[T1]{fontenc}
\usepackage{helvet}
\usepackage[EULERGREEK]{sansmath}
\sansmath
\renewcommand{\rmfamily}{\sffamily}
 
\usepackage{pgfplots}
\pgfplotsset
  {
    compat                   = newest,
    every tick/.append style = thin
  }
\pgfkeys{/pgf/number format/set thousands separator = }
 
\usepackage{siunitx}
\sisetup{mode = text}
 
\begin{document}
 
\begin{tikzpicture}[font = \sffamily]
  \begin{axis}
    [
      x dir               = reverse,
      xlabel              = Wavenumber/\si{\per\centi\metre},
      xmax                = 2100, % Alter "2100" to change x-max
      xmin                = 1800, % Alter "1800" to change x-min
      ylabel              = Milliabsorbance,
% Set ymax value to allow space for labels
      ymax                = 34 % Alter "34" to set y-max, or comment out for autoscale
    ]
    \addplot[color = black, mark = none] table[skip first n = 2] {\datafile};
 
% A list of labels: put all of the positions in the list.
  \pgfplotsinvokeforeach{1884,1922,1965,2076}{% Alter numbers as needed
    \node[coordinate,  pin = {[rotate=90]right:#1}] at 
      (axis cs:#1,23) { }; % Alter "23" to set height of labels
    }
  \end{axis}
\end{tikzpicture}
 
\end{document}

The result is shown in the figure.

Example plot

Programming for flexibility

Of course, you can make templates as simple or as complex as you like. For
example, we have some data that can come from one of three machines. Two save
directly in text-based files, but the formats are different. The third can only
export data, in .csv format, which is different again from the
other two! I could have written three templates, but as my non-LaTeX using
colleagues need to use them too, a programmatic approach looked better. So I
worked out the three different settings needed, then set up some code to work
out the file extension and set up accordingly

% The file name for the raw data goes here
\newcommand*{\datafile}{100mvn.ocw} % Change "100mvn.ocw" as needed
 
% This does the auto-detection of file type
% You don't need to change anything
\newcommand*{\xcolumn}{0}
\newcommand*{\ycolumn}{1}
\newcommand*{\ignorelines}{0}
 
\newcommand*{\ext}{}
\newcommand*{\getext}{}
\def\getext#1.#2\stop{%
  \expandafter\ifx\expandafter\relax\detokenize{#2}\relax
    \renewcommand{\ext}{#1}%
    \expandafter\getextaux
  \else
    \expandafter\getext
  \fi
  #2\stop
}
\newcommand*{\getextaux}{}
\def\getextaux#1\stop{%
  \ifnum\pdfstrcmp{\ext}{par}=0 %
    \renewcommand*{\ignorelines}{110}
    \renewcommand*{\xcolumn}{2}
    \renewcommand*{\ycolumn}{3}
  \else
    \ifnum\pdfstrcmp{\ext}{ocw}=0 %
      \renewcommand*{\ignorelines}{2}
      \AtBeginDocument{%
        \pgfplotsset{table/col sep = space}
      }
    \fi
  \fi
}
\expandafter\getext\datafile.\stop

You might not want to go that far, but the point is that using LaTeX gives you
the possibility to program this kind of thing. You only need to set it up once,
so it is worth considering.

Conclusions

With a bit of effort, you can use pgfplots to produce sophisticated templates
that can be used to produce high quality plots with ease. This helps you
keep you data presentation consistent, and can also be used where several
workers have to produce similar output: vital if one person (you!) is to
avoid doing all of the work.

by Joseph Wright at Saturday, 2020-06-06 19:25

TikZ Library for Structural Analysis

At university it is always very time-consuming work to create new assignments and tests, especially when those tasks include drawing graphics.

In the field of structural engineering such small structures are a key part of teaching. For this reason I developed, in cooperation with the Institute for Structural Analysis at Graz University of Technology, a TikZ library for Structural Analysis.

There are two different types of libraries available: one for 2D structures (structuralanalysis.sty) and one for 3D structures (3dstructuralanalysis.sty). In the following article only the 2D library will be discussed, but the principles and methods are the same for 3D structures.

Principles

TikZ is a very powerful tool; however, unfortunately not everyone is used to this part of LaTeX. Therefore, in addition to creating the library, the second goal was to keep the usage as simple as possible, so that:

  • anyone can use this library without (deeper) knowledge in TikZ
  • skilled users can easily modify and customise the code
  • the principle of this library can be used for any other library

Getting Started

Installation


Download the .sty file to the right directory and add it to the LaTeX file like this:

\usepackage{structuralanalysis}

LaTeX Environment

Like every TikZ graphic, this library also needs the

\begin{tikzpicture}
    ...
\end{tikzpicture}

environment.

Elements

Basic Commands

The library provides 10 different commands:

  • \point
  • \beam
  • \support
  • \hinge
  • \load or \lineload and \temperature
  • \internalforces
  • \dimensioning
  • \influenceline
  • \notation
  • \addon

Different options are available for each element (command). Obligatory options are marked with {curly brackets} and optional values are marked with [square brackets]. The first type must always be given; in contrast, the optional input is not required to be entered.

An easy example is the following single force:

\load{type}{insertion point}[rotation][length or included angle][loaddistance];
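A hypothetical invocation, assuming a point b has already been defined with \point and that load type 1 denotes a single force, might look like:

\load{1}{b}[90]; % single force at point b, rotated by 90 degrees (assumed semantics)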

Manual

More specific information about the elements and their options can be found in the manual. The manual is written in German, but section 1 provides a table with all elements and options; furthermore, the code is also given for each picture in the manual.

Examples

The easiest way to create a structure is to work through the list above, starting with \point and ending with \addon.

The following examples should give a first impression of how drawings can be generated. In the manual both examples are explained in detail.

2D Simplified Roof

\begin{tikzpicture}
     \scaling{.65};
 
     \point{a}{0}{1};
     \point{b}{3}{1};
     \point{c}{11}{3};
     \point{d}{19}{1};
     \point{e}{22}{1};
     \point{f}{3}{0};
     \point{g}{11}{-2};
     \point{h}{19}{0};
 
     \beam{1}{a}{b}[0][1];
     \beam{1}{b}{c}[1][1];
     \beam{1}{c}{d}[1][1];
     \beam{1}{d}{e}[1][0];
     \beam{1}{f}{b};
     \beam{1}{d}{h};
     \beam{2}{f}{g};
     \beam{2}{g}{h};
     \beam{2}{g}{c};
 
     \support{1}{f};
     \support{2}{h};
 
     \hinge{1}{f};
     \hinge{1}{h};
     \hinge{1}{g};
     \hinge{2}{c}[b][d];
 
     \lineload{2}{a}{b}[1][1][.5];
     \lineload{2}{b}{c};
 
     \dimensioning{1}{a}{b}{-2.5}[$3,0$];
     \dimensioning{1}{b}{c}{-2.5}[$8,0$];
     \dimensioning{1}{c}{d}{-2.5}[$8,0$];
     \dimensioning{1}{d}{e}{-2.5}[$3,0$];
     \dimensioning{2}{f}{a}{-1}[$1,0$];
     \dimensioning{2}{g}{f}{-1}[$2,0$];
     \dimensioning{2}{a}{c}{-1}[$2,0$];
 
     \influenceline{a}{e}{3}[.3];
 
     \notation{1}{a}{$1$}[left];
     \notation{1}{b}{$2$}[below right=2mm];
     \notation{1}{c}{$3$};
     \notation{1}{d}{$4$}[above];
     \notation{1}{e}{$5$}[above];
     \notation{1}{f}{$6$}[left=2mm];
     \notation{1}{g}{$7$}[below=2mm];
     \notation{1}{h}{$8$}[right=2mm];
     \notation{4}{f}{g}[$S$];
 
\end{tikzpicture}
2D Simplified Roof

3D Support Construction

By using the library 3dstructuralanalysis, the following graph can be created in a very short time:

\setcoords{-25}{10}[1][1.2]
\setaxis{2}
%\showpoint
\begin{tikzpicture}[coords]
 
     \dpoint{a}{0}{0}{0};
     \dpoint{b}{3}{0}{0};
     \dpoint{c}{6}{0}{0};
     \dpoint{d}{9}{0}{0};
     \dpoint{e}{12}{0}{0};
     \dpoint{f}{0}{3}{0};
     \dpoint{g}{3}{3}{0};
     \dpoint{h}{6}{3}{0};
     \dpoint{i}{9}{3}{0};
     \dpoint{j}{12}{3}{0};
 
     \daxis{1}{a};
 
     \dbeam{1}{f}{b};
     \dbeam{1}{b}{h};
     \dbeam{1}{h}{d};
     \dbeam{1}{d}{j};
     \dbeam{3}{a}{e};
     \dbeam{3}{f}{j};
     \dbeam{3}{a}{f};
     \dbeam{3}{b}{g};
     \dbeam{3}{c}{h};
     \dbeam{3}{d}{i};
     \dbeam{3}{e}{j};
 
     \dsupport{1}{b};
     \dsupport{1}{h}[0][0];
     \dsupport{1}{d}[0];
 
     \dhinge{2}{b}[f][h][1];
     \dhinge{2}{h}[b][d][1];
     \dhinge{2}{d}[h][j][1];
 
     \dlineload{5}{0}{f}{b}[.5][.5][.11];
     \dlineload{5}{0}{b}{h}[.5][.5][.11];
     \dlineload{5}{0}{h}{d}[.5][.5][.11];
     \dlineload{5}{0}{d}{j}[.5][.5][.11];
 
     \ddimensioning{xy}{f}{g}{4.5}[$3~m$];
     \ddimensioning{xy}{g}{h}{4.5}[$3~m$];
     \ddimensioning{xy}{h}{i}{4.5}[$3~m$];
     \ddimensioning{xy}{i}{j}{4.5}[$3~m$];
     \ddimensioning{yx}{e}{j}{13}[$3~m$];
 
     \dnotation{1}{f}{$q=10~kN/m$}[above left=3mm];
     \dnotation{1}{b}{$A$}[below left];
     \dnotation{1}{h}{$C$}[right=2mm];
     \dnotation{1}{d}{$B$}[below left];
 
\end{tikzpicture}
3D Support Construction

by Jürgen Hackl at Saturday, 2020-06-06 19:18

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Creating multilingual Beamer slides

At https://tex.stackexchange.com/questions/443714/bilingual-slides-beamer-comment-package-and-non-ascii-characters-umlauts-dia there is a good example of how to create multilingual Beamer slides. Depending on the setting of \newcommand{\lvlang}{EN}, either the English or the German version is produced.

The code could probably be improved further, for example by evaluating global class options such as english or ngerman. Addendum: see https://www.uweziegenhagen.de/?p=4352

Here is a complete example:

\documentclass{beamer}

\newcommand{\lvlang}{EN}

\usepackage[utf8]{inputenc}
\usepackage[T1]{fontenc}

\usepackage{comment}
\long\def\WriteCommentLine#1{\immediate\write\CommentStream{\unexpanded{#1}}}
\let\ThisComment\WriteCommentLine

\usepackage{ifthen}
\newcommand{\iflvlangde}[2]{%
  \ifthenelse{\equal{\lvlang}{DE}}{#1}{#2}%
}

\makeatletter
\iflvlangde{
  \usepackage[main=ngerman]{babel}
  \includecomment{DE}
  \excludecomment{EN}
}{
  \usepackage[main=english]{babel}
  \includecomment{EN}
  \excludecomment{DE}
}
\makeatother

\begin{document}
\begin{DE}
\begin{frame}[fragile]{Hallo Welt}
  Hallo Welt
\end{frame}
\end{DE}

\begin{EN}
\begin{frame}[fragile]{Hello World}
  Hello World
\end{frame}
\end{EN}

\end{document}

Uwe

Uwe Ziegenhagen likes LaTeX and Python, sometimes even combined. Do you like my content and would like to thank me for it? Consider making a small donation to my local fablab, the Dingfabrik Köln. Details on how to donate can be found here: Spenden für die Dingfabrik.


by Uwe at Saturday, 2020-06-06 15:48

LaTeX.net

Drawing with the tikz-3dplot Package

When I started working on my thesis dissertation using LaTeX, I discovered the TikZ package for drawing vector-based figures. I needed a way to easily draw three-dimensional figures, and so I put together a few handy tools in the tikz-3dplot package. This package builds on TikZ, providing an easy way to rotate the perspective when drawing three-dimensional shapes using basic shapes in a tikzpicture environment.

Let’s explore some examples of what tikz-3dplot can do.

– A contribution to the LaTeX and Graphics contest – This article is available for reading and for download in pdf format

The Basics

Let’s draw a cube with a side length of 2. To help illustrate the comparison, we’ll draw a grid in the xy plane, and also show the x, y, and z axes.

\begin{tikzpicture}
		[cube/.style={very thick,black},
			grid/.style={very thin,gray},
			axis/.style={->,blue,thick}]
 
	%draw a grid in the x-y plane
	\foreach \x in {-0.5,0,...,2.5}
		\foreach \y in {-0.5,0,...,2.5}
		{
			\draw[grid] (\x,-0.5) -- (\x,2.5);
			\draw[grid] (-0.5,\y) -- (2.5,\y);
		}
 
	%draw the axes
	\draw[axis] (0,0,0) -- (3,0,0) node[anchor=west]{$x$};
	\draw[axis] (0,0,0) -- (0,3,0) node[anchor=west]{$y$};
	\draw[axis] (0,0,0) -- (0,0,3) node[anchor=west]{$z$};
 
	%draw the top and bottom of the cube
	\draw[cube] (0,0,0) -- (0,2,0) -- (2,2,0) -- (2,0,0) -- cycle;
	\draw[cube] (0,0,2) -- (0,2,2) -- (2,2,2) -- (2,0,2) -- cycle;
 
	%draw the edges of the cube
	\draw[cube] (0,0,0) -- (0,0,2);
	\draw[cube] (0,2,0) -- (0,2,2);
	\draw[cube] (2,0,0) -- (2,0,2);
	\draw[cube] (2,2,0) -- (2,2,2);
 
\end{tikzpicture}
\tdplotsetmaincoords{60}{125}
\begin{tikzpicture}
		[tdplot_main_coords,
			cube/.style={very thick,black},
			grid/.style={very thin,gray},
			axis/.style={->,blue,thick}]
 
	%draw a grid in the x-y plane
	\foreach \x in {-0.5,0,...,2.5}
		\foreach \y in {-0.5,0,...,2.5}
		{
			\draw[grid] (\x,-0.5) -- (\x,2.5);
			\draw[grid] (-0.5,\y) -- (2.5,\y);
		}
 
 
	%draw the axes
	\draw[axis] (0,0,0) -- (3,0,0) node[anchor=west]{$x$};
	\draw[axis] (0,0,0) -- (0,3,0) node[anchor=west]{$y$};
	\draw[axis] (0,0,0) -- (0,0,3) node[anchor=west]{$z$};
 
	%draw the top and bottom of the cube
	\draw[cube] (0,0,0) -- (0,2,0) -- (2,2,0) -- (2,0,0) -- cycle;
	\draw[cube] (0,0,2) -- (0,2,2) -- (2,2,2) -- (2,0,2) -- cycle;
 
	%draw the edges of the cube
	\draw[cube] (0,0,0) -- (0,0,2);
	\draw[cube] (0,2,0) -- (0,2,2);
	\draw[cube] (2,0,0) -- (2,0,2);
	\draw[cube] (2,2,0) -- (2,2,2);
 
\end{tikzpicture}

Basic cube example

In the default configuration, shown on the left, the xy plane lies in the same plane as the page. The z-axis extends down and to the left, giving the illusion of depth. This is a useful representation of a three-dimensional space as is, but offers little in the way of customization.

The tikz-3dplot package can redefine the perspective for viewing this three-dimensional space. In the tikz-3dplot configuration, the z-axis points in the vertical direction, and the viewed perspective of the xy plane can be specified using the user-specified values θ and φ (in degrees). These angles define the direction from which the three-dimensional space is viewed, using the standard definition of the polar and azimuthal angles from polar coordinates.

To specify the display orientation, use the command \tdplotsetmaincoords{θ}{φ} before the tikzpicture environment, and include the tdplot_main_coords style within any drawing command. Alternatively, as seen in this example, you can include the tdplot_main_coords style within the global style settings of the tikzpicture environment.

As suggested by the name, tdplot_main_coords, I refer to this coordinate frame as the “main” coordinate frame. It provides the base through which you can perform all drawing operations. It may seem a bit restrictive that the z-axis of the main coordinate frame can only point in the vertical direction. If you wish to define a more general orientation, there is a second, “rotated” coordinate frame that can be specified relative to the orientation of the main coordinate frame. Let’s take a look at this cube example again, drawn in the rotated coordinate frame.

\tdplotsetmaincoords{60}{120}%
\tdplotsetrotatedcoords{0}{20}{0}%
\begin{tikzpicture}
		[tdplot_rotated_coords,
			cube/.style={very thick,black},
			grid/.style={very thin,gray},
			axis/.style={->,blue,thick},
			rotated axis/.style={->,purple,thick}]
 
	%draw a grid in the x-y plane
	\foreach \x in {-0.5,0,...,2.5}
		\foreach \y in {-0.5,0,...,2.5}
		{
			\draw[grid] (\x,-0.5) -- (\x,2.5);
			\draw[grid] (-0.5,\y) -- (2.5,\y);
		}
 
	%draw the main coordinate frame axes
	\draw[axis,tdplot_main_coords] (0,0,0) -- (3,0,0) node[anchor=west]{$x$};
	\draw[axis,tdplot_main_coords] (0,0,0) -- (0,3,0) node[anchor=north west]{$y$};
	\draw[axis,tdplot_main_coords] (0,0,0) -- (0,0,3) node[anchor=west]{$z$};
 
 
	%draw the rotated coordinate frame axes
	\draw[rotated axis] (0,0,0) -- (3,0,0) node[anchor=west]{$x'$};
	\draw[rotated axis] (0,0,0) -- (0,3,0) node[anchor=south west]{$y'$};
	\draw[rotated axis] (0,0,0) -- (0,0,3) node[anchor=west]{$z'$};
 
	%draw the top and bottom of the cube
	\draw[cube] (0,0,0) -- (0,2,0) -- (2,2,0) -- (2,0,0) -- cycle;
	\draw[cube] (0,0,2) -- (0,2,2) -- (2,2,2) -- (2,0,2) -- cycle;
 
	%draw the edges of the cube
	\draw[cube] (0,0,0) -- (0,0,2);
	\draw[cube] (0,2,0) -- (0,2,2);
	\draw[cube] (2,0,0) -- (2,0,2);
	\draw[cube] (2,2,0) -- (2,2,2);
 
\end{tikzpicture}

Rotated cube

Here, I show the axes for both the main coordinate frame (xyz) and the rotated coordinate frame (x′y′z′). Using the command \tdplotsetrotatedcoords{θz}{θy}{θz}, the rotated coordinate frame is defined by taking the orientation of the main coordinate frame and then performing the Euler angle rotations (θz, θy, θz) (in degrees). The coordinate transformation for the rotated coordinate frame is stored in the tdplot_rotated_coords style. In this example, I simply rotated about the y-axis of the main coordinate frame.

Examples

Defining tdplot_main_coord and tdplot_rotated_coords is just the beginning. tikz-3dplot provides a bunch of handy tools to make it easier to work in a three-dimensional space.

Defining Coordinates

For example, the command \tdplotsetcoord{P}{r}{θ}{φ} defines a coordinate named (P) using the spherical coordinates r, θ, φ. More than that, this handy command also defines points that can be useful for projections on the axes and their shared planes. Consider the following example, where the coordinate (P) is defined in the three-dimensional space using spherical polar coordinates.

\tdplotsetmaincoords{60}{120}
\begin{tikzpicture}
	[scale=3,
		tdplot_main_coords,
		axis/.style={->,blue,thick},
		vector/.style={-stealth,red,very thick}]
 
	%standard tikz coordinate definition using x, y, z coords
	\coordinate (O) at (0,0,0);
 
	%tikz-3dplot coordinate definition using r, theta, phi coords
	\tdplotsetcoord{P}{.8}{55}{60}
 
	%draw axes
	\draw[axis] (0,0,0) -- (1,0,0) node[anchor=north east]{$x$};
	\draw[axis] (0,0,0) -- (0,1,0) node[anchor=north west]{$y$};
	\draw[axis] (0,0,0) -- (0,0,1) node[anchor=south]{$z$};
 
	%draw a vector from O to P
	\draw[vector] (O) -- (P);
\end{tikzpicture}
\tdplotsetmaincoords{60}{120}
\begin{tikzpicture}
	[scale=3,
		tdplot_main_coords,
		axis/.style={->,blue,thick},
		vector/.style={-stealth,red,very thick},
		vector guide/.style={dashed,red,thick}]
 
	%standard tikz coordinate definition using x, y, z coords
	\coordinate (O) at (0,0,0);
 
	%tikz-3dplot coordinate definition using r, theta, phi coords
	\tdplotsetcoord{P}{.8}{55}{60}
 
	%draw axes
	\draw[axis] (0,0,0) -- (1,0,0) node[anchor=north east]{$x$};
	\draw[axis] (0,0,0) -- (0,1,0) node[anchor=north west]{$y$};
	\draw[axis] (0,0,0) -- (0,0,1) node[anchor=south]{$z$};
 
	%draw a vector from O to P
	\draw[vector] (O) -- (P);
 
	%draw guide lines to components
	\draw[vector guide] (O) -- (Pxy);
	\draw[vector guide] (Pxy) -- (P);
\end{tikzpicture}

Coordinate system

In the left diagram, it is hard to identify where the vector lies in the three-dimensional space. In the right diagram, I make use of the (automatically generated) projection coordinate (Pxy), which lies in the xy plane below (P), to illustrate the orientation of the vector.

Drawing Circles and Arcs

To draw a circle or arc, the \tdplotdrawarc command is provided. Here, you specify the center of the circle/arc, the radius, the start and end angles, and the options and text for a label node. Here are a couple of examples that use arcs and circles.
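As used in the examples below, the general form is:

\tdplotdrawarc[draw options]{(center)}{radius}{start angle}{end angle}{label node options}{label}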

\tdplotsetmaincoords{60}{120}
\begin{tikzpicture}
	[scale=3,
		tdplot_main_coords,
		axis/.style={->,blue,thick},
		vector/.style={-stealth,red,very thick},
		vector guide/.style={dashed,red,thick},
		angle/.style={red,thick}]
 
	%standard tikz coordinate definition using x, y, z coords
	\coordinate (O) at (0,0,0);
 
	%tikz-3dplot coordinate definition using r, theta, phi coords
	\tdplotsetcoord{P}{.8}{55}{60}
 
	%draw axes
	\draw[axis] (0,0,0) -- (1,0,0) node[anchor=north east]{$x$};
	\draw[axis] (0,0,0) -- (0,1,0) node[anchor=north west]{$y$};
	\draw[axis] (0,0,0) -- (0,0,1) node[anchor=south]{$z$};
 
	%draw a vector from O to P
	\draw[vector] (O) -- (P);
 
	%draw guide lines to components
	\draw[vector guide] (O) -- (Pxy);
	\draw[vector guide] (Pxy) -- (P);
 
	%draw an arc illustrating the angle defining the orientation
	\tdplotdrawarc[angle]{(O)}{.35}{0}{60}{anchor=north}{$\phi$}
 
	%define the rotated coordinate frame to lie in the "theta plane"
	\tdplotsetthetaplanecoords{55}
 
	\tdplotdrawarc[tdplot_rotated_coords,angle]{(O)}{.35}{0}{55}
          {anchor=south west}{$\theta$}
 
\end{tikzpicture}
\tdplotsetmaincoords{55}{5}
\begin{tikzpicture}
	[scale=3,
		tdplot_main_coords,
		curve/.style={red,densely dotted,thick}]
 
	\coordinate (O) at (0,0,0);
 
	\foreach \angle in {-90,-75,...,90}
	{
		%calculate the sine and cosine of the angle
		\tdplotsinandcos{\sintheta}{\costheta}{\angle}%
 
		%define a point along the z-axis through which to draw
		%a circle in the xy-plane
		\coordinate (P) at (0,0,\sintheta);
 
		%draw the circle in the main frame
		\tdplotdrawarc[curve]{(P)}{\costheta}{0}{360}{}{}
 
		%define the rotated coordinate frame based on the angle
		\tdplotsetthetaplanecoords{\angle}
 
		%draw the circle in the rotated frame
		\tdplotdrawarc[curve,tdplot_rotated_coords]{(O)}{1}{0}{360}{}{}
	}
 
\end{tikzpicture}

Circles and arcs

The left example is an extension of the vector example shown earlier. An arc was added in the xy plane to illustrate the azimuthal angle, φ, and another arc was added to illustrate the polar angle, θ. In the right example, a series of circles are drawn to represent longitudinal and latitudinal lines on a sphere.

To make these examples, a few useful helper functions are used. First, the \tdplotsetthetaplanecoords{φ} function provides a convenient way to place a rotated coordinate frame so that it is lined up with the plane, at azimuthal angle φ, in which the polar angle θ is drawn. Second, the \tdplotsinandcos{\sintheta}{\costheta}{θ} command takes the specified angle θ (in degrees) and stores its sine and cosine values in the macros \sintheta and \costheta. Note that you can use any macro names in place of \sintheta and \costheta.
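As a standalone illustration of the second helper (a minimal sketch; the macro names \sina and \cosa are arbitrary choices):

\tdplotsetmaincoords{60}{120}
\begin{tikzpicture}[scale=2,tdplot_main_coords]
	%store sin(30) and cos(30) in \sina and \cosa
	\tdplotsinandcos{\sina}{\cosa}{30}
	%use the values to draw a unit vector at 30 degrees in the xz plane
	\draw[-stealth] (0,0,0) -- (\cosa,0,\sina) node[anchor=west]{$30^\circ$};
\end{tikzpicture}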

Drawing Surfaces

Drawing lines and curves can be great fun, but what if you wanted to represent a solid object with opaque surfaces? With proper planning and careful consideration of which surfaces are drawn first, you can render simple shapes.

\tdplotsetmaincoords{60}{125}
\begin{tikzpicture}
	[tdplot_main_coords,
		grid/.style={very thin,gray},
		axis/.style={->,blue,thick},
		cube/.style={opacity=.5,very thick,fill=red}]
	%draw a grid in the x-y plane
	\foreach \x in {-0.5,0,...,2.5}
		\foreach \y in {-0.5,0,...,2.5}
		{
			\draw[grid] (\x,-0.5) -- (\x,2.5);
			\draw[grid] (-0.5,\y) -- (2.5,\y);
		}			
 
	%draw the axes
	\draw[axis] (0,0,0) -- (3,0,0) node[anchor=west]{$x$};
	\draw[axis] (0,0,0) -- (0,3,0) node[anchor=west]{$y$};
	\draw[axis] (0,0,0) -- (0,0,3) node[anchor=west]{$z$};
 
	%draw the bottom of the cube
	\draw[cube] (0,0,0) -- (0,2,0) -- (2,2,0) -- (2,0,0) -- cycle;
 
	%draw the back-right of the cube
	\draw[cube] (0,0,0) -- (0,2,0) -- (0,2,2) -- (0,0,2) -- cycle;
 
	%draw the back-left of the cube
	\draw[cube] (0,0,0) -- (2,0,0) -- (2,0,2) -- (0,0,2) -- cycle;
 
 
	%draw the front-right of the cube
	\draw[cube] (2,0,0) -- (2,2,0) -- (2,2,2) -- (2,0,2) -- cycle;
 
	%draw the front-left of the cube
	\draw[cube] (0,2,0) -- (2,2,0) -- (2,2,2) -- (0,2,2) -- cycle;
 
	%draw the top of the cube
	\draw[cube] (0,0,2) -- (0,2,2) -- (2,2,2) -- (2,0,2) -- cycle;
 
\end{tikzpicture}
\tdplotsetmaincoords{60}{125}
\begin{tikzpicture}[
		tdplot_main_coords,
		grid/.style={very thin,gray},
		axis/.style={->,blue,thick},
		cube/.style={very thick,fill=red},
		cube hidden/.style={very thick,dashed}]
	%draw a grid in the x-y plane
	\foreach \x in {-0.5,0,...,2.5}
		\foreach \y in {-0.5,0,...,2.5}
		{
			\draw[grid] (\x,-0.5) -- (\x,2.5);
			\draw[grid] (-0.5,\y) -- (2.5,\y);
		}
 
	%draw the axes
	\draw[axis] (0,0,0) -- (3,0,0) node[anchor=west]{$x$};
	\draw[axis] (0,0,0) -- (0,3,0) node[anchor=west]{$y$};
	\draw[axis] (0,0,0) -- (0,0,3) node[anchor=west]{$z$};
 
	%draw the front-right of the cube
	\draw[cube] (2,0,0) -- (2,2,0) -- (2,2,2) -- (2,0,2) -- cycle;
 
	%draw the front-left of the cube
	\draw[cube] (0,2,0) -- (2,2,0) -- (2,2,2) -- (0,2,2) -- cycle;
 
	%draw the top of the cube
	\draw[cube] (0,0,2) -- (0,2,2) -- (2,2,2) -- (2,0,2) -- cycle;
 
	%draw dashed lines to represent hidden edges
	\draw[cube hidden] (0,0,0) -- (2,0,0);
	\draw[cube hidden] (0,0,0) -- (0,2,0);
	\draw[cube hidden] (0,0,0) -- (0,0,2);
 
\end{tikzpicture}

Surface plots: cubes

The left example uses fill color transparency to illustrate surfaces hidden behind others. Here, the covered and back edges of the cube can still be distinguished because the fill of the front surfaces is drawn on top of them. The right example draws the opaque front surfaces of the cube first and then illustrates the back edges using dashed lines. In both cases, the drawing order depends on the perspective. If you change the orientation of the coordinate frame, you will need to check whether the drawing order must change as well. In the dashed-line case, you may even need to reassign which edges of the object count as hidden.

Another example of surface plots is the \tdplotsphericalsurfaceplot command. This command was developed to allow me to plot complex spherical harmonics, where the radius and hue of the surface are plotted as functions of the polar and azimuthal angles.

\tdplotsetmaincoords{70}{135}
\begin{tikzpicture}[scale=4,line join=bevel,tdplot_main_coords, fill opacity=.5]
	\pgfsetlinewidth{.2pt}
	\tdplotsphericalsurfaceplot[parametricfill]{72}{36}{sin(\tdplottheta)*cos(\tdplottheta)}{black}{\tdplotphi}%
		{\draw[color=black,thick,->] (0,0,0) -- (1,0,0) node[anchor=north east]{$x$};}%
		{\draw[color=black,thick,->] (0,0,0) -- (0,1,0) node[anchor=north west]{$y$};}%
		{\draw[color=black,thick,->] (0,0,0) -- (0,0,1) node[anchor=south]{$z$};}%
\end{tikzpicture}

Surface plot

Here, I am plotting the function r = sin(θ)cos(θ) and using φ as the parameter that selects the surface fill hue. The arguments of \tdplotsphericalsurfaceplot define the angular step sizes, the function to plot, the line style, the fill style, and instructions for drawing the axes. Unlike the manual surface plot examples shown earlier, the \tdplotsphericalsurfaceplot command does the work of determining how to draw the surface appropriately, so that faces and edges on the back side are rendered below those on the front. The only downside is that it takes some time to render, so you may want to look into externalizing these figures.
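One way to do that is the standard TikZ externalization library (a minimal sketch, not specific to tikz-3dplot; it requires compiling with shell escape enabled, e.g., pdflatex -shell-escape):

%in the preamble
\usetikzlibrary{external}
\tikzexternalize[prefix=tikzfigures/]
%every tikzpicture is now compiled to its own PDF once and simply
%included again on later runs, as long as its code is unchanged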

Limitations

When drawing shapes in tikz-3dplot, only simple shapes like line segments and arcs behave properly, whereas more involved shapes like rectangles and grids do not adhere to the rotated coordinate system. Let’s look at the first example again, this time using the rectangle command to draw the top and bottom of the cube rather than a series of line segments.

\begin{tikzpicture}
	%draw a grid in the x-y plane
	\foreach \x in {-0.5,0,...,2.5}
		\foreach \y in {-0.5,0,...,2.5}
		{
			\draw[gray,very thin] (\x,-0.5) -- (\x,2.5);
			\draw[gray,very thin] (-0.5,\y) -- (2.5,\y);
		}
 
	%draw the axes
	\draw[->] (0,0,0) -- (3,0,0) node[anchor=west]{$x$};
	\draw[->] (0,0,0) -- (0,3,0) node[anchor=west]{$y$};
	\draw[->] (0,0,0) -- (0,0,3) node[anchor=west]{$z$};
 
	%draw the top and bottom of the cube
	\draw[very thick] (0,0,0) rectangle (2,2,0);
	\draw[very thick] (0,0,2) rectangle (2,2,2);
 
	%draw the edges of the cube
	\draw[very thick] (0,0,0) -- (0,0,2);
	\draw[very thick] (0,2,0) -- (0,2,2);
	\draw[very thick] (2,0,0) -- (2,0,2);
	\draw[very thick] (2,2,0) -- (2,2,2);
 
\end{tikzpicture}
\tdplotsetmaincoords{60}{125}
\begin{tikzpicture}[tdplot_main_coords]
	%draw a grid in the x-y plane
	\foreach \x in {-0.5,0,...,2.5}
		\foreach \y in {-0.5,0,...,2.5}
		{
			\draw[gray,very thin] (\x,-0.5) -- (\x,2.5);
			\draw[gray,very thin] (-0.5,\y) -- (2.5,\y);
		}
 
	%draw the axes
	\draw[->] (0,0,0) -- (3,0,0) node[anchor=west]{$x$};
	\draw[->] (0,0,0) -- (0,3,0) node[anchor=west]{$y$};
	\draw[->] (0,0,0) -- (0,0,3) node[anchor=west]{$z$};
 
	%draw the top and bottom of the cube
	\draw[very thick] (0,0,0) rectangle (2,2,0);
	\draw[very thick] (0,0,2) rectangle (2,2,2);
 
	%draw the edges of the cube
	\draw[very thick] (0,0,0) -- (0,0,2);
	\draw[very thick] (0,2,0) -- (0,2,2);
	\draw[very thick] (2,0,0) -- (2,0,2);
	\draw[very thick] (2,2,0) -- (2,2,2);
 
\end{tikzpicture}

Limitations

If you look carefully at the rotated diagram, you’ll notice that the rectangles are drawn with the correct start and end points, but the overall shape conforms to the original coordinate system of the page rather than to the rotated frame.
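The workaround is the approach already used in the cube examples above: build each face from explicit line segments, which are transformed point by point. For example, to replace the first rectangle:

	%instead of: \draw[very thick] (0,0,0) rectangle (2,2,0);
	\draw[very thick] (0,0,0) -- (2,0,0) -- (2,2,0) -- (0,2,0) -- cycle;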

For More Information

For more information, I recommend having a look at the package documentation, available at www.ctan.org/pkg/tikz-3dplot.


About the Author: Jeff Hein is the author of the tikz-3dplot package. He maintains a blog for this package at tikz3dplot.wordpress.com.

by Jeff Hein at Saturday, 2020-06-06 06:31

2020-05-28

LaTeX Project

First prerelease of LaTeX 2020-10-01 is available for testing

The first LaTeX prerelease for 2020-10-01 is available for testing

A few days ago we submitted a new LaTeX development format [1] to CTAN, and by now it should be available to all users of MiKTeX or TeX Live (on any operating system).

This format allows you to test the upcoming LaTeX release scheduled for 2020-10-01 with your documents or packages. Such testing is particularly important for package maintainers to verify that changes to the core LaTeX haven’t introduced incompatibilities with existing code. We try to identify any such problem beforehand, but such an undertaking is necessarily incomplete, which is why we ask for user testing.

Besides developers, we also ask ordinary users to try out the new release candidate: the more people who test the new format, the higher the chances that any hidden problems are identified before the final release in October hits the streets.

Processing your documents with the prerelease is straightforward. All you have to do is to replace the invocation command by appending -dev to the executable, e.g., on the command line you would run

pdflatex-dev myfile    or    lualatex-dev myfile    or    xelatex-dev myfile

instead of using pdflatex, lualatex or xelatex. If you use an integrated editing environment, how to configure it to use an alternative format depends on the system, but in any case the necessary modification should be straightforward.

Main features of the first prerelease for 2020-10-01

We have been quite busy (did Corona help?), so this first prerelease already contains close to thirty smaller and larger fixes and enhancements. A full list is given in a draft version of ltnews32, which you should be able to read by running

texdoc ltnews32

on the command line (or by any other means available on your operating system—somewhere there should be a file called ltnews32.pdf that you can open with a PDF reader). The draft version is also available from our website as LaTeX2e News Issue 32 draft.

In this post I touch on only two of the most important topics, but many others are worth exploring too, so please check out that documentation.

Providing xparse as part of the format

In the previous release we added the LaTeX3 programming layer to the LaTeX format to improve the loading speed when packages using expl3 are used (such as fontspec or xparse). In the upcoming release we are now extending this support by integrating xparse so that the extended interface for defining document-level commands becomes available out of the box.

This enables users and, most importantly, package developers to define LaTeX commands with multiple optional arguments or other extended syntax features with ease. For details, check out the xparse documentation, e.g., via texdoc xparse.

Improving the font series handling

In the previous release we extended NFSS (the new font selection scheme) to better support modern fonts that offer different font faces, e.g., condensed, semi-bold, etc., and make them work seamlessly with each other. Experiences with the extended interface showed that for some use cases adequate support was still missing or that in special setups the algorithms sometimes selected a wrong font series value. These cases have now been resolved and additional support commands have been added. For example, with

\IfFontSeriesContextTF{〈context〉} {〈true code〉}{〈false code〉}

you can now define commands that behave differently depending on the current font series context. The 〈context〉 to check has to be specified as either bf or md. The command then chooses the 〈true code〉 or the 〈false code〉 based on where it is used (e.g., inside \textbf (or \bfseries) or not).
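As a minimal sketch (the command name \demoterm is purely illustrative, not part of the release), such a context-sensitive command could look like this:

\newcommand\demoterm[1]{%
  \IfFontSeriesContextTF{bf}
    {\textmd{#1}}% inside bold material: drop back to medium
    {\textbf{#1}}% otherwise: emphasize with bold
}

\demoterm{note} versus \textbf{a \demoterm{note} inside bold text}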

Outlook

We issue the first prerelease now in the hope that you will help by making sure that all the enhancements and fixes inside are safe and without any undesired side effects, so please help with the testing if you can.

We expect further extensions (currently under development) to be added in a second prerelease, in particular a general hook management system for LaTeX, so stay tuned.

Enjoy — Frank

  1. The internal version number for the pre-release is LaTeX2e <2020-10-01> pre-release-6, the earlier prereleases just mirrored the patch releases we did for 2020-02-02. 

Thursday, 2020-05-28 00:00

2020-05-24

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Beamer template for (STEM) lectures

Today I uploaded the first version of a Beamer template for lectures to GitHub. The goal was to be able to generate both individual modules (lectures) and the complete slide set. The template is not only suitable for STEM courses, but my focus was on cleanly embedding source code, which is perhaps not the main concern in humanities lectures.

The files can be found at https://github.com/UweZiegenhagen/MINT-Lecture-Slide-Template

Pull requests, comments, and suggestions are welcome.

Next, I will add examples for the commands I have defined, which allow simplified embedding of graphics and source code.

Uwe

Uwe Ziegenhagen likes LaTeX and Python, sometimes even combined. Do you like my content and would like to thank me for it? Consider making a small donation to my local fablab, the Dingfabrik Köln. Details on how to donate can be found here: Spenden für die Dingfabrik.

by Uwe at Sunday, 2020-05-24 14:07

2020-05-22

LaTeX.net

goLaTeX Forum Updated

Also available in German: TeX.co


In April I did a pretty big software update on goLaTeX.de, and have added more features since then.

What’s new? For example:

  • The display adjusts automatically, so it is much more readable on smartphones. (before vs. now)
  • Embedding images is easier
  • More options for faster navigation
  • Better protection against spam (thanks to stopforumspam.com)
  • Internal notification system (as an alternative to email)
  • Markdown support for easier input

For interested readers there’s a short description of the update procedure.

Thanks to DANTE for supporting the server operation!


Photo by Markus Winkler on Unsplash

by Stefan Kottwitz at Friday, 2020-05-22 15:44

2020-05-20

LaTeX.net

DANTE

DANTE, “Deutschsprachige Anwendervereinigung TeX e.V.”, is a registered non-profit association of TeX users. It was founded on April 14, 1989, in Heidelberg, Germany.

Its purpose is to support TeX and LaTeX, mainly for German-speaking users, by providing information, software, and support.

DANTE is an important supporter of the international TeX world too, for example by providing the central services for the Comprehensive TeX Archive Network (CTAN). Furthermore, it supports projects such as font development, LaTeX development, and user and developer meetings.

DANTE also supports web forums, FAQ sites, wikis, and other sites by covering the recurring server hosting costs, as it has for years. Thanks in the name of thousands of registered users and countless anonymous LaTeX web surfers!

Photo by Raphael Schaller on Unsplash

by Stefan Kottwitz at Wednesday, 2020-05-20 00:00

2020-05-15

Thomas Schramm

Football half-year calendar 2020 with Bundesliga ghost games

Interest in the Bundesliga’s ghost games (matches played without spectators) is likely to be limited, but by updating the half-year calendar one can illustrate the expected patchwork of fixtures quite well. A lot is likely to change here, and for the international matches there are currently no dates at all. Those will likewise be updated once they are announced.

by Thomas Schramm at Friday, 2020-05-15 16:02

2020-04-27

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Typesetting task lists with the tasks package

Today, an example of how to typeset task lists with the tasks package. As a next step, I will fill the lists from a Trello export.

\documentclass[12pt,ngerman]{scrartcl}
\usepackage[utf8]{inputenc}
\usepackage[T1]{fontenc}
\usepackage{babel}
\usepackage{tasks}
\usepackage{fontawesome}

\NewTasksEnvironment[label=\faHandORight,label-width=15pt]{todo}[*](1)
\NewTasksEnvironment[label=\faHandRockO,label-width=15pt]{progress}[*](1)
\NewTasksEnvironment[label=\faThumbsOUp,label-width=15pt]{done}[*](1)

\begin{document}

\section*{TODO}

\begin{todo}
* Blumen gießen
* Einkaufen gehen
* Zeitschriften sortieren
\end{todo}

\section*{PROGRESSING}

\begin{progress}
* Keller aufräumen
* Bücher sortieren
* Rechner neu installieren
\end{progress}

\section*{DONE}

\begin{done}
* Steuererklärung
* Server neu installieren
* SSH-Zugang einrichten
\end{done}

\end{document}

Uwe

by Uwe at Monday, 2020-04-27 19:52

2020-04-26

TUG

TUGboat 41:1 published

TUGboat volume 41, number 1, a regular issue, has been mailed to TUG members. (Mail delivery will be subject to unknown delays due to the global health situation, but the physical issues are in the pipeline, at least.) It is also available online and from the TUG store. In addition, prior TUGboat issue 40:3, a regular issue, is now publicly available. Please consider joining or renewing your TUG membership if you haven't already, and thanks.

Sunday, 2020-04-26 16:58

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Generating month calendars for LaTeX with Python

Here is an example of how to generate small month calendars with Python. This could be done with LaTeX alone, but I want to produce various output formats (Markdown, HTML, etc.) and keep full control over the code while doing so.

# -*- coding: utf-8 -*-

import calendar
import datetime

def number_of_weeks(year, month):
    """
        Returns a tuple with the ISO numbers of the first and last week
        of the month and the number of weeks it spans
    """
    tup_month_days = calendar.monthrange(year, month)
    first = datetime.date(year, month, 1)
    last = datetime.date(year, month, tup_month_days[1])
    first_week = first.isocalendar()[1]
    last_week = last.isocalendar()[1]
    return (first_week, last_week, last_week-first_week+1)

def gen_cal_latex(year, month):
    """
        https://stackoverflow.com/questions/9459337/assign-value-to-an-individual-cell-in-a-two-dimensional-python-array
    """
    c = calendar.TextCalendar()
    week_month_first, week_month_last, no_of_weeks = number_of_weeks(year, month)

    # generate the calendar dict, using a (week row, weekday) tuple as key
    m = {(i, j):' ' for i in range(no_of_weeks) for j in range(7)}

    for tuple_date in c.itermonthdays4(year, month):
        t_year, t_month, t_day, t_weekday = tuple_date
        # use only dates inside the required month
        if t_month == month:
            temp_date = datetime.date(t_year, t_month, t_day)
            # check in which week we are with the current date
            # to get the row index for the dict
            week_no = temp_date.isocalendar()[1]
            m[week_no - week_month_first, t_weekday] = t_day

    print(r'\begin{tabular}{rrrrrrr}')
    print(r'Mo & Di & Mi & Do & Fr & Sa & So \\')
    for i in m:
        if i[1] < 6:
            print('{0} &'.format(m[i]), end='')
        else:
            print('{0}'.format(m[i]),end='')
        if i[1] == 6:
            print(r'\\')
    print(r'\end{tabular}')

gen_cal_latex(2020, 4)

This produces small month calendars of the form

\begin{tabular}{rrrrrrr}
Mo & Di & Mi & Do & Fr & Sa & So \\
  &  &1 &2 &3 &4 &5\\
6 &7 &8 &9 &10 &11 &12\\
13 &14 &15 &16 &17 &18 &19\\
20 &21 &22 &23 &24 &25 &26\\
27 &28 &29 &30 &  &  & \\
\end{tabular}

The code can be copied into a LaTeX document via copy & paste; of course, all of it can also be written directly to a LaTeX file.

Uwe

by Uwe at Sunday, 2020-04-26 16:42

2020-04-19

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Exporting Trello boards with py-trello

Here is some example code for exporting Trello boards. The code below only prints to stdout; code for LaTeX and HTML/MD output will be added to the GitHub repository https://github.com/UweZiegenhagen/python-trello-output.

from trello import TrelloClient # pip install py-trello

client = TrelloClient(
    api_key='',
    token=''
)

def list_all_boards(client):
    """
        get list of all boards to determine the ID
        for further functions
    """
    all_boards = client.list_boards()
    for counter, board in enumerate(all_boards):
        print(counter, board.name)

## uncomment if needed
# list_all_boards(client)

def print_cards_from_board(board_id, client):
    """
        Access board with ID board_id in the client instance
        and print all non-archived lists with their non-archived cards 
    """
    all_boards = client.list_boards()
    my_board = all_boards[board_id] # 15 = my someday projects
    all_lists_on_board = my_board.list_lists()

    # print each open list with its open cards
    for trello_list in all_lists_on_board:
        if not trello_list.closed:
            for card in trello_list.list_cards():
                if not card.closed:
                    print(trello_list.name, ':', card.name)
                
print_cards_from_board(15, client)

Uwe

by Uwe at Sunday, 2020-04-19 11:58

2020-04-15

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Visualizing TeX document structures with Graphviz and Python

Here is a code snippet for visualizing the inputs and includes of LaTeX documents. It is still somewhat rudimentary and hard-coded; in the next few days I will put the code into a GitHub repo and tidy it up a bit. Currently a file Master.tex is expected, and a graph.dot file is written as output. The script works recursively through the TeX files and looks for \input, \include, \includegraphics and \lstinputlisting commands.

# -*- coding: utf-8 -*-
import re

nodes = []

# which commands indicate following
commandsToFollow = ('input', 'include')

def find_ext_references(somefile):
    with open(somefile) as file:
        filecontent = file.readlines()
        for i in filecontent:
            search_results = re.findall(r"(\\)(includegraphics|include|lstinputlisting|input)(\[?.*\]?{)(.+?)(})", i)
            for j in search_results:
                print(j)
                nodes.append((somefile, j[3], j[1]))
                if j[1].endswith(commandsToFollow):
                    find_ext_references(j[3]+'.tex') # assume that no extension is used for input/include
               
find_ext_references('Master.tex')

print(nodes)

if len(nodes)>0:
    with open('graph.dot','w') as output:
            output.write('digraph IncludesInputs {\n')
            output.write('node [shape=box];\n\n')
            for k in nodes:
                if k[2].endswith(commandsToFollow):
                    output.write('"'+k[0] + '"->"' + k[1] + '.tex" [color="green"];\n')
                elif k[2].endswith('graphics'):
                    output.write('"'+k[0] + '"->"' + k[1] + '" [color="blue"];\n')         
                elif k[2].endswith('listing'):
                    output.write('"'+k[0] + '"->"' + k[1] + '" [color="red"];\n')                             
                   
            output.write('}')

If you then process graph.dot with the dot executable from Graphviz, you obtain the following graph for a small example. (Example call: dot -Tpng graph.dot)

Uwe

by Uwe at Wednesday, 2020-04-15 19:54

2020-04-13

TUG

TeX Collection 2020 and TeX Live 2020 released

TeX Live 2020 has been released, along with the entire TeX Collection 2020, which also includes MacTeX, proTeXt, and a CTAN snapshot. The DVD is in production and will be mailed to members of TUG and most other TeX user groups when manufacturing is complete (hopefully this summer, depending on the global health situation). The software can also be downloaded in various ways, and the DVD can be ordered from the TUG store. Please consider joining or renewing your membership in TUG or another TeX user group, and thanks if you already have. Thanks to all the contributors.

Monday, 2020-04-13 16:44

2020-04-11

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Designing a gift voucher with LaTeX

Here is the code for designing a gift voucher with LaTeX. The code is taken largely from the pgfornament manual. It is not “minimal”; I will deliver that at some point.

\documentclass[ngerman, a5paper]{article} 
\usepackage[utf8]{inputenc}
\usepackage[T1]{fontenc}
\PassOptionsToPackage{dvipsnames,svgnames}{xcolor}  
\usepackage{graphicx,rotating} 
\usepackage[object=vectorian]{pgfornament}
\usepackage{tkzexample,tikzrput,pict2e,picture} 
\usetikzlibrary{shapes.geometric,calc} 
\usepackage{eso-pic,calc} 
\usepackage{fancyvrb}
\fvset{fontsize=\normalsize}  

\makeatletter
\AddToShipoutPicture{%
  \begingroup 
    \setlength{\@tempdima}{2mm}%
    \setlength{\@tempdimb}{\paperwidth-\@tempdima-2cm}%
    \setlength{\@tempdimc}{\paperheight-\@tempdima}%
    \put(\LenToUnit{\@tempdima},\LenToUnit{\@tempdimc}){%
            \pgfornament[color=Maroon,anchor=north west,width=2cm]{63}} 
    \put(\LenToUnit{\@tempdima},\LenToUnit{\@tempdima}){%
            \pgfornament[color=Maroon,anchor=south west,width=2cm,symmetry=h]{63}}
    \put(\LenToUnit{\@tempdimb},\LenToUnit{\@tempdimc}){%
            \pgfornament[color=Maroon,anchor=north east,width=2cm,symmetry=v]{63}} 
    \put(\LenToUnit{\@tempdimb},\LenToUnit{\@tempdima}){%
            \pgfornament[color=Maroon,anchor=south east,width=2cm,symmetry=c]{63}}    
  \endgroup  
} 
\let\strippt\strip@pt     
\makeatother   
\newcommand{\eachpageornament}{%
\begin{picture}(0,0)
\put(0,0){\pgfornament[width=1cm]{41}};
\put(\strippt\textwidth,0){\pgfornament[width=1cm,symmetry=v]{41}}; 
\put(0,-\strippt\textheight){\pgfornament[width=1cm,symmetry=h]{41}}; 
\put(\strippt\textwidth,-\strippt\textheight){\pgfornament[width=1cm,symmetry=c]{41}};  % 
\end{picture}}   

 \usepackage{aurical}
\setkeys{Gin}{width=\linewidth,totalheight=\textheight,keepaspectratio}
\usepackage{array,booktabs} % book-quality tables
\usepackage{multicol} % multiple column layout facilities
\usepackage[babel=true]{microtype}
\usepackage[ngerman]{babel}   
\usepackage{blindtext} 
\pagestyle{empty}

\begin{document}
 \Fontlukas
 
\begin{center}
{\Huge Gutschein}
\end{center}

\large\noindent \blindtext

\vspace*{1.5em}\noindent Max\hfill Moritz

\end{document}  

Uwe

by Uwe at Saturday, 2020-04-11 15:26

2020-04-10

TeX & Friends (by Jürgen Fenn)

TeX Live 2020 released

TeX Live 2020 is finished and has now been officially released as well. It will take up to a week until all CTAN mirror servers carry the new version. All the news has been summarized here; that link spares me a retelling. Most users will probably not notice the changes, and those who are affected will be well served by the link.

Specifically for the Mac, note that MacTeX 2020 runs on macOS 10.13 and higher. As in previous years, only the last three macOS platforms still maintained by Apple are supported. Anyone running an older platform must therefore use x86_64-darwinlegacy; those binaries require Mac OS X 10.6 or higher.

Since TeX Live Utility and BibDesk have not (yet) been notarized, they are not part of MacTeX and must be downloaded separately (we remember Dick Koch’s posting from last summer). More details can be found in the MacTeX README and in the file MISSING APPS.pdf. Anyone who had already installed these two programs in Applications/TeX from an earlier version of MacTeX will still find them there and can continue to use them. Like all parts of the distribution, they should be kept up to date, and at least the use of TeX Live Utility is strongly recommended for the TeXnical underpinnings, since it is an enormous relief compared with the command-line package manager tlmgr.

Anyone who has not followed LaTeX development for some time should urgently read the LaTeX News, at least from issue 28 onwards, when UTF-8 became the default input encoding. Two months ago, large parts of the LaTeX3 interface expl3 were integrated into the LaTeX kernel. NFSS was given some nice extensions. And LuaTeX has been combined with HarfBuzz.

by schneeschmelze at Friday, 2020-04-10 21:12

2020-04-05

STM Publishing: Tools, Technologies and Change (by Graham Douglas)

Site update

This site has been dormant since my last post back in December 2016 but, yesterday, I had to make some upgrades, including the PHP version used on this site. Unfortunately, those upgrades broke a few things—which I fixed just to get the site operational. However, for now, the site is running with a vanilla-style WordPress theme and, no doubt, some old posts need remedial work to fix them (I will get around to that…). However, upgrading/fixing the site re-kindled my interest in writing some new articles which I’ll try to do over the forthcoming weeks (time permitting!).

For several years I had not been checking the contact e-mail address I use for this site—due to the tsunami of junk e-mail, despite efforts to block spam. Now, I’ve simply removed the contact page which is not ideal but if you run a blog you will know the curse of spammers only too well. Sigh. However, I braved the task of checking that backlog list of e-mails (thousands!) and found a few legitimate e-mails that I never replied to, so my sincere apologies for that.

Since my last post I was fortunate to secure a new job with Overleaf where, among other things, I write articles about various TeX-related topics. Here, on this site, I will probably write posts which cover various eclectic topics which reflect whatever has caught my interest at the time.

Until the next post, stay safe and take all necessary precautions to protect yourself from the dreadful coronavirus (COVID-19) pandemic. Cheers!

by Graham Douglas at Sunday, 2020-04-05 11:10

2020-04-01

TeX & Friends (by Jürgen Fenn)

TeX Live 2020 pretest concluded

TeX Live 2020 has been frozen. The release was originally planned for April 10, 2020, in any case at Easter, at the end of Holy Week.

by schneeschmelze at Wednesday, 2020-04-01 04:22

2020-03-26

TUG

TUG early bird deadline extended to April 30

We have extended the deadline for the TUG early bird membership discount to April 30. Thanks so much to everyone who has already renewed or joined.
Two bits of news, while we're here: 1) The first issue of TUGboat for 2020 is well underway, and we will happily accept submissions for a few more days; 2) TeX Live 2020 is in the last days of pretesting before the release, so give it a try if you like.
Happy TeXing to all.

Thursday, 2020-03-26 17:45

2020-03-20

2020-03-12

2020-03-11

Beautiful Type

Trying to come back on track 😅 with that beautiful H from...



Trying to come back on track 😅 with that beautiful H from @a_letter_a_day https://ift.tt/2Q4vSEE

Wednesday, 2020-03-11 10:40

2020-03-03

TeX & Friends (by Jürgen Fenn)

TeX Live 2020 pretest has begun

Following the freeze of TeX Live 2019, Karl Berry has now announced the start of the pretest for TeX Live 2020. Everything else can be found on the corresponding page, where the news is also being collected. The master copy of the release notes can be read here in a web browser. Everything a bit step by step this year, I would say.

MacTeX 2020 requires at least macOS 10.13 High Sierra; the x86_64-darwinlegacy binaries run from Mac OS X 10.6 onwards. BibDesk and TeX Live Utility are not included in MacTeX this time because they are not notarized. They must be downloaded separately.

by schneeschmelze at Tuesday, 2020-03-03 05:22

2020-03-01

TeX & Friends (by Jürgen Fenn)

TeX Live 2019 frozen, and the LaTeX spring release

Karl Berry announced on the TeX Live mailing list that the distribution is now frozen; the package manager tlmgr already indicates this. Nevertheless, a few additional updates will still be installed, such as the new LaTeX version that has just been uploaded. The pretest for TeX Live 2020 is imminent.

The LaTeX spring release just mentioned (which had escaped my notice until now) brings two main innovations:

  • LaTeX3 no longer has to be loaded separately; it is now available by default in the LaTeX format, which should speed up the processing of features that require expl3. That includes all documents processed with a Unicode engine (LuaTeX, XeTeX). It should also bring the goal of producing tagged and accessible PDFs with LaTeX a step closer.

  • Furthermore, the New Font Selection Scheme (NFSS) has been modernized. Font shapes that previously could not be selected through NFSS are now handled canonically, for example italic small caps or a series of condensed weights.

More about this can be found in LaTeX News no. 31.

by schneeschmelze at Sunday, 2020-03-01 19:35

2020-02-25

2020-02-23

Reden ist Silber ¶ Druckerey Blog (Martin Z. Schröder, Drucker)

Machine damage on the platen press: thinking for a week [1]

When I switched on the machine in the morning, the Original Heidelberg platen press, built in 1952 and in almost daily use for everything printed with black ink, there was a scratching, screeching, rattling.

It came from the direction of the belt pulleys, on which the V-belt driving the flywheel is driven by the motor. I took off the cover and found underneath a broken compression spring and a worn-through pulley. In the photo (enlarged by clicking) you can see the larger part of the spring, which I had teased off the shaft, and next to it the cover with the milled slot. How long does it take for a steel spring to mill through an iron plate? Six years? Twelve? Fifty? This Heidelberg platen press is now 68 years old.

I had wondered about the gray powder that collected on the base plate in recent years, but I did not suspect the cause and just wiped it away. A cleverer printer than me would have looked under the cover sooner.

I was lucky that the machine expert Herbert Wrede in Bremen, to whom I owe both machines, still had the two spare parts: the broken spring and the worn-through pulley. He also sent me a tool to screw the cover back on precisely. But how do I get the pulley onto the steel spring? Press and screw, said the colleague. And a second man holds the shaft steady at the belt pulley.

However, the thread with which the pulley is screwed onto the shaft is very fine. And when you compress a steel spring, you have little feeling left for setting the thread on exactly. So I pondered how to get the compressed spring onto the spindle. I approached the first attempts with due respect. A thirty-centimeter steel compression spring can become a weapon. And still it hit me in the ball of the thumb. With a plaster and gloves it went on. The cable ties kept slipping off. Then I managed to compress the spring with lashing straps, first with four, then two remained. But the spring had hardly any play on the spindle of the pulley; even with two straps the spring did not fit onto the spindle.

I corresponded and telephoned with a total of five colleagues. I asked the Facebook group »Heidelberger Tiegel«. Tools and techniques were recommended to me, but none of them were usable. This spring breaks so rarely that nobody could advise me from their own experience. I called a locksmith, who wanted to come the next morning. That morning, still in bed, I sensed more clearly than before that I had overlooked something.

Then I stood in front of the machine, and suddenly the penny dropped: instead of compressing the spring in one piece, I should give it more room. I would create more room by taking off the V-belt, which pushes the pulleys apart. That frees up a piece of the shaft from the pulley’s spindle, and it should be enough to put on a half-compressed spring and screw on the cover without spring pressure. I called off the visit of the friendly locksmith Dudek senior (his workshop in Weißensee had already milled parts for the restoration of Boston presses for me and cut compression springs) and set to work.

It went quickly. After half an hour everything was done. I first compressed the spring with lashing straps, then put on cable ties and removed the straps. With one hand I could screw on the pulley and, with a rubber mallet and a gripper, tighten it one-handed just enough for the locking screw to be inserted, while holding one of the belt pulleys, and thus the shaft, with the other hand.

Finally, after a test run, the cable ties were cut off and the cover put back on. The machine now starts up much more quietly than before. The grinding noise of the broken spring had accompanied me for years without my recognizing it; every platen press sounds a bit different anyway. Now the machine starts more smoothly. First, I was glad that my press runs again. Second, I was proud of my feat of thinking. I am not a mechanical engineer. I am certainly a competent compositor and also a printer, but I stare into motors and gears like a donkey. So I was pleased to have found the solution on my own this time. And third, the solution turned out to be so simple that I was ashamed to have needed a week for it.

But the joy was greater than any I had known as the result of a feat of thinking. That got me pondering again. How important is constructive thinking, actually? Why is there no school instruction in construction, apart from a bit of physics and electronics? What I found interesting about this thought process was that widening the field brought the solution. I finally understood concretely what it means to work on a system, because the process was so striking: first not seeing the solution, then recognizing its simplicity. For days I had stared only at the spring and the spindle, trying to find the solution there. When I widened my horizon and looked at what lay behind the spindle, the solution struck me at once. Everything interlocks; even when printing you have various settings to make: blowing air onto the paper stack, transporting the stack as sheets are taken off the top, tilting the suction nozzles, adjusting the lay marks, the speed, which affects the ink transfer and how the sheet meets the marks, setting the ink consistency and quantity, dusting, impression adjustment, makeready. If just one of these is off, the product is less good.

by Martin Z. Schröder at Sunday, 2020-02-23 18:10

2020-02-02

LaTeX Project

Spring 2020 LaTeX release available

The spring 2020 LaTeX release is available

When we realized that the next LaTeX release was scheduled close to a rather memorable day, we decided to delay the release by one day so that it will now appear officially on February the 2nd 2020 (or on 20 20 02 02 if you use the ISO-date and arrange the spaces somewhat unconventionally). This release introduces a number of important enhancements and so being able to use this date is a rather nice opportunity to mark its importance.

LaTeX now contains the L3 programming layer as part of its format

This results in noticeable speed improvements for many documents, because these days they often (implicitly) load expl3 at some point in the preamble. This is the case for virtually all documents processed with a Unicode engine, but also true for many pdfTeX documents, because more and more packages use xparse or other expl3-based code. Also important (and maybe more so), it will enable us to successfully tackle large and difficult problems, such as automatically producing tagged and accessible documents, in the near future.

The LaTeX Font Selection Scheme (NFSS) got a face lift

It now integrates some important enhancements that have been developed in recent years, including ideas from the packages fontspec, fontaxes and mweights. Besides other improvements, this will better support the new font families made available for use with LaTeX, because a large number of them provide additional font faces (e.g., small caps italic or a condensed series) that could not be supported in a natural way by the old NFSS.

Where to learn more …

The new features and most of the important bug fixes made in this release are documented in “LaTeX2e News Issue 31”. This document can be found on the LaTeX2e news page, where you will also find release information for earlier LaTeX releases.

Topics are:

  • Experiences with the LaTeX-dev formats
  • Concerning this release … (LuaLaTeX engine)
  • Improved load-times for expl3
  • Improvements to LaTeX font selection: NFSS
  • Other changes to the LaTeX kernel
  • Changes to packages in the graphics category
  • LaTeX requirements on engine primitives

Happy LaTeXing — Frank

Sunday, 2020-02-02 00:00

2020-01-31

LaTeX Project

Issues 11 and 12 of LaTeX3 news released

Issues 11 and 12 of LaTeX3 news released

There has been quite a gap since the last LaTeX3 News, and there is a lot to report. By mistake, LaTeX3 News 11 (nominally February 2018) did not get published when written, and we discovered this only when we prepared issue 12. We have kept the information it contains separate, as it is a good summary of the work that happened in 2017.

LaTeX3 News 12 then covers the work done in 2018 and 2019.

We also added a document that combines all twelve LaTeX3 newsletters in historical order. This gives some interesting insights into the activities of the last decade and clearly shows the team’s increased activity in recent years, now that expl3 (the programming layer of LaTeX3) gains more and more followers.

You will find all newsletters as individual PDF files on the LaTeX3 newsletter page on this site as well.

Enjoy — Frank

Friday, 2020-01-31 00:00

2020-01-17

LaTeX Project

Second prerelease of LaTeX 2020-02-02 is available for testing

The second (and final) LaTeX prerelease for 2020-02-02 is available for testing

A few days ago we submitted a new LaTeX development format to CTAN, and by now it should be available to all users of MiKTeX or TeX Live (on any operating system).

This format allows you to test the upcoming LaTeX release scheduled for 2020-02-02 with your documents or packages. Such testing is particularly important for package maintainers to verify that changes to the core LaTeX haven’t introduced incompatibilities with existing code. We try to identify any such problem beforehand, but such an undertaking is necessarily incomplete, which is why we ask for user testing.

Besides developers we also ask ordinary users to try out the new release candidate, because the more people who test the new format, the higher the chances that any hidden problems are identified before the final release in February hits the streets.

Processing your documents with the prerelease is straightforward. All you have to do is to replace the invocation command by appending -dev to the executable, e.g., on the command line you would run

pdflatex-dev myfile    or    lualatex-dev myfile    or    xelatex-dev myfile

instead of using pdflatex, lualatex or xelatex. If you use an integrated editing environment, how to configure it to use an alternative format depends on the system, but in any case the necessary modification should be straightforward.

Please test now — this is the final pre-release prior to the main 2020-02-02 release

We ask you to test now, because this pre-release (with the exception of some further documentation improvements) represents the 2020-02-02 release in the form we want to roll it out in February.

If you encounter any issues with it, please open a bug report at our Issue Tracker for LaTeX2e.

Main features of 2020-02-02 prerelease 3

For technical reasons this second prerelease is labeled “prerelease 3”. Prerelease 1 existed only for a few hours on CTAN, as it was missing some files due to a problem in the build process, but because it was already on CTAN we had to increase the prerelease number.

Improvements for selecting and managing modern fonts

Many modern fonts available these days offer additional font faces, e.g., small caps italics or a condensed font series. To better support these fonts, NFSS (the New Font Selection Scheme for LaTeX) was extended by incorporating ideas from the mweights, fontaxes and fontspec packages. In addition, all symbols formerly provided through the textcomp package are now available out of the box, and the use of \oldstylenums has been improved.

We also fixed a number of smaller bugs. A detailed description of the new features and the bug fixes can be found in a draft version of ltnews31 which you can access via texdoc ltnews31 if the pre-release is installed on your computer.

Main features already available in the previous pre-release

Below is a summary of the main features that were made available earlier in the first pre-release.

A new LuaTeX engine coming up …

In TeXLive 2020 the LuaLaTeX format will always use the new LuaHBTeX engine, which is LuaTeX with an embedded HarfBuzz library. HarfBuzz can be used by setting a suitable renderer in the font declaration. An interface for that is provided by the fontspec package. This additional font renderer will greatly improve the shaping of various scripts, which are currently handled correctly only by XeTeX.

To simplify testing of the new engine, the necessary executables have been added to MiKTeX and TeXLive 2019, and both have changed the LuaLaTeX-dev format to use it. This means you can already test the new engine now by using the prerelease!

Even if you have no need for the new HarfBuzz functionality, it might be worthwhile running some tests, because from 2020 onwards this will be the only LuaTeX engine for which a LaTeX format is distributed in TeXLive and MiKTeX.

Improved load-times for expl3

The LaTeX3 programming layer, expl3, has over the past decade moved from being largely experimental to broadly stable. It is now used in a significant number of third-party packages, either explicitly or implicitly (e.g., by loading xparse to define user command interfaces), so that many documents load the expl3 code at some point in the preamble. Most LaTeX documents compiled using XeTeX or LuaTeX load fontspec, which is written using expl3, so with these engines it is nearly always used.

The expl3 layer contains a non-trivial number of macros, and when used with the XeTeX and LuaTeX engines, it loads a large body of Unicode data. This means that even on a fast computer, there is a relatively large load time penalty whenever expl3 is needed.

For this release, the team have made adjustments in the LaTeX 2e kernel to pre-load a significant portion of expl3 when the format is built. This is transparent at the user level, other than the significant decrease in document processing time: there will be no “pause” for loading Unicode data files. Loading of expl3 in documents and packages can be done as usual; it will, at some future time, become possible to omit

\RequirePackage{expl3}

entirely, but, to support older formats, this explicit loading is at present still recommended.

The prereleases will never go stale in the future (we hope)

The release policy we will follow should ensure that the -dev releases are never older than the main release. When a full 2020-02-02 release is made, the -dev release will be updated to be the same format, or possibly a pre-release of the next release. It will not be left as a pre-release of 2020-02-02 at that point.

More details on prereleases please …

More details and some background information about these concepts and the process of the development formats is available in a TUGboat article:

The LaTeX release workflow and the LaTeX dev formats

  • Frank Mittelbach
  • TUGboat 40:2, 2019
  • Abstract

    How do you prevent creating banana software (i.e., software that gets ripe at the customer site)? By proper testing! But this is anything but easy.

    The paper will give an overview of the efforts made by the LaTeX Project Team over the years to provide high-quality software and explains the changes that we have made this summer to improve the situation further.


Enjoy — Frank

Friday, 2020-01-17 00:00

2020-01-15

Some TeX Developments (by Joseph Wright)

Case changing in expl3

A few years ago I wrote about the work the LaTeX team were doing on providing case changing functions in expl3. Since then, the code has been tested and revised, and very recently has moved to a ‘final’ home within expl3. It therefore seems like a good time to look again at what the challenges are and what tools we’ve provided.

It’s worth noting up-front that all of the expl3 functions work with UTF-8 input, and as far as possible case changing (and other text manipulation) follows the Unicode Consortium guidelines.

Different kinds of input, different kinds of case changing

To understand what functions we’ve provided for case changing, we first have to know what different types of input we might be dealing with. There are broadly two types

  • Text: material that we will want to typeset or similar, and which contains natural language content. This material might also have some formatting, and may be marked up as being in a particular language.
  • Strings: material used in code, for example as identifiers, to construct control sequences or to find files. This material will never have formatting, and should always give the same outcome, irrespective of the language a document is written in.

Unsurprisingly, case-changing strings is a lot more straightforward than case-changing text. They ‘live’ in different parts of the expl3 code, so I’ll look at them separately.

Strings

In TeX terms, a string is a series of characters which are all treated as ‘other’ tokens (except spaces, which are still spaces). That’s important here because it means strings won’t contain any control sequences, and because with pdfTeX there can’t be any (useful) accented characters.

The most obvious need to handle case in programming strings is when comparing in a caseless manner: ‘removing’ the case. Programmers often do that by lowercasing text, but there are places where that’s not right. For example, Greek has two forms of the lowercase sigma (σ and ς), and these should be treated as the same for a caseless test. Unicode define the correct operation: case folding. In expl3, that’s called \str_foldcase:n.

\exp_args:Ne \str_show:n
  { \str_foldcase:n { AbC } }

Much more rare is the need to upper- or lowercase a string. Unicode do not mention this at all, but in TeX we might want to construct a control sequence dynamically. To do that, we might want to uppercase the first character of some user input string, and lowercase the rest. We can do that by combining \str_uppercase:n and \str_lowercase:n with the \str_head:n and \str_tail:n functions:

\exp_args:Ne \str_show:n
  {
    \str_uppercase:f { \str_head:n { SomeThing } }
    \str_lowercase:f { \str_tail:n { SomeThing } }
  }

Text

The basics

Case changing text is much more complicated because it has to deal with control sequences, accents, math mode and context. The first step of case changing here is to expand the input as far as possible: that’s done using a function called \text_expand:n, which works very similarly to the LaTeX2e command \protected@edef but is expandable. We don’t really need to worry too much about this: it’s built into the case changing system anyway.

Upper- and lowercasing is quite straightforward: the functions have the natural names \text_uppercase:n and \text_lowercase:n. These deal correctly with things like the Greek final-sigma rule and (with LuaTeX and XeTeX) cover the full Unicode range.

% Try with XeTeX or LuaTeX
\exp_args:Ne \tl_show:n
  {
    \text_uppercase:n { Ragıp~Hulûsi~Özdem } ~
    \text_lowercase:n { ὈΔΥΣΣΕΎΣ }
  }

A variety of standard LaTeX accents and letter-like commands are set up for correct case changing with no user intervention required.

\exp_args:Ne \tl_show:n
  {
    \text_uppercase:n { \aa{}ngstr\"{o}m  ~ caf\'{e} }
  }

Case changing exceptions

There are places where case changing should not apply, most obviously to math mode material. There is a set of exceptions built in to the case changer, and that list can be extended: it’s easy to add the equivalent of \NoCaseChange from the textcase package.

\tl_put_right:Nn \l_text_case_exclude_arg_tl
  { \NoCaseChange }
\exp_args:Ne \tl_show:n
  {
    \text_uppercase:n { Hello ~ $y = max + c$ } ~
    \text_lowercase:n { \NoCaseChange { iPhone } ~ iPhone }
  }

Titlecasing

Commonly, people think about uppercasing the first character of some text then lowercasing the rest, for example to use it at the start of a sentence. Unicode describe this operation as titlecasing, as there are some situations where the ‘first character’ is handled in a non-standard way. Perhaps the best example is IJ in Dutch: it’s treated as a single ‘letter’, so both letters have to be uppercase at the start of a sentence. (We’ll come to language-dependence in a second.)

Depending on the exact nature of the input, we might want to titlecase the first ‘character’ then lowercase everything else, or we might want just to titlecase the first ‘character’ and leave everything else unchanged. These are called \text_titlecase:n and \text_titlecase_first:n, respectively.

\exp_args:Ne \tl_show:n
  {
    \text_titlecase:n { some~text } ~
    \text_titlecase:n { SOME~TEXT } ~
    \text_titlecase_first:n { some~text } ~
    \text_titlecase_first:n { SOME~TEXT }
  }

As we are not simply grabbing the first token of the input, non-letters are ignored and the first real text is case-changed.

\exp_args:Ne \tl_show:n
  {
    \text_titlecase:n { 'some~text' }
  }

Language-dependent functions

One important context for case changing text is the language the text is written in: there are special considerations for Dutch, Lithuanian, Turkic languages and Greek. That’s all handled by using versions of the case-changing functions that take a second argument: a BCP 47 string which can determine the path taken.

\exp_args:Ne \tl_show:n
  {
    \text_uppercase:n { Ragıp~Hulûsi~Özdem } ~
    \text_uppercase:nn { tr } { Ragıp~Hulûsi~Özdem }
  }
\exp_args:Ne \tl_show:n
  {
    \text_uppercase:n { ὈΔΥΣΣΕΎΣ} ~
    \text_uppercase:nn { el } { ὈΔΥΣΣΕΎΣ }
  }

Over time, mechanisms to link this behaviour to babel will be developed.

Conclusions

Case-changing functions in expl3 are now mature and stable, and ready for wider use. It’s likely that they will be made available as document-level commands in the medium term, but programmers can use them now to make handling this important aspect of text manipulation easier.

Wednesday, 2020-01-15 00:00

2020-01-07

Weblog von Markus Kohm

The new year brings more than just resolutions

Anyone who follows komascript.de closely has known for a while that several decisions were pending, and some still are. These are meant to bring urgently needed changes for me, but of course they also affect users. With the new year, or actually already shortly before Christmas, most of these decisions have matured.

For more than 25 years I have been developing and maintaining KOMA-Script more or less as a one-man show. That is not entirely true, of course, because I received my first support very early on. Back then Luzia took over uploading KOMA-Script to the CTAN server, because it was extremely cumbersome for me, via modem and a VT220 terminal connection to the university, to first transfer the package (which even then was not small) to my user account at the university and then send it by ftp to the CTAN server. Axel/Harald wrote the first complete KOMA-Script manual. The other Axel contributed the first letter class. Years later Jens-Uwe invested a great deal of time in supervising the revision of the manual and its publication as a book, and in contributing several chapters to the new manual. During that time Torsten became indispensable as an advisor in the development of a completely new letter class and in the continued correction of the manual. A few years later Elke became indispensable for the further development of the package, for the correction of the manual and the book edition, and for support in general. Falk, in turn, has for years been finding the bugs buried so deep in KOMA-Script that removing them gives me serious headaches time and again; the same goes for the many suggestions and extensions that go back to him. Since the DANTE anniversary meeting in Heidelberg, Uwe has taken over the handling of the CTAN uploads. After Elke, who roped in her whole family for the task, first Gernot among others, then Krickette and finally Karl took on the translation of the KOMA-Script manual and revised it massively. Many others joined the unofficial team over the decades; some have since disappeared from view again. Without all these comrades-in-arms, neither the work nor the recurring frustration could have been managed. An important role was of course also played by those who cheered me up and encouraged me in various ways. A central role was played by my family, who made it possible for me to put years into the development and maintenance of this monster in the first place.

Alongside all the work on KOMA-Script, I have also been providing general LaTeX support for more than 30 years. This area, too, has seen many changes over those 25 years. In the beginning, support happened by post and in a mailbox network in which transit times of 12–24 hours for e-mails and public posts were normal. Back then hardly anyone expected to receive a finished solution within a few hours, let alone minutes. Nowadays support mainly takes place in web forums. There are dozens of them, some with a very clear connection to LaTeX, some with a very fuzzy one. At times I was active in up to nine of these forums simultaneously, on top of mailing lists and Usenet groups. In several forums I also served as moderator or administrator. Web forums in particular strongly tempt you to occupy yourself with things that above all cost time and have little to do with my core competence, if I have one at all. Personally, beginners were and are especially close to my heart there. Time and again, even against my better judgement, I have tried to take them by the hand and help them through their more than bumpy start into the world of forums and the world of LaTeX. Sometimes that took a considerable toll on my nerves, because strictly speaking I am completely out of place in that role: I take it far too much to heart when I fail to get someone onto the road to success.

The work on the monster has also changed over the years. There, too, it is now sometimes expected that the solution to a problem not only falls from the sky, but hits the ground the very moment a vague inkling of the problem drifts like a gentle breeze through the treetops of the Kalahari. If someone did not point out the rustling of the leaves to me now and then, or draw my attention to the wind behind the movement of the air, I would often not even find the trees. On the other hand, the development of LaTeX, for example, has picked up so much speed after years of standstill that I am constantly panting after changes that affect KOMA-Script. In the early years of KOMA-Script I simply refused to touch LaTeX internals. Even so, I was occasionally run over by changes to LaTeX back then, when things that KOMA-Script itself offered were suddenly adopted into the LaTeX kernel. With KOMA-Script 3 it was unavoidable to touch some LaTeX internals, although even then I feared this might some day come back to bite me. That is now happening. At the same time, KOMA-Script has grown so huge that it takes considerable effort just to get an idea of what might have an effect where; the nature of that effect, and if need be the solution to the resulting problems, is then still far from found.

Despite cutting back my day job and finally withdrawing from it completely, all of this has for years led to a latent feeling of constant overload, a state I have known before and one I now try urgently to avoid. I have long known that neither my time nor my energy suffices to actually accomplish all the things I feel I should do, that are expected of me, and that I genuinely want to do. Quite a while ago I therefore turned my back on various forums. Sometimes that was rather hard; sometimes it was made much easier by unpleasant incidents such as the accusation of Nazism directed at more or less all German participants on TeX.SX. In addition, an injury, which I would like to call a sports injury but which is in fact more attributable to my vanity and impatience, has unfortunately been hindering my work at the computer for over a year now. At the end of October and the beginning of November I finally had to admit to myself that all attempts at compensation and all previous attempts at cutting back were not enough. The surprisingly urgent work on the seventh edition of the KOMA-Script book was enough to make me stumble. That in this situation I was also confronted with the flaring up of old hostilities, whose deeper cause I suspect lies in the licence discussions around the turn of the millennium, one of the wretched sideshows that the development of KOMA-Script unfortunately brought with it, did not exactly help.

In any case, from that point on it was clear that there has to be a radical change. I can no longer develop and maintain KOMA-Script, provide support for it and for all manner of LaTeX questions, get involved in the administration of TeX sites, write DTK articles, keep manuals up to date in two languages, completely rework a book on KOMA-Script every 18–24 months on average, and handle a large part of the administration work for komascript.de. And I no longer want to.

As some may know, my real interest lies in development. As explained above, I also feel responsible for helping users. The finest development is of little use if, for lack of documentation and of help beyond it, it finds no users. The finest development is of no use if the bugs it contains prevent its use. The finest development is of no use if external developments render it unusable. At the same time, new development always means new bugs and ever new demand for documentation and assistance. The place where I can change something fairly easily and with considerable effect is therefore development. Since that is unfortunately my main interest, I struggled far too long to accept the logical consequence: the development of new things in KOMA-Script, however nice they may be, however far the ideas for them may already have matured, has to stop!

The recently released KOMA-Script 3.28 therefore marks a change that is fundamental for me: everything that was planned, begun, or already present in rudimentary form or as a prototype is discontinued as of now. As a rule there will be no more extensions to KOMA-Script. Cosmetic changes and improvements of details remain possible if they are well founded. Alpha and beta packages and obsolete material will disappear from KOMA-Script. That explicitly includes tocstyle and scrpage2. For komamarks it has already happened. scrlayer-fancyhdr is a shaky candidate. scrhack, as one of the packages for which I constantly have to pant after the development of others, is also up for discussion. Already in KOMA-Script 3.27 I removed the workaround for titlesec from the classes; with that, a warning that had been issued for years became reality (again). Workarounds for other packages may follow, for example the one for the long-obsolete caption2 (of which I do not even know whether it still serves any purpose). Most of these workarounds were only intended as interim solutions, combined with the hope that the package authors concerned would themselves eventually support the KOMA-Script classes, which in the case of caption2 actually happened in the form of caption3 and caption. Had this happened across the board, the use of scrlfile by the classes would incidentally be superfluous, and I would have been spared one of the worst mishaps of last year.

Since, owing to external developments, compatibility of old documents with new TeX installations can no longer be guaranteed anyway, compatibility settings such as the option version are also under review. Removing them completely would, on the one hand, reduce the maintenance effort and the risk of overlooking something in future changes. On the other hand, the removal itself carries a certain risk of introducing errors. However, the existing option is of less and less use and would, moreover, be reduced to absurdity if it were ignored in future changes. A first step could therefore be to remove the documentation of the option and mark it as deprecated. The logical consequence of such a marking, in the next step, is then indeed its removal.

I will also reduce my public engagement in general LaTeX support, which is little known anyway, entirely as the mood takes me, at times down to zero, and in any case subject it to constant critical review. That I often invest more than an hour to list all the errors and inconsistencies I have spotted in some posted, non-minimal code is ultimately wasted time. Most of the askers concerned ignore these hints anyway, or vanish without a word, and faster than I can deliver my comments.

The aim of all this is to remain able to provide support and maintenance for KOMA-Script. Ideally, some time would even be left over for other things.

So the new year already brings the first changes for me. Further ones lie in the realm of resolutions.

Until next time,
Markus

by Markus Kohm at Tuesday, 2020-01-07 10:08

2019-12-30

Thomas Schramm

Football half-year calendar 2020, January to June, with ics file

The half-year calendar for January to June 2020, A4 landscape, with the fixtures of the 1. Bundesliga, Champions League, Europa League, international matches and the DFB-Pokal, as a PDF with the accompanying LaTeX source. In addition, the ics calendar file for importing into calendar programs, with all fixtures of the 1. Bundesliga, Champions League, Europa League, cup and international matches. For practical reasons the TeX source also includes the fixtures for July to […]

by Thomas Schramm at Monday, 2019-12-30 18:40

2019-12-15

TUG

G21C conference: Grapholinguistics in the 21st Century

A biennial conference bringing together disciplines concerned with grapholinguistics and, more generally, the study of writing systems and their representation in written communication. Submission deadline: January 13, 2020.

Sunday, 2019-12-15 23:08

2019-12-11

Typeroom

Print is back! The renowned culture brand to be relaunched in 2020

This December, Print magazine, an online authority on graphic, interactive and brand design and their influences on visual culture, announces new ownership and its continued presence in the graphic arts industry.

Print began publishing in 1940 as Print, A Quarterly Journal of the Graphic Arts, led by William Edwin Rudge to demonstrate, in his words, the far-reaching importance of the graphic arts.

Through the decades, Print became an iconic design and visual culture brand, serving as the go-to industry resource for design dialogue and inspiration, design education, profiles of leading design minds and everything in between, via a top-ranking website and social media platforms, The Daily Heller column, and one of the most well-respected design competitions in the industry, the Print Regional Design Awards. It has been the leading authority on all things design to this day.

The Print brand and Printmag.com were acquired by Print Holdings LLC, comprised of industry veterans Debbie Millman (Design Matters), Steven Heller (The Daily Heller), Andrew Gibbs (Dieline), Jessica Deseo (Dieline) and Laura Des Enfants (D’NA Company).

Together, this team is committed to offering the design industry a hub for inspiration, education, and community, while remaining committed to Print’s original mission of commenting on, critiquing, discussing and documenting the work, thinking and business of design. 

The site will continue to publish on its legacy platform, including up-to-date Daily Heller columns, until 2020, when a new format will launch under the vision and guidance of this ownership team.

Relaunched in 2020, Print aims to continue building a dialogue about design by detailing the intersections of art and culture. “Rather than focusing on the how-to of design, Print’s content covers the why: why the world of design looks the way it does, how it has evolved, and why the way it looks matters. While Print will continue to identify and analyze important trends, history and insights from thought leaders in the industry, the online magazine will begin to include an expanded 21st-century view of design and its contribution to our world,” notes the team.

“While we are thrilled to rescue this piece of design history,” said Debbie Millman, “we are even more excited about the opportunity to reinvent Printmag.com as a reflection of what designers are interested in reading today.” 

“The magazine we all call Print has had a half dozen different names since its inception in June of 1940” writes J. J. Sedelmaier. “It was originally a limited-edition periodical that discussed the endless techniques used in the graphic arts industry, and even included original prints and tipped-in features within. From its first edition up to Volume VII, Number 6, in March 1953, it was a 7 1/4-by-10-inch journal-sized publication.”

“William Edwin Rudge was the publication’s original publisher and managing editor. He was the third generation of a family of printers (all named William Edwin Rudge), and he worked out of his publishing business office in New Haven, Connecticut. From the first issue in 1940 through Volume VI, Number 4, the publication was called Print: A Quarterly Journal Of The Graphic Arts. Beginning with the following edition, Volume VII, Number 1, it changed its name to Print after combining its previous title with another magazine called The Print Collector’s Quarterly. Rudge continued as the publisher and managing editor, but the publishing office had moved to Burlington, Vermont, with the editorial offices in Hartsdale, New York. By the spring of 1953, Rudge was the president/editor, and the publishing and editorial offices had moved to 17 West 44th Street in Manhattan.”

Volume 1, Number 1. Cover by Howard Trafton.

An editorial introduction to the journal’s mission (with an interesting mention of television in the footnote)

Explore more of Print's legacy here and reengage with Print here

by loukas at Wednesday, 2019-12-11 14:01

2019-12-10

Typeroom

Social Poster Exhibition: time for graphic design to teach us kindness

Promoting tolerance among nations through the art of the poster is a mission Typeroom approves, and Gabriel Benderski is an artivist whose portfolio speaks of the importance of human rights in times of need.

Born in Montevideo, Uruguay, into a Jewish home in 1988, Benderski started his formal training by studying graphic design at ORT University, where he obtained a BA. Prior to going freelance, he worked for five years in design studios, where he discovered his enthusiasm for editorial and brand design; his visual identity of the Campeón del Siglo FIFA Stadium for Club Atlético Peñarol is one of the projects he highlights.

Yet it is not just his talent but his agenda that makes Benderski a creative to follow.

“Recently, I started to focus on the social aspect of graphic design and, as a result, the Social Poster Exhibition is part of The Uruguayan Plan for Human Rights Education and was declared of Educational Interest by the Uruguayan Ministry of Education and Culture,” he notes of his latest exhibition.

From New York through Warsaw, Toronto, Ottawa and four cities around Uruguay, Benderski demonstrates how graphic design can serve as a tool that promotes tolerance.

While the posters are being displayed in embassies and consulates around the world, this man on a mission “analyzes the background of poster design and its social function.”

“The aim is to transmit to the participants an introduction to graphic design that allows them to use it as a tool to promote tolerance” he adds. 

In case you are interested in holding the exhibition, get in touch with Benderski here.

“Thanks to the hand of Max Phillips (Signal Type) an old sketch I made a few years ago is now finished. Infinite Union is to understand that we are all together in this world”

“I was invited by Mirko Ilić to participate in The Tolerance Show, a traveling exhibition that brings together artists from around the world to create posters of Tolerance. To tolerate is to behave as one”

by loukas at Tuesday, 2019-12-10 11:10

2019-12-06

Typeroom

Graphic Design Festival Scotland 2019: Lamm & Kirch & more winning posters to inspire

Since launching in 2014, Graphic Design Festival Scotland has received more than 38,000 poster submissions from more than 100 countries for its International Poster Competition, from established agencies and budding designers alike, from all corners of the earth.

This year’s International Poster Competition winners are inspiring, as their designs “alter perceptions or ways of thinking, offer creative solutions to problems, contribute to current affairs, open dialogues for debate, provoke discussions and make innovative use of media or medium,” notes GDFS.

The 2019 competition received 8,567 submissions from 86 countries, with Lamm & Kirch, Mark Bohle & Nam Huynh and Formes Vives winning big.

You can visit the International Poster Exhibition at an all-new space dedicated to graphic design, the Olympia Gallery, where the exhibition opened on November 22nd, 2019.

The following are the top three winning posters, with the jury’s comments as noted on GDFS.

Ruhr Ding: Territorien and Tony Cokes – Mixing Plant by Lamm & Kirch, 3rd place

“The bold factual typography combined with soft, fading colors and blended imagery melts together a unique fusion of industrial communication and creative expression”

The posters were commissioned by Urbane Kuenst Ruhr, an institution for contemporary art, to promote two exhibitions: one, titled Territories, explores territorial definitions and the territorial aspects of establishing an identity; the other presents the work of artist Tony Cokes, which “reflects on capitalism, subjective perceptions, knowledge transfer and (visual) stimuli”. The posters are designed within a wider identity as part of Lamm & Kirch’s ongoing work with the institution.

Aiming to reflect the “unstable” identity of the institution which operates in the “decentralised" post-industrial area of Ruhr in West Germany, Lamm & Kirch utilise “a tool kit” of “variable components”; photography, abstract imagery, artist material, formless shapes, graphic symbols and a custom variable font built with Dinamo Typefaces. The elements come together in seemingly haphazard compositions; images layer over each other, lines and shapes intersect and information appears sporadically dotted around in perceptively chaotic compositions.

“The posters are successful on many levels; effective as stand-alone pieces of work, perform seamlessly as part of a wider identity, visually exciting at first glance with various levels of interest up to small details within the typography, an exciting variety of visual elements, all of the elements connect and disconnect with one another in an interesting way, the posters strongly reflect the concept and approach of the institution, exceptionally high quality of finish, typography is solid and the aesthetic is original” notes the jury of this year's competition. 

The Big Bang by Mark Bohle and Nam Huynh, 2nd place 

The poster was commissioned by community-focused student space ODAS to promote a doors-open day which offered the opportunity to visit the private studios of various creatives.

Through the poster, creative spaces were compared to children’s bedrooms, places “of imagination and pure freedom”. Continuing the idea of child-like imagination, inspiration was drawn from a Joan Miro illustration for the central motif – a star. The area around the star is embellished with a scattering of pop-culture icons, characters, and symbols – comparable to children plastering stickers all over belongings as a way of personalizing or customizing.

“Posters do not always have to be direct or immediate. A successful poster may not be straight forward and could be open to interpretation”

Custom 3D typography is paired with a geometric sans at the bottom of the poster to communicate the title and event information.

“The poster is instantly memorable; strange, playful and engaging. The concept comparing children’s bedrooms to creative studios is original and communicated in a fascinating way. The “stickers”, Joan Miro-inspired 3D-rendered toothpaste, thick blobby custom lettering and light grotesk geometric typeface at the bottom are an unforgettable combination of elements, assembled in a tactile, unexpected way. The poster is highly contemporary but well-grounded and formed. Very enjoyable to look at,” comment the jury members on the winning project.

Extra Ball and Golden Ball by Formes Vives!, 1st place

The posters were commissioned by art/craft/DIY design publisher Ultra for public display next to their offices in Brittany, France, and Formes Vives were offered complete creative control.

Formes Vives aimed to “support” the Gilet Jaune, a populist political movement for economic justice started in France during 2018, and “talk about police repression”. An illustrated pinball machine is combined with typographic cues relating to the “Gilet Jaune” in a tactile frenzy of colors, lines, and forms. Some of the scrawled cues on the posters include “gasoline”, “fire”, “capital gains”, “tax-free”, “power” and “play”. The posters do not explicitly comment, but perhaps compare the actions between the Gilet Jaune and the police on the streets of Paris to a game of fatal pinball.

The posters were created using hand-drawn illustrations, layered digital collage and printed digitally before being displayed publicly in Brittany.

“Posters which engage with current affairs or on-going global topics are important, particularly within graphic design, where designers have such a strong position to communicate ideas and educate those around them. The posters are totally original, visually engaging, spark curiosity and contribute to on-going political discussion,” note this year’s jury members, who included The Rodina from the Czech Republic, Spassky Fischer from France and Warriors Studio from the United Kingdom.

“Formes Vives’ approach feels human, organic and personal, which implies a level of authenticity and expressive emotion that strengthens the message and the connection between the poster and the subject. Posters do not always have to be direct or immediate. A successful poster may not be straightforward and could be open to interpretation. This may be atypical of how we expect “good” design to behave but hopefully, through the competition, we can open up new ways of thinking and demonstrate that immediacy and directness are not always an indicator of quality.”

“The posters hit the sweet spot in many ways: they contribute to on-going political discussions, are visually striking, the use of colors and forms in the composition is strong, the tactility of the work is quite unique. The approach of Formes Vives is inimitable with real emotion and craft. The posters demonstrate that the artistic aspect and the craft of poster design are alive and well.”

All 252 selected posters are featured in the International Poster Book edition. The limited publication of 1,000 copies hits all the right notes, so do make sure to grab your own copy here.

by loukas at Friday, 2019-12-06 16:22

2019-12-05

Typeroom

From the New York Times to Pentagram sky is the limit for multi-awarded Matt Willey

“I was born with severe hearing loss. My parents were told I would never be able to speak properly or go to a normal school. That, more than anything else, shaped my childhood,” says Matt Willey, arguably one of the most influential art directors in the industry and an official Pentagram partner, as announced earlier this week, in his latest interview with Creative Review.

Born in Bristol, Willey has managed to become the talk of the city that never sleeps as the art director of The New York Times Magazine, a position he has held since 2014, before becoming Pentagram’s latest MVP.

Willey studied graphic design at Central St. Martins in London, graduating in 1997. In 2002 he joined Vince Frost at Frost Design London, ultimately rising to the position of creative director, and three years later he co-founded the London-based firm Studio 8 Design with Zoë Bather. After the company’s closure in 2012, Willey relocated to New York for more adventures in type and design.

“In his editorial design, Willey combines strong typography and photography to create powerful settings for the content at hand,” notes Pentagram.

Currently the creative director and senior editor of Port, the gentlemen’s quarterly he co-founded with editor Dan Crowe, Willey has art directed the arts magazine Elephant, the travel and culture magazine Avaunt (which he co-founded) and in 2013 he completed the game-changing redesign of UK newspaper The Independent.

A magazine lover since his early days in the industry, Willey felt “for a long time quite ambivalent about the idea of being a graphic designer... I maybe gravitated towards magazines because they offered proximity to other worlds – to writers and editors, to art and photography and film, and illustration, and music and so on,” says Willey to Crowe for WePresent.

“Design is almost always – or should almost always be – just a small part of something else. Usually something bigger and more interesting. I don’t think design is much in and of itself; it’s only interesting when it’s in service to something else. But I felt comfortable with magazines, they made some sort of sense to me” he adds.

Feeling like “a small cog in a big and incredibly impressive machine” by the name of The New York Times, Willey has just swapped one of the industry’s most coveted jobs for another.

A type designer as well, Willey creates custom typefaces, such as AType, BWord, NewPort and NSW01 to name a few, for the specific context of the brands, publications, stories and other projects in which they appear.

Willey, obviously a creative with a kind agenda, has made several of the fonts commercially available and proceeds from select fonts are donated to charity. “His MFred and TIMMONS typefaces have helped raise more than £70,000 for Cancer Research UK and Macmillan Cancer Support through the Buy Fonts Save Lives initiative, while all money made from the sale of Blakey Slab goes to the American Civil Liberties Union (ACLU)” notes Pentagram. 

Named Designer of the Year by Creative Review in 2014, Willey has seen his work recognized with numerous awards. “During his five-year tenure at The New York Times Magazine, the publication was honored with SPD Magazine of the Year in 2016 (and Finalist for 2017-2019) and SPD Brand of the Year in 2019, and more than 40 gold medals; five D&AD yellow pencils; more than 20 Gold and Silver cubes from the Art Directors Club; and awards from the Creative Review Annual and the Type Directors Club. Avaunt was a finalist for SPD Magazine of the Year and received a D&AD yellow pencil, and Port won SPD Independent Magazine of the Year in 2019.”

Willey’s titles for Phoebe Waller-Bridge’s spy series Killing Eve were nominated for a BAFTA Award and received a Royal Television Society Craft Award; clearly the sky is the limit for this son of a poet, the latest creative to join Pentagram’s New York office.

Explore more of Matt Willey's instinctively visual brilliance here

All images via Matt Willey

by loukas at Thursday, 2019-12-05 15:48

PlayStation turns 25: the good, the bad and the glorious logos revealed

On December 3rd, 1994, history happened with a little help from SONY. On that day the PlayStation launched in Japan, and the company entered the video game industry, changing its core almost overnight.

Twenty-five years later the PlayStation saga continues, and SONY’s influential product is undoubtedly one of the biggest modern entertainment franchises in the world.

The PlayStation’s impact on our culture is evident, from iconic games such as Final Fantasy to its sartorial influence per The Verge, and that little gray box which started it all came with a logo not to be forgotten.

Conceptualized in 1994 by Manabu Sakamoto, the Japanese designer who had also worked on other SONY logos such as VAIO, the PlayStation logo was not an easy task to accomplish.

The final logo, which comprises the letter ‘P’ standing over the letter ‘S’, was settled on only after numerous concepts had been put forward for selection, as revealed by a compilation of rejected PlayStation logo iterations that recently surfaced on Reddit.

All of the concepts were created by Sakamoto himself who played with variations in color and typefaces in search of the perfect emblem. 

Shortened to just two letterforms, the logo, strong as the console itself, carried the product to dominance and remains one of the most recognized logos of our times.

Sakamoto also designed the custom typeface used in the PS logo; the font, simple and clean, is part of SONY’s global success story, launched a quarter of a century ago.

The PlayStation (officially abbreviated as PS and commonly known as the PS1, or by its codename PSX) was first released on 3 December 1994 in Japan, on 9 September 1995 in North America, on 29 September 1995 in Europe, and on 15 November 1995 in Australia, and was the first of the PlayStation lineup of video game consoles.

Photo by Hello I'm Nik on Unsplash

“Starting from a humble beginning as an upstart within SONY, Ken Kutaragi and team delivered on a vision to elevate video games as a form of entertainment that everyone could enjoy, and to make a platform for game developers to express their creativity,” writes Jim Ryan, President & CEO, SIE. 

The original PlayStation sold 100,000 units in Japan on its first day and went on to become the first-ever home console to surpass 100 million units sold globally.

On 19 September 2018, Sony unveiled the PlayStation Classic to mark the 24th anniversary of the original console. This miniature recreation of the original console was released on 3 December 2018, the exact date the console was released in Japan in 1994.

SONY has already confirmed that the PlayStation 5 will launch during the 2020 holiday season because what started with a bang will go volcanic.

Playstation wall with hundreds of signatures at Gamescom 2019, Cologne, Germany, photo by Andreas Heimann on Unsplash

by loukas at Thursday, 2019-12-05 13:36

2019-12-04

Typeroom

Remembering Gerard Unger: thanks to Ashler Design every type design of the legend is back online

From Markeur, his first professional type design, created for Joh. Enschedé & Zonen back in 1972, to Sanserata, his latest type design from 2016, Gerard Unger’s legacy is alive and kicking in full force again online.

Spain-based Ashler Design, a collaboration between two people, Elena Ramírez, web designer and UX expert, and Octavio Pardo, graphic and type designer, decided to pay a very welcome tribute to Unger, resurrecting his website from oblivion.

“Gerard Unger died on 23 November 2018. His personal website went down a few months later, which I found out while talking about him in a type design course,” explains Ashler of Remembering Gerard Unger.

The website is up and running, almost exactly-ish the way it was. As Ashler puts it, “almost like he was still among us”.

“It was a personal shock, like a sudden realization that he was really gone. Using web archive, we have brought his website back to life as a small homage to a professor that not only inspired several generations of type designers but was also an outstanding human being. Gerry Leonidas describes him beautifully here.”

“To be precise, this is not exactly his original website. We have removed a subtle but charming animation he created in his logo because it was using Flash technology. We also added another slot in his type design collection because Sanserata, his latest release, was missing from the collection. We thought it would be great to have it there as well. The rest is exactly-ish the way it was. Almost like he was still among us.”

A perfect gift to all by Ashler aka one grateful student of a type design legend is live here.

Unger on Markeur, 1972

“My first professional type design was Markeur, for Joh. Enschedé & Zonen, Haarlem. By 1970 Enschedé’s last punchcutter, Henk Drost, had few opportunities to practice his original craft, so the type foundry had looked for a new sort of work: signage, with letters cut into laminated plastic sheets. For this purpose, Sem Hartz had designed Panture (1971), a series of seriffed capitals.”

“Markeur was designed as a sans serif alternative. Usually engraved lettering of this kind is the same thickness all over, like din letters. Drost pointed out that if he wanted to produce perfect letters he had to go through the groove twice anyway.”

“If these two tracks were made not to overlap each other exactly but were slightly offset, it was possible to create letters with different thicknesses. The rounded corners are the result of using rotating bits.”

Unger on Sanserata, 2016

“Sanserata's originality does not overtly present itself at text sizes. Rather, at those sizes, it draws upon its enormous x-height, short extenders, and articulated terminals to improve readability, especially on screens.”

“Having articulated terminals means characters flare as they near their end, but readers likely won’t notice. What they would notice is that their ability to take in more content in a line of text is improved because the letter shapes are more defined. Articulation also makes clearer text from digital sources, where rectangular endings tend to get rounded by the emission of light from the screen.”

“Lately there seems a whispered discontent with the lack of progress in the sans serif category. Designs can either stretch too far beyond what is accepted or be too bland to be considered new. Sanserata’s strength is in being vivid and unique without being off-putting. This bodes well for designers of paragraphs and of branding schemes since, with Sanserata’s two flavors, it is well able to capture attention or simply set the tone.”

“Sanserata’s first voice is a generous, friendly, and even cheerful sans serif. But when using the alternate letterforms its voice becomes more businesslike, though still with nice curves, generous proportions, and a pleasant character.”

by loukas at Wednesday, 2019-12-04 13:10

2019-12-03

Typeroom

FENTY for the win: how Commission Studio branded Rihanna’s urban luxe label

After just six months in business, Rihanna's historic luxury fashion house FENTY has been presented with the Urban Luxe award by the British Fashion Council at this year’s Fashion Awards 2019.

“Contextualising Rihanna isn’t easy – which meant the role of creating the branding for her first fashion house had to be handled by the experts” writes Leanne Cloudsdale. Founded by creative directors David McFarline and Christopher Moorby, Commission Studio’s heavyweight portfolio of luxury clients was enough proof for the synergy to happen.

“The new wordmark was drawn from scratch. The distinctive letter ‘F’ with the strokes overlapping references Rihanna’s own handwriting and the reverse ‘N’ is a legacy nod to the Fenty Beauty logo” said McFarline describing the inspiration behind the design of an entirely new wordmark for the brand. 

“We used a special adaptation of the Grilli typeface GT America Compressed Light for the brand typeface with the ‘N’s reversed as part of the standard character set. Everything is clean and modern – specifically designed to work well on small scale devices such as phones and tablets. Across the packaging, the logos were applied with gold foil and a 3D sculpted diamond emboss to give that luxe feel” he added.

“Typically, monograms have always been an integral communication tool for luxury brands, and Commission Studio wanted to develop a contemporary version that could be used throughout the Fenty collection. ‘The Maze’ logo was a fresh interpretation of this and includes every letter of the brand name. At first glance, it could be a QR code, electronic circuitry, a Chinese character, or a Greek key – what makes ‘The Maze’ unique is how it manages to be modern and familiar at the same time.”

Alongside the visual identity, the London-based design and branding consultancy Commission created the complete packaging suite and garment branding for Robyn Rihanna Fenty’s award-winning fashion house.

Explore more of FENTY’s graphic branding, reflective of the “artistic director's complex style, creative vision, and progressive attitude across many digital and physical applications” here.

All photos via Commission Studio @ Luke Evans and Rihanna's official twitter account

by loukas at Tuesday, 2019-12-03 14:18

2019-12-02

Typeroom

Nouvelle type: Jean Fouchet’s title design legacy will make you weep

Long before La La Land, there was a French anti-musical which launched Catherine Deneuve’s iconic career and made filmgoers weep, for love does not conquer all in Jacques Demy’s movie musical Les parapluies de Cherbourg (The Umbrellas of Cherbourg).

Set in northwest France in the late ’50s, the drama of Geneviève, a lovesick 17-year-old, and Guy, played by Deneuve and the equally charming Nino Castelnuovo, broke boundaries with its aesthetics.

Per Nouvelle Vague pioneer Demy, his Oscar-nominated and Palme d’Or-winning masterpiece “used color like a singing Matisse”, and thanks to restorations pushed forward by Demy’s widow Agnès Varda, The Umbrellas of Cherbourg’s enchanting palette is today just as vivid 55 years after its original release.

The title design of this oh-so-melancholic-it-hurts musical by French New Wave director Jacques Demy, soon to be re-released in the UK and set to Michel Legrand’s poetic and haunting melodies, is a fine example of Jean Fouchet’s typography and talent.

The opening credits, with their bird’s-eye view of pastel bikes and titular umbrellas on a rainy day in Cherbourg, are frequently acclaimed as some of the best of all time. “Brilliantly, title designer Jean Fouchet uses the cobbles of the street as a grid that guides the typography,” notes Yale Daily News of Fouchet’s pop type. Deneuve, Demy, Legrand and Fouchet were reunited in Les Demoiselles de Rochefort, another movie which displays Fouchet’s talent for introducing us to a story through typography and design.

Fouchet was a film title designer born in France in 1918. One of France’s most well-known title designers during the 1960s and ’70s, Fouchet began his career in film as a set designer, decorator, and assistant operator.

In 1950 he was hired as a designer of special effects and credits for the Lax company, and soon he founded his own company, F.L., with which he created title sequences, trailers, and special effects, per Art of the Title.

Fouchet designed special effects for films such as The Longest Day (1962), The Train (1964), and Up To His Ears (1965), and his title sequence credits include Last Year at Marienbad (1961), Phaedra (1962), The Umbrellas of Cherbourg (1964), Topkapi (1964), Les Demoiselles de Rochefort (1967) and more.

What follows is a tribute to Fouchet’s title design legacy before brand new tears flood the movie theatres again.

The Umbrellas of Cherbourg is released in the UK on 6 December. 

Slider image captions: Phaedra by Jules Dassin (1962), Le Président by Henri Verneuil (1961). All images via Annyas.

by loukas at Monday, 2019-12-02 12:40

2019-12-01

Typeroom

World AIDS Day: brutal and humane, the US national AIDS crisis in posters

In 1981, a new disease appeared in the United States. As it spread, fear and confusion pervaded the country. The infectious “rare cancer” bewildered researchers and bred suspicion, but the worry was not the same for everyone. Many feared contact with those who were ill. Others, particularly but not exclusively gay men, feared for their lives and the lives of loved ones.

Reactions to the disease, soon named AIDS (acquired immune deficiency syndrome), were as varied as the uncertainties about it. Early responders cared for the sick, fought homophobia, and promoted new practices to keep people healthy. Scientists and public health officials struggled to understand the disease and how it spread. Politicians remained largely silent until the epidemic became too big to ignore. Activists demanded that people with AIDS be part of the solution.

Taking its title from “Surviving and Thriving,” the book written in 1987 by and for people with AIDS that insisted people could live with AIDS, not just die from it, “Surviving and Thriving: AIDS, Politics and Culture” is a traveling exhibit and online adaptation curated by the National Library of Medicine that explores the rise of AIDS in the early 1980s, as well as the medical and social responses to the disease since.

“Posters, comic books, and postcards focused on AIDS are as old as the disease itself,” note the exhibition’s curators. “Representing some of the best forms of public health outreach, this ephemera contains images and slogans that communicate a range of ideas about AIDS—from how to prevent it from spreading, to how to care for people with AIDS, to how to talk to children about the disease.”

“Inexpensively printed and distributed, these colorful materials performed a great deal of social and political work over the course of the 1980s and 1990s. Whether originally wheat-pasted on bus stops or the sides of buildings, hung in municipal office spaces, or placed in the waiting rooms of doctors’ offices, these documents provide powerful historical evidence about how people did and did not confront HIV/AIDS.”

Safe sex is hot sex, 1991. Developed in 1990 by the Red Hot Organization, a leading international initiative dedicated to fighting AIDS through popular culture, this campaign featured diverse people locked in intimate poses. The posters combined text and visuals to normalize and eroticize safe sex. Both provocative and instructional, the carefully positioned subjects aimed to encourage viewers to change their sexual practices. The voyeuristic presentation worked in conjunction with the message: sex can be enjoyable and safe for all couples regardless of sexual orientation.

Stop worrying about how you won’t get AIDS and worry about how you can, 1980s. Misinformation ran rampant through much of the 1980s as people struggled to interpret the new information about risk behaviors and safe practices. By providing the beginnings of a list of safe practices, this poster from America Responds to AIDS helped spread accurate information and dispel some of the stigma and fear associated with people with AIDS.

“Please Be Safe” by the Northwest AIDS Foundation. In 1987, with funding from the U.S. Conference of Mayors, the Seattle-based Northwest AIDS Foundation launched the “Please Be Safe” campaign to help gay and bisexual men reimagine their sexual behaviors. Using a different creative visual strategy than the sexually charged imagery of some contemporaneous public health efforts, this campaign used road signs—a straightforward, familiar set of symbols—to discuss and advertise sexual safety. The “Please be Safe” or “Rules of the Road” campaign used road signs and compelling, straightforward, community-specific language to help gay men engage in safer sex. The campaign sought to establish these practices as the new norm for all. The “Sexual Safety Card” featured on many of the posters provided quick and accessible information on activities at every level of safety.

The more you score, the greater your chances of losing the game, 1980s. This fear-mongering poster offered a simple solution for preventing AIDS: just say no. Advice like this ignored the complexity of human behavior and, as such, missed the opportunity to educate people on realistic, alternative strategies.

Some people think you can catch AIDS from a glass, you can’t, 1980s by the California Medical Association. Evidence that AIDS was spread by the exchange of “bodily fluids” provided many opportunities for misunderstanding. Even after researchers proved that saliva could not transmit AIDS, the fear of used drinking glasses, shared eating utensils and kissing persisted well into the 1980s.

Perform a death-defying act, 1987 by the Oregon Health Division. This huge headline communicated a simple solution. By depicting condoms as a common, easy-to-use solution, this poster from the Cascade AIDS Project in Portland made protection approachable.

AIDS is a white man’s disease, famous last words, 1980s by People of Color Against AIDS. Directed to the black community, this poster used straightforward language to debunk the all-too-widespread idea that AIDS was a gay, white disease.

You Won’t Get AIDS From. With misinformation too readily available, the You Won’t Get AIDS From campaign from America Responds to AIDS attempted to reach a wide audience with neutral images and straightforward information on ways people could not “catch” AIDS.

From the “Safe Sex is Hot Sex” campaign, dubbed too hot to handle during the Reagan era, to the “America Responds to AIDS” campaign, which promoted the “everyone is at risk” message of AIDS prevention, generalizing the epidemic beyond the core targeted communities which suffered the most, to the fear-mongering, ominous posters which declared in big bold type the danger of “sleeping around” whilst providing little, if any, information on how to prevent the spread of the epidemic, the exhibition presents the US national AIDS crisis in posters and other visual elements.

Explore more here

Slider image captions: ACT UP is watching: photo from one of thousands of demonstrations nationwide, reminding officials that activists were watching. ACT UP (The AIDS Coalition to Unleash Power) was founded in March 1987. As of 2012, ACT UP chapters in nearly every major city continued to champion rights for people with AIDS. Ignorance = fear, silence = death, 1989: in the 1980s and ’90s, artist Keith Haring’s widely recognized figures championed AIDS education and compassion, what was then a new cause. Today, Haring’s foundation continues to support AIDS organizations nationwide, including AIDS Project Los Angeles, the Elizabeth Glaser Pediatric AIDS Foundation, and Gay Men’s Health Crisis. Read my lips, 1988: Gran Fury, an artists’ collective within ACT UP (The AIDS Coalition to Unleash Power), created iconic materials, including postcards, to spread information and promote education about HIV and AIDS. The mantra featured here subverts President George H. W. Bush’s notable quip about no new taxes, delivered during the 1988 Republican National Convention, to call attention to his ambivalent support of AIDS outreach, education, research, and support.

by loukas at Sunday, 2019-12-01 10:56

2019-11-28

Typeroom

2024 Paris Olympic Games: Graphéine's visual identity is better than yours

Almost a month ago, Paris unveiled a gold-medal-shaped emblem for the 2024 Olympic and Paralympic Games. According to the official press release, “the emblem embraces the shape and color of the most beautiful medal of all to express one of the core values of sport: striving for excellence. That same commitment also informs every step that Paris 2024 is taking in organizing the Olympic and Paralympic Games Paris 2024, so that it can fulfill the pledges it has made to stage a different, grounded, sustainable and inclusive Games.”

Eventually, the Twitter-verse had its own take on the Paris 2024 Olympic logo, which was mercilessly mocked, even though the gold design (apart from representing gold medals), the variable custom-designed typeface and the Art Deco styling are a nod to the last time Paris hosted the games, in 1924.

"Its pure, understated lines and its original typeface take their inspiration from Art Deco, the first complete artistic movement, which reached its height at the 1924 Games in Paris," says a statement announcing the logo which per Creative Bloq's Rosie Hilder “looks like either a) the Tinder app logo, aka a sexy flame b) a condom packet or c) a hair salon logo. There's nothing about it that makes us want to jump off the sofa/out of bed/away from our screen to do the javelin or the long jump or pursue some other Olympic dream.” 

After all the buzz, another visual identity emerged, with rave reviews from many, even though it was not selected. Graphéine’s team studied the question of the visual identity of the 2024 Paris Olympic Games, and its answer is presented on the agency’s blog.

“Imagine an impossible timetable (only 3 weeks to send a response....) and all the conditions that we usually denounce (no financial compensation for submitting a project). Nevertheless... it’s the Olympic Games! What’s more, in Paris, our capital city! So we offered a creative sprint to our teams: 48 hours to find an idea. Here is the project we have collectively imagined,” writes Graphéine.

“For a branding project to be successful, the logo must be based on an idea that is sufficiently ‘fertile’ to inspire the multitude of necessary variations. Indeed, it is in the polysemy of a sign that the richness of the declinations will be able to sprout. Thus some signs can easily be imagined in movement, volume, space... while others will seem ‘static and uninspiring’. In our opinion, this visual polysemy is a guarantee of sustainability. Indeed, the strong repetition of a sign inevitably causes a form of habit, even weariness. A sign that has several levels of reading will be much more resistant to time.”

“Baudelaire used to define modernity as the meeting of fashion (“mode” in French) and eternity. It is precisely in the conjunction of these two temporal notions that a logo must be found. On the one hand, it must be in tune with the times to make itself desirable in the eyes of its contemporaries. On the other hand, it is destined to go down in the annals of sport and of Paris, and it will continue to live for many years after the end of the Games,” notes the agency of its proposal.

Building the logo around the concept of an Eiffel Tower sports track, the proposal is sporty, with “fluid curves, fast lines. A sports track emerges before our eyes, telling the story of movement and speed.”

Solidly placed on the ground, a 3D Eiffel Tower emerges, and “the magic of this sign comes from its possible double reading. In turn sports track and Eiffel Tower. The eye seems not to want to choose.”

This is a logo “that tells a story of encounter and convergence. It is a profoundly humanistic and optimistic sign that offers a perspective of collective progress.” 

The agency’s open invitation to the world, with five colors for the five continents, echoes “the IOC motto with the symbol of the Eiffel Tower. It affirms the audacity to go further, thus illustrating the idea of surpassing oneself.”

“The two logos are from the same shape” notes the agency of its two logos for a shared vision.

”A simple change of angle of view makes it possible to switch from the logo of the Olympic Games to that of the Paralympic Games. This symbolic rapprochement between the two logos is a strong gesture that can act as a bridge between the world of able-bodied athletes and that of athletes with disabilities.”

“This project was not selected... But, as Pierre de Coubertin said, the most important thing is to participate!”

Check more here

All images via Graphéine

by loukas at Thursday, 2019-11-28 10:55

LaTeX Project

First prerelease of LaTeX 2020-02-02 is available for testing

The first LaTeX prerelease for 2020-02-02 is available for testing

A few days ago we submitted a new LaTeX development format to CTAN, and by now it should be available to all users of MiKTeX or TeX Live (on any operating system).

This format allows you to test the upcoming LaTeX release, scheduled for 2020-02-02, with your documents or packages. Such testing is particularly important for package maintainers, to verify that changes to the LaTeX core haven’t introduced incompatibilities with existing code. We try to identify any such problems beforehand, but such an undertaking is necessarily incomplete, which is why we ask for user testing.

Besides developers, we also ask ordinary users to try out the new release candidate, because the more people test the new format, the higher the chances that any hidden problems are identified before the final release in February hits the streets.

Processing your documents with the prerelease is straightforward. All you have to do is replace the invocation command, appending -dev to the executable; e.g., on the command line you would run

pdflatex-dev myfile    or    lualatex-dev myfile    or    xelatex-dev myfile

instead of using pdflatex, lualatex or xelatex. If you use an integrated editing environment, how to configure it to use an alternative format depends on the system; but in any case the necessary modification should be straightforward.

Main features of 2020-02-02 prerelease 2

For technical reasons this first prerelease is labeled “prerelease 2”. Prerelease 1 existed only for a few hours on CTAN, as it was missing some files due to a problem in the build process, but because it was already on CTAN we had to increase the prerelease number.

A new LuaTeX engine coming up …

In TeX Live 2020 the LuaLaTeX format will always use the new LuaHBTeX engine, which is LuaTeX with an embedded HarfBuzz library. HarfBuzz can be used by setting a suitable renderer in the font declaration; an interface for that is provided by the fontspec package. This additional font renderer will greatly improve the shaping of various scripts, which are currently handled correctly only by XeTeX.
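
As a rough illustration, a document along the following lines opts in to HarfBuzz shaping. This is a minimal sketch: the Renderer option and its Harfbuzz value follow the fontspec documentation, while FreeSerif is merely an example font that would need to be installed.

\documentclass{article}
\usepackage{fontspec}
% Request HarfBuzz shaping for the main font; this is honored only by
% a HarfBuzz-enabled engine such as LuaHBTeX (e.g., via lualatex-dev).
\setmainfont{FreeSerif}[Renderer=Harfbuzz]
\begin{document}
Text in scripts with complex shaping should now render correctly.
\end{document}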

To simplify testing of the new engine, the necessary executables have been added to MiKTeX and TeX Live 2019, and both have changed the LuaLaTeX-dev format to use it. This means you can test the new engine right now by using the prerelease!

Even if you have no need for the new HarfBuzz functionality, it might be worthwhile running some tests, because from 2020 onwards this will be the only LuaTeX engine for which a LaTeX format is distributed in TeX Live and MiKTeX.

Improved load-times for expl3

The LaTeX3 programming layer, expl3, has over the past decade moved from being largely experimental to broadly stable. It is now used in a significant number of third-party packages, either explicitly or implicitly (e.g., by loading xparse to define user command interfaces), so many documents load the expl3 code at some point in the preamble. Most LaTeX documents compiled with XeTeX or LuaTeX load fontspec, which is written using expl3, so with these engines it is nearly always used.

The expl3 layer contains a non-trivial number of macros, and when used with the XeTeX and LuaTeX engines it loads a large body of Unicode data. This means that even on a fast computer, there is a relatively large load-time penalty whenever expl3 is needed.

For this release, the team have made adjustments in the LaTeX 2e kernel to pre-load a significant portion of expl3 when the format is built. This is transparent at the user level, other than the significant decrease in document processing time: there will be no “pause” for loading Unicode data files. Loading of expl3 in documents and packages can be done as usual; it will, at some future time, become possible to omit

\RequirePackage{expl3}

entirely, but, to support older formats, this explicit loading is at present still recommended.
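
For readers unfamiliar with expl3, here is a minimal self-contained sketch of the kind of code whose supporting machinery is now pre-loaded in the format (the \DemoEval wrapper is a made-up name used purely for illustration):

\documentclass{article}
\usepackage{expl3} % explicit loading, still recommended for older formats
\ExplSyntaxOn
% Wrap expl3's expandable floating-point evaluation (l3fp module)
% in a document-level command.
\cs_new:Npn \demo_eval:n #1 { \fp_eval:n {#1} }
\cs_new_eq:NN \DemoEval \demo_eval:n
\ExplSyntaxOff
\begin{document}
$ 2^{10} = \DemoEval{2^10} $ % typesets 1024
\end{document}

With the pre-loaded layer, the \ExplSyntaxOn programming environment and modules such as l3fp are available without the former load-time pause.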

Please test now — even though there is more to come

We ask you to test now, because (unless we made some mistakes) the above changes should be transparent to the user, apart from speeding up the load process and allowing LuaLaTeX to handle complicated scripts that previously would not work correctly.

We expect to distribute another prerelease, probably in early January, which will most likely contain improvements to the font support for all engines.

The prereleases will never go stale in the future (we hope)

Starting with this prerelease we will make sure that the prerelease in the distributions either matches the main release code or is newer than the current main release. This was not the case in the last few weeks, when the main release was 2019-10-01 with a few hotfixes added, but the -dev format was still from some day prior to October.

In the future we intend to ensure that this does not happen, so that, to avoid confusion, the distributed development release is always newer than or the same as the current main release.

More details on prereleases please …

More details and some background information about these concepts and the process around the development formats are available in a TUGboat article:

The LaTeX release workflow and the LaTeX dev formats

  • Frank Mittelbach
  • TUGboat 40:2, 2019
  • Abstract

    How do you prevent creating banana software (i.e., software that gets ripe at the customer site)? By proper testing! But this is anything but easy.

    The paper will give an overview of the efforts made by the LaTeX Project Team over the years to provide high-quality software and explains the changes that we have made this summer to improve the situation further.


Enjoy — Frank

Thursday, 2019-11-28 00:00

2019-11-26

Typeroom

Typography as raw material: République Studio on their AJAP 2018 visual fest

Simple is not boring, at least not in the portfolio of one of our favorite design studios made in France, aka République Studio, which infused its modernity into AJAP 2018.

“AJAP” (Albums des Jeunes Architectes et Paysagistes) is a biennial competition organized by the French Ministry of Culture. It distinguishes young European architects and landscape designers under 35 who have realized a project or participated in a competition in France.

The exhibition presents the work of the twenty winners constituting the 2018 cohort: fifteen teams of architects and five landscape teams.

République Studio, a team of creatives “often inspired by the zeitgeist”, did all the design, catalogs, and signage for AJAP 2018, which will be exhibited in all the French Institutes around the world for the next couple of years.

Typeroom's Loukas Karnis caught up with the award-winning creative direction and graphic design practice based in Paris to discover how type makes the life of creatives a place of wonders and versatility.

How did you approach the design of AJAP 2018?

As the exhibition is traveling all around the world, we discussed with the scenographer which kind of printed matter would be best to showcase the architects and their works.

It had to be lightweight, easy to build, easy to print; in short, easy to travel. Gaspard came up with the idea of a wooden module, easy to assemble, that could carry two posters on board.

The traveling exhibit is bilingual: French and the native language of the country where it is shown. We figured that the posters would display only big images and the explanatory texts would be shown in small booklets on the board. The visitor could therefore take a booklet along and bring a small part of the exhibit home.

As for the design, we wanted something that would work well with the wooden modules, unique and light. 

Which are the most important elements in AJAP 2018’s visual language?

Type was definitely the most important part. The typeface Past Perfect, used for titles, emphasizes the wooden modules. The letters look like wooden sticks and give the whole project an «architectural and playful vibe».

Plus, it contrasts with Basis Grotesque, used for body text, which is more shapely yet makes it more comfortable for the reader to focus on the content.

Which was the most difficult part of this project?

Keeping things simple without being boring. 

How long have you been working on it?

It took us around three weeks to design all the booklets, the catalog, and the signage, and to lay out the posters.

How important is type in creating a unique voice for the branding of a project?

At République Studio every work begins with a typeface. We think a typeface is a big part of the identity of a project because it can provoke different emotions. That’s why you really have to choose the fonts you use well.

We always want to find the typefaces that will tell the right story, and that speak the same language as the project itself.

A typeface, and the way you use it, has real power over the identity of a brand. A typeface can remind you of a brand, a movie poster, an album, etc. That’s why, when you start working with a new client, it’s important to choose a typeface that doesn’t already have a recognizable history.

The typeface has a real impact on your memory whether you want it or not. That’s why we try not to use the same typeface on many projects.

Would you consider bespoke typefaces the most important factor to express attitude though letterforms?

Typography is not just about choosing a typeface; what is most important is how you use it: where, for which audience, at what size, in which color, etc. If you choose a well-known typeface but use it in a unique and intelligent way, then your message can be just as bold.

What is Republique Studio’s motto to live and work by?

Don’t overuse the same typefaces, because you don’t want the whole world to look the same! Or at least, change the way you are using them.

We don’t want to see the same identity everywhere. Diversity makes our world beautiful!

What are you working on now?

We are working on signage for museums, which is really exciting because we love to see big type in action. Also websites, branding and poster/communication for exhibitions. 

How would you describe the French typographic scene? What's your take on it?

There are many good type designers in France! Too many to name them all. What is certain is that the type game is global today, and you can find good designers all around the world.

We are living in a liberating era where type design has become very accessible and talented people can very easily make their point and live from their art. Typography is our raw material and type designers our best friends :)

All images via République Studio. Set design by Gaspard Pinta / Photos by Julien Lelièvre

by loukas at Tuesday, 2019-11-26 14:51

2019-11-25

Typeroom

It stops now! Design for social good rules in Piquant's anti-violence campaign

“Design can inform, educate and promote social change,” notes Piquant Media of its “It Stops Now” campaign, which aims to end sexual violence and harassment in third-level institutions.

Piquant Media worked with the National Women’s Council of Ireland (NWCI), the country's leading women’s membership organization seeking equality for women, on the branding, video, web design and development of “It Stops Now”, an international campaign which aims to raise awareness of the prevalence of gender-based harassment and violence in third-level education and to engage both staff and students in combatting all forms of harassment and violence against women students throughout Europe.

NWCI led the campaign with project partners in Cyprus, Lithuania, Scotland and Germany. Each project partner worked with higher education institutions, statutory agencies and NGOs.

Piquant Media worked with the team at NWCI to create a brand identity and project content that would resonate with students and staff at third level institutions around Europe.

“This brand had to have the ability to capture people’s attention and provoke a response. The goal of the project is to generate awareness but also connect with the third-level community to show them that they have the power to make the change” notes Piquant. 

“As graphic designers, we should be aware of the responsibility we have in our community as we don’t only design nice and attractive shapes; we are social communicators, no matter what type of client we’re working for,” writes Victoria Brunetta, the studio's graphic designer, of the task of designing for good.

“How we develop and visualize a communication strategy will eventually impact on people’s perception. For better or worse, graphic designers have become key actors in the consumption chain development.”

“I hold the opinion that our mission when visualizing a social movement is also to bring in to question the perceptions most people have around issues such as feminism, homelessness, immigration, poverty, racism, education, environment… In other words, how can we challenge dominating stereotypes that harm the dignity of the most vulnerable and communicate a clear, powerful message that encourages people to take action? Or, eventually, how can that message give the audience a different perspective on community development and social change? In my eyes, ultimately, the challenge when trying to visualize a social movement or any human rights-related issue, is how can I, as a graphic designer, help people to live in a more equal society?”

Check out the It Stops Now project here.

by loukas at Monday, 2019-11-25 13:22

Beazley Designs of the Year 2019: Artificial Intelligence, stencil & more winners

Drum-roll, please! The overall winner of Beazley Designs of the Year is “Anatomy of an AI System” and Artificial Intelligence is under the microscope for this groundbreaking “anatomical case study of the Amazon Echo as an artificial intelligence system made of human labour, data and planetary resources.”

“You will never look at your smart home hub the same way again,” said chair of the judges and Royal College of Art vice-chancellor Paul Thompson of the project. 

“The consensus among climate scientists is that human activity is the root cause of an ongoing planetary crisis. The way in which everyday decisions and the devices we buy can add to this issue is sometimes difficult to comprehend. Taking a consumer’s conversation with Alexa, Amazon’s voice-activated assistant, as its starting point, designers and researchers Kate Crawford and Vladan Joler created a map and essay to represent the impact of the creation, use and disposal of just one of the many Amazon Echo units that have been purchased to date.”

Whilst “Anatomy of an AI System” charts the birth, life and death of a voice assistant and its impact on our planet, Pentagram's Sascha Lobe and his team won the Graphics award at the Beazley Designs of the Year for their clever way-finding system that uses pictograms, designed for Amorepacific, one of the world’s largest cosmetics companies, operating over thirty health, beauty, and personal care brands including Etude House, Innisfree and Laneige.

Sascha Lobe and his team have designed the architectural branding, environmental graphics, and signage for Amorepacific's new corporate headquarters in Seoul, South Korea.

Lobe's team worked closely with David Chipperfield Architects, who designed the building. Reflecting the headquarters’ unique geometry, the team devised a matrix that uses Hangul-inspired pictograms to orientate visitors, notes Pentagram.

“Taking the form of a grid divided into nine sub-squares, level position is communicated through the middle square, with the framing eight squares providing information relative to position, as defined by the ‘corner number’ and ‘geographic location’ hierarchies.”

“English, Chinese and Korean are all used throughout the building and presented a unique challenge for the designers. These systems each possess different complexities, and while Korean and Chinese scripts have some formal relation to one another, it is difficult to achieve a satisfactory integration of Latin characters with a line thickness and font size of similar appearance.”

“Since no typeface existed that would fulfill these criteria completely, Lobe and his team created a Hangul-inspired Latin typeface specifically for the Amorepacific Headquarters. As with the numerals and pictograms, the pared-back, stencilled font is constructed using the Hangul method, resulting in a balanced and consistent visual language throughout the space.”

A total of 76 innovative designs, including clothing, buildings, apps, books, homewares and posters that have made a strong global impact, were nominated for this year's award, which is given by London's Design Museum.

Following are all the winners of this year's competition.

(Video: “Anatomy of an AI System”, Design Museum on Vimeo.)

Beazley Design of the Year Overall Winner: Anatomy of an AI System by Kate Crawford and Vladan Joler

Beazley Architecture Design of the Year: Maya Somaiya Library

Beazley Graphic Design of the Year: Amorepacific architectural branding by Pentagram and David Chipperfield Architects

Beazley Product Design of the Year: CATCH by Hans Ramzan

Beazley Fashion Design of the Year: Adidas Originals by Ji Won Choi

Beazley Transport Design of the Year: GACHA Self-driving Shuttle Bus by MUJI and Sensible 4

Beazley Digital Design of the Year: Anatomy of an AI System by Kate Crawford and Vladan Joler

Beazley Design of the Year - People's Choice: MySleeve by Marie Van den Broeck

Explore more Beazley Designs of the Year 2019 here.

by loukas at Monday, 2019-11-25 12:39

2019-11-24

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Creating calendars with tikz-calendar

Here is an example of how to create a yearly calendar with tikz-calendar. Calendar events have to be stored in an external file; in this example that file is meineevents.events.

The color names for the individual elements can be looked up at https://www.sciencetronics.com/greenphotons/wp-content/uploads/2016/10/xcolor_names.pdf.

\documentclass{tikz-kalender}

\setup{%
lang=german,
year=2020,
showweeknumbers=true,
title={Urlaub},
xcoloroptions={x11names},
titleColor=cyan, 
eventColor=brown,
periodColor=lime,
monthBGcolor=red,
monthColor=Purple0,
workdayColor=yellow,
saturdayColor=magenta,
sundayColor=orange,
events={meineevents} % include the events file (meineevents.events)
}

\begin{document}
\makeKalender
\end{document} 

Here is the content of meineevents.events:

\event{\year-10-09}{John Lennon (1940)}
\event{2020-10-03}{Tag d. dt. Einheit}
\event*{2020-04-12}{Ostersonntag}
\period{2020-02-01}{2020-02-06}[color=Gray0,name={Urlaub}]; 

Uwe

Uwe Ziegenhagen likes LaTeX and Python, sometimes even combined. Do you like my content and would like to thank me for it? Consider making a small donation to my local fablab, the Dingfabrik Köln. Details on how to donate can be found here: Spenden für die Dingfabrik.

by Uwe at Sunday, 2019-11-24 19:02

2019-11-15

Typeroom

It's all in the A! The Atlantic is stunningly redesigned with a bespoke typeface & a monogram

With its December issue, The Atlantic unveiled a new visual identity complete with a new logo, custom typeface, updated website, and iOS app, created under creative director Peter Mendelsund and senior art director Oliver Munday, the well-respected book-publishing design team who came on board full-time at the magazine almost a year ago.

“It is the most dramatic new look for our magazine in its 162-year history, and one that, we hope, reflects boldness, elegance, and urgency,” writes Jeffrey Goldberg, the editor in chief of The Atlantic, author and a recipient of the National Magazine Award for Reporting. 

“The interesting thing to me about the first cover in 1857 is how clear the hierarchy of information is. The forthrightness, the omitting of needless information, the seriousness of purpose and mission—I would say those are all components of the design that represent what The Atlantic, as an institution, does well,” notes Mendelsund of his team's effort to redesign the historic title.

“The most notable change in this redesign is the new nameplate, the move to the A as representative of the whole” adds Goldberg of the wordmark aka “an emblem—a logo.”

Set in the all-new bespoke typeface Atlantic Condensed, based on the type forms that the founders chose for the first issue, the wordmark speaks of the magazine's legacy as well.

“We started with a condensed capital A that pointed toward an old version of The Atlantic logo drawn by the Boston Type Foundry in the mid-19th century,” Munday explains to AdAge. With some adjustments for something “that felt weightier or slightly more bespoke”, the design team added a notch to the top of the A, adjusted the feet, and then hired typographer Jeremy Mickel for the final refinement.

Eventually, the process led to the creation of a new custom condensed typeface, a full alphabet based on the original Boston Type Foundry sample used for the A.

This serif typeface, also known as a “Scotch” face (a term describing the way the serifs are designed), is “an extremely legible, classical kind of typography, but also transmits a certain kind of vehemence and urgency that works nicely for our contemporary purposes.”

Per Mendelsund, the new design had to be “readerly” and “to feel confident” without “clamoring for your attention in too many ways.” According to the designers, this has been accomplished in a number of different ways.

“One is through good grids, making sure that the page itself has a rigorous, almost Euclidean logic to the way it’s laid out. Another is by ensuring that the type is interrupted as little as possible and that when it is interrupted by imagery, the imagery is contained within its own cordoned-off space.” 

With the single-letter logo being the most striking aspect of this year's redesign (the magazine’s last full redesign happened in 2009), the A is set in roman, or normal, type. The upright letter conveys more authority, Mendelsund says. “Italics are typically not meant to work as a main graphical component but rather call attention in a body of text that is upright.”

Accompanied by a whole ecosystem of “engraved” nautical emblems serving as visual elements of the redesign, this is by far one of the most daring and visually pleasing rebranding projects of the year, for a title launched in the autumn of 1857 by Boston publisher Moses Dresser Phillips.

The first issue of The Atlantic Monthly was published in November 1857 and quickly gained fame as one of the finest magazines in the English-speaking world. The magazine, which billed itself as a "journal of literature, politics, science, and the arts," was an immediate success and within two years the circulation of the magazine had risen above 30,000. 

Some 160 years later, on July 28, 2017, The Atlantic announced that multi-billionaire investor and philanthropist Laurene Powell Jobs (the widow of former Apple Inc. chairman and CEO Steve Jobs) had acquired majority ownership through her Emerson Collective organization. Now the magazine, subscribed to by over 500,000 readers, publishes ten times a year. Obviously the saga continues with an A.

Discover more here.

All images via The Atlantic

by loukas at Friday, 2019-11-15 13:33

2019-11-11

Typeroom

Rüdiger Schlömer will teach us how to type knit our lives for good

Swiss designer Rüdiger Schlömer is the kind of creative you can count on even under the worst and most extreme cold weather conditions. After all, he is a man who knows how to infuse type into knitting. 

Schlömer's book, Pixel, Patch und Pattern: Typeknitting (published by Verlag Hermann Schmidt), has recently been awarded the «Certificate of Typographic Excellence» from the Type Directors Club, is currently on display in The World’s Best Typography exhibition (TDC65) and, as Pentagram's Eddie Opara notes, is filled with surprises.

Innovative, insightful and playful, the book, which has also been nominated for the Design Prize Switzerland 2019/20 for «Communication Design», challenges readers to learn to knit a variety of typefaces modeled on digital designs by well-known type foundries including Emigre, Lineto, Parachute, and Typotheque, and to emblazon their hats, scarves, and sweaters with smartly designed monograms, letters, or words.

“At first glance, the cover is extremely unassuming and without further inspection you would walk right past it. With a closer examination, its apparently pixelated title in all caps wears a distorted, woven-textured effect with its added hybrid brew of Anglo-German: Pixel, Patch und Pattern. As your eyes peer down to the bottom of the cover, innocuously set in Futura, Typeknitting leaves you intrigued and you are overcome with curiosity,” notes TDC's Communication Design judge Opara of the publication, which came out last winter.

“When opening the book, behold the incredible surprises that wait! You are taken into an alternative culture of typography that is knitted! The different stitching techniques are endless and transformative. The open, playful, thought-provoking, effortless, enterprising, and dynamic qualities are profound. You relish the fact that every design is made by hand. The end results are awe-inspiring. It brings life to modularity and systems. After viewing this book, you have to ask yourself, why do you sit at your computer, hour after hour? Are you really making something that is tangible?” he adds. 

Typeroom's Loukas Karnis asked Schlömer some questions between his own type knitting lessons now that winter marches on in Europe. 

So please introduce yourself to us.

I grew up in Paris, France, and Bremen, Germany, and studied Visual Communication in Aachen and Art in Context in Berlin. During my studies, I became interested in experimental communicative formats that combined analog and digital principles, like hacking, reverse engineering or programming.

I find that many of these "new" principles can be found in older media or techniques, in music notation or textiles. I have been living in Zurich, Switzerland, since 2011, where I work mostly on exhibition design, books, and self-initiated projects.

How did you come up with the idea of mixing type with knitting?

For the occasion of the soccer World Cup in Germany, I developed a “Fan Scarf Remix” web tool together with my friend, designer and programmer Jan Lindenberg.

On the project website, you could remix existing fan scarves into individual messages and then download your remix as a knitting pattern. It was a playful, deconstructive answer (a hack) to the idea of national identity and its instrumentalization.

I started the project without any practical knitting skills, thinking that the knitting part would be just like printing out the final result ... which was very wrong. Suddenly I discovered the multitude of personal and regional knitting techniques and got interested in their practical exploration. Eventually I started a private knitting circle in my flat in Berlin with a couple of friends.

Which was the first knitted typographic product you came up with?

The first machine-knit products I developed were the Remix Fan Scarf editions, which I released in small runs. There are big differences between hand and machine knitting.

Hand knitting is very accessible: you just need two needles and some yarn. Machine knitting allows larger editions, but it needs a technical infrastructure and brings you back to the computer. I like to keep switching between both, analog and digital.

“My Remix Fan Scarf editions are made of low-quality JPEGs found on the Internet. I started with soccer scarves only; now they contain letters of various sources and contexts. Each edition comes with an open-source pattern.”

You have called your project “generative typography”. Please elaborate on this genre.

When looking at all the different hand knitting techniques, I see a lot of parallels to digital tools, layout programs or plugins. “Patchwork Knitting”, for example, is based on geometric patches as basic elements. Instead of following a pixel-by-pixel knitting pattern, Patchwork Knitting gives you a basic grid, which you can freely combine into color and pattern combinations.

It's basically a modular, generative approach to pattern design, just very slow, because you knit it by hand. And since it's based on patches, it's perfect for knitting modular typography.

What is your book "Pixel, Patch und Pattern – Typeknitting" all about?

My book shows typographers how to knit letters, and knitters how to include typography. It's a systematic introduction to how different types of letters can be constructed using different hand knitting techniques.

The four main chapters, PIXEL, PATTERN, PATCH, and MODULE, go from intarsia knitting to slip-stitch patterns to patchwork knitting. These approaches are shown through knitted prototypes with typefaces from type designers like Andrea Tinnes or Christian Schmalohr, and type foundries like Emigre, Lineto, Nouvelle Noire, Parachute and Typotheque. It also contains instructions for some specific knitting projects, like pullovers, Selbu mittens or cushions.

How did you decide to collaborate with typographers on this project of yours?

When developing the book, I wanted to show "Typeknitting" as an open practice, not a collection of finished results. I saw my role as a mediator and translator between knitting and typography, two fields I know well enough to initiate this dialogue, without being a hardcore specialist in either.

As Typeknitting is a lot about communication through knitting and letters, the very process of it becomes communicative as well. The exchange with the contributing knitters and typographers was and still is essential in this process.

In most cases, I started with the different knitting techniques and their structural, constructive characteristics. From there I looked for typefaces that would suit these characteristics and then knitted prototypes to illustrate the different principles. TypeJockey by Andrea Tinnes (Typecuts) turned this process around: it inspired an approach of combining structural and color patterns, which can be applied to many other fonts.

Typefaces are most knittable when they are pixel-based or have a strong constructive character. My book contains a collection of more than 30 typefaces, from bitmap classics like Emigre's Lo-Res and Oblong, to contemporary interpretations like Fidel Peugeot's Walking Chair. Some can be used for a specific technique; for example, dot-matrix fonts like Panos Vassiliou's Online One are perfect for slip-stitch knitting.

Once you start exploring, the amount of knittable typefaces, knitting techniques, and the possible combinations is almost endless.

What are the similarities between the craftsmanship of knitting and type design if any?

I think that the knitting structure has many parallels to a piece of text -even if you are not knitting letters. Loops form a texture, just like letters form a text. Roland Barthes described the text as “a tissue [or fabric] of quotations”. Vice versa, you can also look at the knitting fabric as a text or code. But I would rather compare knitting to digital practice in general. The whole Typeknitting approach is somehow about the relationship between digital craft and analog programming, which form a kind of digital craftsmanship.

Lots of people compare knitting to the protocols of software and the “Zen” qualities of the loops themselves. What's your take?

It depends on whether you look at the process or at the physical result. When looking at finished knitwear you only see the visual characteristics of knitting textures and materials. This is like looking at a finished painting compared to experiencing the single brushstrokes. When you get into the process of knitting, the subjective experience is very repetitive, meditative, almost hypnotic. It's a very logical, algorithmic process, which definitely shows parallels to software. And as knitting is mostly based on two variables (knit and purl) it can be seen as a form of manual programming. I see many similarities and think it's no coincidence that knitting patterns look like code.

Do you consider yourself a fashion or a typography hacker?

“Hacking,” as I originally understand it, describes a subversive intervention in a system from the outside. Today you find so-called “Life Hacks” in every mainstream magazine. Using your leftover coffee grounds for plants has become a hack, or using empty shampoo bottles to store valuables. My grandmother used most of these tricks a long time ago; they were just basic household knowledge.

Which makes you think: either hacking wasn't that subversive in the first place ... or the imaginative usage of older things and practices (like knitting) has a lot of potential to be discovered, which is helpful to many. I see my book as a “Plug-In”, a joint between typography and knitting.

If you were a knitted typographic element which one would you be and why?

My favorite element right now is the slip-stitch. Instead of knitting a loop, you simply slip the yarn over from the left to the right needle. This makes a pre-writing, pre-typographic element, which later can become a sign (or part of a sign), although it's basically a pause in the structure. I like the simplicity of this. It also looks to me like it could have evolved as a mistake at first and was later systematized into a method.

What are you working on right now?

Right now I am preparing a workshop program. The next one will be at the Amsterdam knitting academy “De Amsterdamse Steek”, and another at the Swiss Yarn Festival. I am really excited about working with experienced knitters, to get practical feedback on my methods and to exchange ideas for further methods and patterns.

Parallel to that, I am constantly researching typefaces and ways of graphic application, which I plan to explore in workshops on the graphic side. Typeknitting is made for dialogue; it's a process of constant translation between the two fields, knitting and typography. And besides the physical projects you make with it, the evolving interdisciplinary communication is one of its most interesting side effects. I am very curious about where this will lead.

Learn how to loop your creativity with type here (German edition) and here (English edition).

All photos © Linda Suter

by loukas at Monday, 2019-11-11 17:08