TeX & Friends (Planet by DANTE e.V.)

2017-02-22

2017-02-18

2017-02-17

Weblog von Markus Kohm

The thing about braids

The slogan about cutting off old braids is trotted out again and again with relish. Some contemporaries even consider TeX itself to be such an old braid. In fact, however, the community of those working with and on TeX is more active than it has been in a long time. For a long time LaTeX was considered stable, which really meant static to the point of stagnation. Since last year this has visibly changed again. The LaTeX team is once more building improvements directly into LaTeX; compatibility is only provided when a specific version is explicitly selected.¹ LaTeX nevertheless remains stable in the best sense. The progress takes place mainly under the hood and in the improvement of the output.

KOMA-Script, too, contains things that today match neither the general state of knowledge and intent nor my own. Some of it I have reworked over the past two years, in places quite drastically. Here, too, I have tried to hide the changes under the hood. They became visible mainly through new interfaces for the user.

But old braids have also been cut off, after more than clear advance warning and nevertheless to the surprise of some. After roughly twenty years during which their use with the KOMA-Script classes was undocumented, and after a longer period in which their use with KOMA-Script classes was explicitly and increasingly insistently warned against, I have now made the deprecated font commands produce an error message. The next step will probably be that the old font commands are only reachable at all when compatibility is set to version=first. Should there ever be a KOMA-Script 4, it will most certainly no longer contain these commands.
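For anyone running into the new error message, a hedged sketch of the kind of replacement the KOMA-Script manual documents, using the \setkomafont/\addtokomafont interface (the element names disposition and caption are two of several described there):

\documentclass{scrartcl}
% current KOMA-Script font interface instead of the removed legacy commands
\setkomafont{disposition}{\normalcolor\sffamily\bfseries}
\addtokomafont{caption}{\small}
\begin{document}
\section{Example}
Some text.
\end{document}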

Old braids keep affecting me personally as well. KOMA-Script 3 still contains some pieces of code from KOMA-Script 2 that remind me how little I actually understood of LaTeX twenty and more years ago. When making changes I am also constantly torn between maintaining a kind of overall picture and pursuing a more refined approach that might sometimes be harder to understand but would be considerably more elegant.

Some may wonder what source code has to do with a picture. Every programmer has their own style, and it changes over the years. Source code cast from a single mould is usually easier to see through and to keep an overview of. With several tens of thousands of lines of code it can therefore make perfect sense to adapt to the existing style when making changes, instead of insisting on what one considers the optimal implementation.

For example, I was faced with the decision whether to convert one of the essential KOMA-Script packages, whose capabilities have been enormously extended and whose code has almost tripled in the process, completely to l3. In order not to create a foreign body within KOMA-Script, and not to introduce a dependency on a system that is still not really stable, I decided against it.

There are also parts that give me trouble again and again. A never-ending topic is the figures in the manual. When I started with KOMA-Script, the LaTeX picture environment with a few extensions was state of the art. Everything else was drawn externally and included as an image. I did not want external images. The extensions were often proprietary. So I deliberately created the figures with picture. When Jens-Uwe took over parts of the manual at the beginning of the millennium, he found that rather unappealing². He was a great MetaPost fan and therefore realised various graphics, up to and including the book cover, with it.³ That seemed to me a very good approach at the time, and it was. Back then, PS-Tricks constantly caused problems in combination with pdflatex⁴. And many still thought this pgf thing was a typo.⁵

To say it clearly: MetaPost is a great thing. In the beginning, every change I made to the pictures programmed with it was a matter of a few minutes. And since LuaTeX ships MetaPost more or less as a bonus, it is, strictly speaking, anything but an old braid. On the contrary, using luamplib would be a modern, excellent solution. The problem here is not so much MetaPost. The problem is me. For at least ten years I have been using MetaPost only for the KOMA-Script manual. Every few years I have to rework the cover. Due to some sloppiness in its code, the last time that worked without problems was with TeX Live 2013. Every few years I have to change something in one of the MetaPost figures in the manual, all of which are extracted externally via docstrip, turned into PDFs with mpost and then included in the manual. For me, every change therefore means having to find my way around MetaPost all over again.
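As an aside, a minimal sketch of the luamplib route mentioned above (my own deliberately trivial example, compiled with lualatex):

\documentclass{article}
\usepackage{luamplib}
\begin{document}
\begin{mplibcode}
beginfig(1);
  pickup pencircle scaled 1pt;
  draw fullcircle scaled 2cm;  % MetaPost code, run by LuaTeX's built-in mplib
endfig;
\end{mplibcode}
\end{document}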

Just these days exactly that became a problem once more. The external MetaPost figures could not be generated and I had to look for a way to debug it. All the "How did that work again? Which temporary file can be found where and processed directly for testing?" held me up unnecessarily yet again. It was made even harder by the fact that the problem apparently did not occur locally on my machine at all, while on the remote server used for building the release a process regularly got stuck at 100% load on one CPU core. On the remote server everything is script-driven and designed for unattended operation. That does not make debugging any easier when it does become necessary there.

Long story short: the long-pending project of doing all the figures either with picture or with TikZ has once again become more urgent. However, I am anything but a TikZ expert. And if I am going to use it, then of course the solution should be just as elegant as the MetaPost one. Whether that would really be a substantial improvement is written in the stars. So it may well be that years from now I will still be tinkering with the MetaPost code.

Somehow this reminds me of my grandmother's braids. For decades they were always pinned up. Only we grandchildren got to see them, when we stayed overnight at her place. Everyone else saw the pinned-up result, in which no braids could be discerned. Only at a great age could she bring herself to have the braids cut off. We shall see at what great age which braids will still be cut off from KOMA-Script.⁶

In this spirit
Markus

PS: I, too, once considered footnotes to be braids that urgently ought to be cut off:


1 That most users do not even know that, or how, compatibility with a specific LaTeX version can be pinned down is, in my eyes, more of an advantage than a disadvantage. Very few need it. One may, however, be curious to see when the first packages will only work reliably with particular LaTeX versions, and a conflict arises as soon as several such packages are used together. That will then hopefully make clear to one or the other that packages, too, can become so outdated that they should not just perhaps but absolutely be replaced by others – that some old braids really do have to be cut off at some point.
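A hedged sketch of how such pinning works, using the latexrelease package from the LaTeX team (the date is only an example):

\RequirePackage[2016/03/31]{latexrelease} % roll the kernel to the release of this date
\documentclass{article}
\begin{document}
Body text.
\end{document}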

2 His criticism explicitly did not concern the result, but the way it was produced.

3 I gladly admit that I, too, was enthusiastic not only about the result but also about the approach.

4 Some of the PS-Tricks fans and developers will certainly not like to hear that and may consider it, objectively, a nonsensical statement. That is their prerogative. But for many users, and from the support side, that is how it subjectively appeared.

5 Quite like the way some people still believe today that the "Koma" in KOMA-Script comes from falling into a coma while reading the manual, and that the "Script" in KOMA-Script refers solely to the manual. That then leads to gems such as: "I have a problem with Koma. I have read your script. Unfortunately I still do not understand how I …" I have read that first sentence so often that I very much hope that not all of those who wrote it really have a problem with me.

6 It is deliberately left open whether the great age refers to my humble self or to that of KOMA-Script.

by Markus Kohm at Friday, 2017-02-17 08:56

2017-02-15

2017-02-14

TUG

TUG 2017 election voting open

For TUG President, one individual was nominated: Boris Veytsman. Thus, he is duly elected and will serve for a two-year term. For the Board of Directors, 14 individuals were nominated for the 10 open positions; hence, an election ballot is required. All current TUG members are eligible to vote. This year, it is strongly recommended to vote electronically in the TUG members area. Full election information is on the TUG election page.

Tuesday, 2017-02-14 02:59

2017-02-12

2017-02-06

2017-02-03

2017-01-29

Some TeX Developments (by Joseph Wright)

Hosting upgrade

As I mentioned a little while ago, I’ve moved the hosting for my blog and added https access as standard. Doing that, I of course had to take a stab at how big a hosting package to buy. I get a steady set of hits: according to WordPress’s JetPack plugin, about 5000/month. So I went for a hosting package that would easily cover that.

However, it seems that somewhere there are some more visitors! I got a mail from SiteGround saying that the site’s been using too much CPU time based on what I’ve paid for. Looking over the stats, this is not anything about my setup but is, one way or another, ‘real’ hits. I can’t be sure if they are real people or spam-like bots, of course, but there’s only one solution: upgrade my hosting account. Hopefully this is all transparent to readers: it should mean things continue to work smoothly.

by josephwright at Sunday, 2017-01-29 08:40

2017-01-27

TUG

TUG election deadline of Feb 1

Nominations for the TeX Users Group election for this year are due Feb 1 (5pm PST). The positions of TUG president and of several directors are up for election. A nomination form and full details are available.

Friday, 2017-01-27 14:59

Internationale Gutenberg-Gesellschaft in Mainz e.V.

27.01.17 XXII. Mainzer Kolloquium: Book and Photography

This year the annual Mainzer Kolloquium is devoted to the topic "Book and Photography". As always, the programme includes numerous talks on different aspects of photography in the book world. Speakers include Marlene Taschen (Taschen Verlag), Dipl.-Des. Matti Wachholz-Hasumann (Cornelsen Schulbuchverlag) and Freddy Langer (FAZ). The detailed programme is available as a download below. Time: 10:00 until approx. 16:30. Venue: Alte Mensa (Atrium Maximum), Johannes Gutenberg-Universität, Mainz.

Friday, 2017-01-27 12:11

2017-01-26

LaTeX Project

New LaTeX release --- Issue 26 of LaTeX2e news released

We have made a new LaTeX distribution which improves the support for the Unicode engines and has a number of other updates and improvements. Also important: we now officially require the engines to support the eTeX primitives in order to build a LaTeX format (all major engines have done so for a long time).

The most important changes and additions are documented in ltnews26.pdf. It can be found on the LaTeX2e news page.

Thursday, 2017-01-26 00:00

2017-01-23

TikZ

A mind map from Matheplanet

Today I came across a TikZ mind map – in the wild! I saw it at the Matheplanet award ceremony. I couldn't resist tinkering with it a little, but only a little: I don't much like serifs in diagrams, I prefer defining styles over putting font commands into nodes, I shortened and indented a bit and relaxed the code with some whitespace to make it more readable for me… so apart from the code fiddling, the mind map looked almost exactly the same:

[Image: mindmap]

And here is the code, slightly optimized starting from Martin's code:

\documentclass[margin=10pt]{standalone}
\PassOptionsToPackage{usenames,svgnames}{xcolor}
\usepackage[ngerman]{babel}
\usepackage[T1]{fontenc}
\usepackage{microtype}
\usepackage{tikz}
\usetikzlibrary{mindmap}
\begin{document}
\pagecolor{black!50!green!50!}
\begin{tikzpicture}[
             nodes = { font = \sffamily\bfseries, align = center,
                       execute at begin node = {\hspace{0pt}}},
      large/.style = { concept, font = \Large\sffamily\bfseries},
       huge/.style = { concept, font = \Huge\sffamily\bfseries},
      black/.style = { color = black},
      white/.style = { color = white},
    black45/.style = { color = black, sibling angle = 45},
    white40/.style = { color = white, sibling angle = 40},
  ]
  \path[mindmap, concept color = purple!30!yellow!70!, text = white]
    node[huge, scale = 0.85, text = purple] {Matheplanet}
    [clockwise from = -60]
    child[concept color = blue] {
      node[large, level distance = 2.4cm] {Fachbereiche}
      [clockwise from = 25]
      child[white, level distance = 2.4cm]
        { node[concept, scale = 0.8] {Informatik} }
      child[white40, level distance = 2.4cm] {
        node[concept, scale = 0.7] {\LaTeX} }
      child[white, sibling angle = 38, level distance = 2.4cm] {
        node[concept, scale = 0.7] {3M} }
      child[white, sibling angle = 42] {
        node[concept, scale = 1.25] {Mathematik}
      [clockwise from = -15]
      child[white, level distance = 2.5cm] {
        node[concept, scale = 0.9] {Olympiadeaufgaben} } }
      child[white, sibling angle = 43] {
        node[concept, scale = 0.9] {Physik}
      [clockwise from = 85]
      child[white, level distance = 2.1cm]
        { node[concept] {Ingenieurwesen} } }
    }
    child[concept color = brown!80!yellow] {
      node[large, text = black] {Rubriken}
      [clockwise from = -75]
      child[black45, level distance = 2.6cm] {
        node[concept] {Buchreviews} }
      child[black45] { node[concept, scale=0.6] {WMM} }
      child[black45] { node[concept, scale=0.8] {Links} }
      child[black45] { node[concept] {Artikel}
        [clockwise from = 25] child[black]
          { node[concept] {Montagsreport} } }
    }
    child[concept color = red!50!white] {
      node[concept, text = black] {Besucher}
      [clockwise from=90]
      child[black45] { node[concept] {Chat} }
      child[black45, level distance = 2.5cm] {
        node[concept, scale = 0.8] {Nachtwache} }
    }
    child[concept color = teal] {
      node[large] {Events}
      [clockwise from = 160]
      child[white] { node[concept, scale = 0.85] {Schach}
      [clockwise from = 82] child[white, level distance = 1.7cm] {
        node[concept] {MPCC} } }
      child[white] { node[concept] {Treffen}
          [clockwise from = 25]
      child[white] { node[concept] {Impulsvortrag} }
      child[white, sibling angle = 240] { node[concept]
        {Regional} } }
      child[white] { node[concept] {Award} }
      child[white] { node[concept, scale=0.7] {Challenge} }
    }
  child[concept color = green!50!LimeGreen] { node[large] {Mitglieder}
 [clockwise from = 90]
   child[white] { node[concept] {Senioren}
     [clockwise from = 30] child[white] {
       node[concept, scale = 1.4] {Wissen} } }
  child[white40] { node[concept, scale = 0.8] {Moderatoren}
    [clockwise from = 20]
  child[white, level distance = 1.8cm] {
    node[concept] {Ordnung} } }
  child[white40] { node[concept, scale = 0.7] {Spender}
    [clockwise from=33]
   child[white, level distance = 1.7cm] {
     node[concept, scale = 0.8] {Geld} }
  }
  child[white40] { node[concept, scale=0.8] {Matroid}
    [clockwise from = -145] child[white] {
    node[concept] {Technik} }}
  }  
    child[concept color = red!90!cyan] {
      node[large, grow = -30] {Forum}
      [clockwise from = 61]
      child[white, level distance = 5cm, scale = 0.9] {
        node[concept] {Aktuelles \& Interessantes} }
      child[white, sibling angle = 28, level distance = 2.8cm] {
        node[concept] {Erfahrungsaustausch} }
      child[white, sibling angle = 36, level distance = 2.5cm] {
        node[concept, scale = 1.1] {Hilfestellung} }
      child[white, sibling angle = 39] {
        node[concept] {Zusammenarbeit}
  [clockwise from = -63]
  child[white] { node[concept] {Collatz} }
  child[white, level distance = 4cm, sibling angle = 17] {
    node[concept] {Streichholzgraphen} }
  child[white, sibling angle = 17] { node[concept] {Al Zimm.} } }
    child[white, sibling angle = 49, level distance = 2.5cm] {
      node[concept, scale = 0.95] {Knobelecke} }
  };
\end{tikzpicture}
\end{document}

One small thing I added here is execute at begin node = {\hspace{0pt}}: TeX does not hyphenate the first word of a paragraph; with this little hack I make sure that the sometimes long but single words in the nodes may still be hyphenated automatically, since they no longer stand right at the beginning of the paragraph. No manual breaking with \- or \\ is needed.
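A minimal, self-contained sketch of that effect (my own illustrative stand-alone example, not part of the mind map code; without the \hspace{0pt} the long single word would stick out of the node instead of being hyphenated):

\documentclass[border=5pt]{standalone}
\usepackage[ngerman]{babel}
\usepackage[T1]{fontenc}
\usepackage{tikz}
\begin{document}
\begin{tikzpicture}
  % the node text is a single long word; \hspace{0pt} lets TeX hyphenate it
  \node [draw, text width=2cm, align=center,
         execute at begin node = {\hspace{0pt}}] {Olympiadeaufgaben};
\end{tikzpicture}
\end{document}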

As mentioned, over on Matheplanet the awards for best participation had just been presented, now for the 15th time. They honour the best helpers in mathematics, physics, computer science, LaTeX, mathematics software and more, elected by the members.

I am happy to have received an award as a LaTeX helper. I have been on Matheplanet since 2005; that was probably even my first active participation in an internet forum.

by Stefan at Monday, 2017-01-23 00:04

2017-01-19

2017-01-15

Uwes kleines Technikblog - Kategorie LaTeX (by Uwe Ziegenhagen)

Comparing LaTeX files with latexdiff

latexdiff is part of TeX Live and makes it possible to highlight the differences between two LaTeX files. Here is an example with a short text snippet from Wikipedia:

The original (Giraffe1.tex)

\documentclass[12pt,ngerman]{scrartcl}
\usepackage[utf8]{inputenc}
\usepackage[T1]{fontenc}
\usepackage{babel}
\usepackage{csquotes}
\begin{document}

Die Giraffen (Giraffa) sind eine Gattung der Säugetiere aus der Ordnung der Paarhufer. Ursprünglich wurden ihr mit Giraffa camelopardalis und der Trivialbezeichnung \enquote{Giraffe} nur eine einzige Art zugewiesen. Molekulargenetische Untersuchungen aus dem Jahr 2016 zeigen jedoch, dass die Gattung wenigstens vier Arten mit sieben eigenständigen Populationen umfasst. Die Giraffen stellen die höchsten landlebenden Tiere der Welt. Zur Unterscheidung vom verwandten Okapi (sogenannte \enquote{Waldgiraffe}) werden sie auch als Steppengiraffen bezeichnet.

\end{document}

A version with slight changes (Giraffe2.tex)

\documentclass[12pt,ngerman]{scrartcl}
\usepackage[utf8]{inputenc}
\usepackage[T1]{fontenc}
\usepackage{babel}
\usepackage{csquotes}
\begin{document}

Die Giraffen (Giraffa) sind eine Gattung der Säugetiere aus der Ordnung der Paarhufer. Ursprünglich wurden ihr mit Giraffa camelopardalis und der Trivialbezeichnung \enquote{Giraffe} nur eine einzige Art zugewiesen. Untersuchungen aus dem Jahr 2016 zeigten jedoch, dass die Gattung wenigstens 4 Arten mit 7 eigenständigen Populationen umfasst. Die Giraffen stellen die höchsten landlebenden Tiere der Welt. Zur Unterscheidung vom verwandten Okapi (der sogenannten \enquote{Waldgiraffe}) werden sie auch als Steppengiraffen bezeichnet.

\end{document}

On the command line you now run latexdiff Giraffe1.tex Giraffe2.tex > Giraffediff.tex and compile the newly created file to PDF, which then looks as follows:
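Spelled out as commands (a sketch; any LaTeX compiler works, pdflatex is just an example):

latexdiff Giraffe1.tex Giraffe2.tex > Giraffediff.tex
pdflatex Giraffediff.tex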

Uwe

Uwe Ziegenhagen has been working with LaTeX for more than a decade. Besides TeX/LaTeX he likes to work with Python, Raspberry/Arduino and his digital camera.



by Uwe at Sunday, 2017-01-15 05:53

2017-01-08

Some TeX Developments (by Joseph Wright)

Checking over the beamer codebase

Anyone who is watching the beamer GitHub repository will notice quite a lot of checkins, starting at the beginning of beamer.cls and working forward. Most of these are ‘internal’ changes, so readers might wonder what is going on.

I’ve been involved with beamer for a while, but have not really taken a good look over the code yet. Several of the entries in the issue list are quite subtle, and it’s clear that a proper tidy-up of the code is needed to sort them out. That means addressing several internal inconsistencies, but it’s also helpful for me to get the code into a form I’m used to. So there are a mix of cosmetic changes in with some real improvements in the code.

Hopefully I’ll not introduce any new issues doing this work. However, it’s not easy to avoid changing behaviour in LaTeX, especially as beamer is somewhat delicate in the first place. So if anyone is keen to help with the work, simply grabbing the development code from time to time and making sure it doesn’t break existing presentations would be very helpful!

by josephwright at Sunday, 2017-01-08 22:43

TeX Stack Exchange

24 hours left: LaTeX books for $5

Update: Packt closed the $5 sale.

Right now, Packt Publishing sells my two LaTeX books for just $5 each. (ebook version: PDF, EPUB, Mobi, Kindle)

The links lead you to the book page at Packt Publishing with description and sample chapter. This $5 sale ends tomorrow, Monday, January 9 (the 9th is included until the end of the day, European time I think).

Full disclosure: I get 16% of the book sales, so $0.80 per ebook. So if a few LaTeX friends take this opportunity, the royalties will allow me to throw a hell of a party. 😀 Just to mention it: writing a book is often done just for fun, not for money.

by stefan at Sunday, 2017-01-08 12:53

2017-01-01

Some TeX Developments (by Joseph Wright)

A beamer release

After a bit of a hiatus, I’ve got back into fixing the bugs in beamer, starting with some ‘low hanging’ ones and scheduling a few more. Hopefully there won’t be too many new issues as a result, but if there are then please log them!

by josephwright at Sunday, 2017-01-01 17:52

TeXblog - Typography with TeX and LaTeX (by Stefan Kottwitz)

Becoming a TUG member becomes easier

Good news for 2017!

Students, seniors and TeX friends from countries with modest economies can become TeX Users Group members for an annual rate of just $15 - for an electronic membership, early bird until March 31, 2017. This still means full membership; the TUGboat journal and the TeX software are available to you online instead of on paper or DVD. That saves money and is good for our environment. :-)

Join us, plant the flag, elect our president, visit our conferences! If you are looking for a way to support TeX, that’s a great one!

A regular early electronic membership is $45, hopefully still affordable. Non-electronic late bird ;-) regular membership stays the same as last year.

By joining now, you can directly be part of and support the TUG TeX activities. You know,

  • Maintaining and developing the TeX Live distribution, which is also the basis of MacTeX
  • Yearly international TeX conferences
  • The TUGboat journal
  • Supporting projects such as LaTeX3, CTAN, LuaTeX, MetaPost and font creation

Let me know what you think! I love comments here.

It’s all up to us. Want to join in 2017? Link: http://tug.org/join.html

 

by stefan at Sunday, 2017-01-01 00:05

2016-12-31

2016-12-29

TeX Stack Exchange

Building snowmen

It’s winter here, and we are waiting for snow. So we will use TeX instead of snow to build some fancy snowmen. For the impatient, there’s a Unicode snowman: ☃. It’s U+2603. Copy and paste to use.

For us TikZ friends, Hironobu Yamashita released a new package on CTAN: scsnowman. It provides a command \scsnowman which can display various snowmen, customizable by options.

Just load the package in addition to TikZ:

\usepackage{scsnowman}

Then use the new command such as in these examples:

\scsnowman[scale=2, body, hat=red, muffler=blue]
\scsnowman[scale=3, hat, snow, arms, buttons]
\scsnowman[scale=2, mouthshape=tight, muffler=red]
\scsnowman[scale=2, mouthshape=frown, hat=green]

[Image: snowmen]

You can find the package and instructions on CTAN and on github.

by stefan at Thursday, 2016-12-29 13:08

2016-12-24

TeXblog - Typography with TeX and LaTeX (by Stefan Kottwitz)

Currently Packt sells my LaTeX books for $5

Just to let you know, these days Packt Publishing sells all ebooks (not paper books) for just $5, until the beginning of January. This includes my two LaTeX books:

The links lead you to the book page at Packt Publishing with description and sample chapter.

I get 16% of the book sales, let me quickly calculate – $0.80 per ebook. So if a few LaTeX friends take this opportunity, the royalties will allow me to throw a hell of a party. :-D Just to mention it: writing a book is often done just for fun, not for money.

by stefan at Saturday, 2016-12-24 15:45

TeX Stack Exchange

Happy Christmas!

Happy Christmas to everyone!

For all who don’t celebrate Christmas: have a great holiday season too!

The picture of this greeting card was made in TikZ by cfr, our catcode specialist. (You noticed the cat.)

[Image: christmas-tree]

What would Christmas be without code!

\documentclass[tikz, border=5pt, rgb, x11names, svgnames, dvipsnames]{standalone}
\usepackage{forest}
\usetikzlibrary{shapes.geometric,backgrounds,shadows,fit,fadings,calc}
\pgfdeclarelayer{foreground}
\pgfdeclarelayer{boncyff}
\pgfdeclarelayer{canghennau}
\pgfdeclarelayer{goleuni}
\pgfsetlayers{background,canghennau,boncyff,goleuni,main,foreground}
\makeatletter
% adapted (simplified version) from tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibrarybackgrounds.code.tex
\tikzset{%
  on foreground layer/.style={%
    execute at begin scope={%
      \pgfonlayer{foreground}%
      \let\tikz@options=\pgfutil@empty%
      \tikzset{every on foreground layer/.try,#1}%
      \tikz@options%
    },
    execute at end scope={\endpgfonlayer}
  },
  on canghennau layer/.style={
    execute at begin scope={%
      \pgfonlayer{canghennau}%
      \let\tikz@options=\pgfutil@empty%
      \tikzset{every on canghennau layer/.try,#1}%
      \tikz@options%
    },
    execute at end scope={\endpgfonlayer}
  },
  on boncyff layer/.style={
    execute at begin scope={%
      \pgfonlayer{boncyff}%
      \let\tikz@options=\pgfutil@empty%
      \tikzset{every on boncyff layer/.try,#1}%
      \tikz@options%
    },
    execute at end scope={\endpgfonlayer}
  },
  on goleuni layer/.style={
    execute at begin scope={%
      \pgfonlayer{goleuni}%
      \let\tikz@options=\pgfutil@empty%
      \tikzset{every on goleuni layer/.try,#1}%
      \tikz@options%
    },
    execute at end scope={\endpgfonlayer}
  },
  aa/.store in=\cfr@aa,
  aa=0,
  pics/anrheg/.style n args={2}{
    code={
      \node (-bocs) [anchor=north, rounded corners=2pt, outer sep=0pt, minimum width=5mm, minimum height=5mm, fill=#1, rotate=\cfr@aa] {};
      \path [fill=#2, draw=#2, rotate=\cfr@aa]
      ([xshift=-.5mm]-bocs.south)
      [out=80, in=-105] to ([xshift=.5mm]-bocs.north)
      [out=175, in=5] to ([xshift=-.5mm]-bocs.north)
      [out=-85, in=100] to ([xshift=.5mm]-bocs.south)
      [out=-175, in=-5] to cycle;
      \path [fill=#2, draw=#2, rotate=\cfr@aa]
      ([yshift=.5mm]-bocs.west)
      [out=-10, in=-175] to ([yshift=-.5mm]-bocs.east)
      [out=85, in=-85] to ([yshift=.5mm]-bocs.east)
      [out=-175, in=-10] to ([yshift=-.5mm]-bocs.west)
      [out=95, in=-95] to cycle;
    },
  },
}
\makeatother
\tikzfading[
  name=disglair,
  inner color=transparent!0,
  outer color=transparent!100
]
\colorlet{lliw1}{Crimson}
\colorlet{lliw2}{DeepPink}
\colorlet{lliw3}{Violet}
\colorlet{lliw4}{Purple3}
\colorlet{lliw5}{Cyan}
\colorlet{lliw6}{Green1}
\colorlet{lliw7}{Gold}
\colorlet{lliw8}{Aquamarine1}
\colorlet{lliw0}{OrangeRed1}
\colorlet{lliwanrheg1a}{WildStrawberry}
\colorlet{lliwanrheg1b}{Ivory}
\colorlet{lliwanrheg2a}{Red1}
\colorlet{lliwanrheg2b}{Yellow1}
\colorlet{lliwanrheg3a}{Turquoise2}
\colorlet{lliwanrheg3b}{WildStrawberry}
\colorlet{lliwanrheg4a}{Yellow1}
\colorlet{lliwanrheg4b}{DodgerBlue3}
\colorlet{lliwanrheg5a}{Ivory}
\colorlet{lliwanrheg5b}{Turquoise2}
\colorlet{lliwanrheg0a}{DodgerBlue3}
\colorlet{lliwanrheg0b}{Red1}
\begin{document}
\forestset{
  declare count register=nodes y goeden,
  nodes y goeden'=0,
  declare count register=tier count,
  tier count'=1,
  % canghennau
  cangen chwith/.style={
    edge path={
      \noexpand\scoped[on canghennau layer]{
        \noexpand\path [draw=ForestGreen, thick, \forestoption{edge}]
          (!u.parent anchor) +(0,20pt) [out=-90, in=150] to (.child anchor)\forestoption{edge label}
          (!u.parent anchor) +(0,15pt) [out=-90, in=150] to (.child anchor)\forestoption{edge label}
          (!u.parent anchor) +(0,10pt) [out=-90, in=150] to (.child anchor)\forestoption{edge label}
          (!u.parent anchor) +(0,5pt) [out=-90, in=150] to (.child anchor)\forestoption{edge label}
          (!u.parent anchor) [out=-90, in=150] to (.child anchor)\forestoption{edge label};
      }
    },
  },
  cangen dde/.style={
    edge path={
      \noexpand\scoped[on canghennau layer]{
        \noexpand\path [draw=ForestGreen, thick, \forestoption{edge}]
          (!u.parent anchor) +(0,20pt) [out=-90, in=30] to (.child anchor)\forestoption{edge label}
          (!u.parent anchor) +(0,15pt) [out=-90, in=30] to (.child anchor)\forestoption{edge label}
          (!u.parent anchor) +(0,10pt) [out=-90, in=30] to (.child anchor)\forestoption{edge label}
          (!u.parent anchor) +(0,5pt) [out=-90, in=30] to (.child anchor)\forestoption{edge label}
          (!u.parent anchor) [out=-90, in=30] to (.child anchor)\forestoption{edge label};
      }
    },
  },
  plant/.style={
    before typesetting nodes={
      repeat=#1{
        prepend={[, bud, cangen chwith]},
        append={[, bud, cangen dde]},
      },
    },
  },
  boncyff/.style={
    edge path={
      \noexpand\scoped[on boncyff layer]{
        \noexpand\path [line cap=round, line width=2.5pt, draw=Chocolate4, \forestoption{edge}] (!u.parent anchor) -- (.child anchor)\forestoption{edge label};
      }
    },
  },
  seren/.style={
    star, minimum size=25pt, star points=7, star point ratio=3, parent anchor=south, outer sep=2pt, inner color=white, outer color=Gold1, draw=Goldenrod1, inner color=white, outer color=Silver, draw=Snow3, tikz={\scoped[on goleuni layer]\node (sglein) [minimum width=50pt, circle, inner color=white, outer color=Silver, path fading=disglair] {};}
  },
  enw/.style={
    nodes y goeden'+=1,
    name/.process={Rw}{nodes y goeden}{n##1},
  },
  anrheg/.style={
    no edge, enw, anchor=north
  },
  bud/.style={
    inner sep=0pt,
    outer sep=0pt,
    parent anchor=center,
    child anchor=center,
    enw,
  },
  gwag/.style={
    inner sep=0pt,
    outer sep=0pt,
    parent anchor=center,
    child anchor=center,
    no edge,
  },
}
\begin{forest}
  for tree={
    if level=0{}{
      bud,
    },
    boncyff,
    if n=1{
      tier count'+=1,
    }{},
    s sep+=7.5,
    before computing xy={
      l/.process={Rw+n+d}{tier count}{4*#1 pt}
    },
    before typesetting nodes={
      if n children=0{
        append={
          [, bud, boncyff
            [, bud, l'=2pt, boncyff
              [, anrheg, alias=troed, before computing xy={l'=0pt}]
            ]
          ]
        },
        delay={
          for children={
            if={
              >OOw+n={n}{!u.n children}{(#1+1)/2}
              }{}{
                append={
                  [, gwag, l'=2pt
                    [, anrheg, before computing xy={l'=7.5pt}]
                  ]
                },
              },
            },
          },
        }{},
      },
    },
    [, seren, plant=1
      [, plant=2
        [, plant=3
          [, plant=4
            [, plant=5
              [, plant=6]
            ]
          ]
        ]
      ]
    ]
    \foreach \i  in {1,...,50}
      {
        \ifnum\i=29\relax\else\ifnum\i=28\relax\else
          \pgfmathsetmacro{\result}{int(mod(\i,9))}
          \node [circle, ball color=lliw\result] at (n\i) {};
          \scoped[on goleuni layer] \node [circle, minimum width=20pt, inner color=lliw\result!50!white, outer color=lliw\result!50!Silver, path fading=disglair] at (n\i) {};
        \fi\fi
      }
    \foreach \i in {51,...,62}
      {
        \tikzset{aa=0}
        \pgfmathsetmacro{\result}{int(mod(\i,6))}
        \pgfmathsetmacro{\casgliad}{int(mod(4+\i,6))}
        \pgfmathsetmacro{\nesaf}{int(1+\i)}
        \ifnum\i>56
          \pic at (n\i) {anrheg={lliwanrheg\result a}{lliwanrheg\result b}};
        \else
          \pic at (n\i) {anrheg={lliwanrheg\result b}{lliwanrheg\result a}};
        \fi
        \tikzset{aa=45}
        \ifnum\i<56
            \pic at ([yshift=3.85mm,xshift=.25pt]$(n\i)!1/4!(n\nesaf)$) {anrheg={lliwanrheg\casgliad a}{lliwanrheg\casgliad b}};
        \fi
        \tikzset{aa=-45}
        \ifnum\i>56
          \ifnum\i<62
            \pic at ([yshift=3.85mm,xshift=-.25pt]$(n\i)!3/4!(n\nesaf)$) {anrheg={lliwanrheg\casgliad b}{lliwanrheg\casgliad a}};
          \fi
        \fi
      }
    \path
      (n29) ++(0,-1.5mm) coordinate (c0)
      (c0) arc (-90:0:5mm and 2.5mm) coordinate (c1)
      (c0) arc (-90:-180:5mm and 2.5mm) coordinate (c2)
      (c0) +(2.5mm,-4.5mm) coordinate (c3)
      (c0) +(-2.5mm,-4.5mm) coordinate (c4)
      (c1) +(0,-2mm) coordinate (c5)
      (c2) +(0,-2mm) coordinate (c6)
      ;
    \path [fill=Sienna1] (c2) arc (-180:0:5mm and 2.5mm) -- (c5) -- (c3) arc (0:-180:2.5mm and 1.25mm) -- (c6) -- cycle;
    \begin{scope}
      \clip (c2) arc (-180:0:5mm and 2.5mm) -- (c5) -- (c3) arc (0:-180:2.5mm and 1.25mm) coordinate [midway] (c7) -- (c6) -- cycle;
      \path [draw=Sienna3] (c6) arc (-180:0:5mm and 2.5mm);
      \path [draw=Sienna3] (c2) arc (-180:0:5mm and 2.5mm);
    \end{scope}
    \begin{scope}[on canghennau layer]
      \path [fill=Sienna1, draw=Sienna3] (c2) arc (-180:180:5mm and 2.5mm);
      \path [fill=Brown!50!black] (n29 |- c2) circle (4mm and 2mm);
    \end{scope}
    \begin{scope}[on background layer]
        \node [fill=MidnightBlue, inner sep=15pt, fit=(sglein) (troed) (n21) (n35) (c7) (n51) (n62) (cath)] {};
    \end{scope}
  %   \foreach \i in {1,...,62} \node (anrheg \i) [red] at (n\i) {\i};
  \end{forest}
  \end{document}

by stefan at Saturday, 2016-12-24 15:04

Weblog von Markus Kohm

Christmas

This year there will (once again) be no Christmas release of KOMA-Script. Version 3.22 is actually in a state that would allow a release, but other things were more important to me this year. As so often, no real Christmas mood wanted to set in. The weather, in both a meteorological and a political sense, was more depressing than Christmassy. As they say in the Schweinachtsmann: "When everyone rushes and runs, it is guaranteed to be Advent!" or something like that.

On the fourth Sunday of Advent, at the Christmas concert of the local music society – yes, exactly, brass band music! … with Jingle Bells as a rhapsody, a Christmas rock medley and of course traditional Christmas music too – a Christmas mood actually did arise. Since then, despite all human stupidity and ignorance, I have been trying to let myself be carried by it a little and to sense what Christmas actually means. Anyone who received an e-mail from me this week also received sincerely meant Christmas greetings:

Merry Christmas!

I really do wish that to everyone out there. Along with joy, freedom and the peace that in so many places, in small things and large, is so hard to reach. My old advice regarding the latter: don't go looking for the big, difficult peace at all. Just look to your left, to your right, in front of you and behind you. Somewhere there, people are standing, sitting, living. Don't ask what they could do for you; just ask yourself what you could do for them. If each of us cares for just two others, then far more people care for many of us than if everyone cares only for themselves.

In this spirit
Markus

by Markus Kohm at Saturday, 2016-12-24 12:06

2016-12-21

Fontblog

Book tip: Vom Blatt zum Blättern

[Image: cover of Vom Blatt zum Blättern]

It's that time of year when printing and finishing make the difference! Christmas card season. How thick is the paper? How inventive is the printing? Was it skilfully folded, varnished, embossed or even laser-cut? Creativity is good, professionalism is better. Cards that feel good stay in the hand. Their senders stay in memory.

Because print objects are back in fashion not only at the end of the year but quite generally, the graphic designers Franziska Morlok and Miriam Waszelewski have written a book, "Vom Blatt zum Blättern", which takes up all the questions arising in the print process, from planning a printed piece through the production workflow, the format, the choice of paper and the finishing.

On more than 400 pages and with over 1000 infographics and illustrations, the authors explain how books and brochures can be cleverly staged using folds, wire, glue, thread, spirals, rings, screws or elastic bands. Book finishing gets a chapter of its own. Tips from bookbinders and notes on common sources of error help to avoid stumbling blocks.

Analogue haptics are making a comeback. For annual reports, image brochures or lifestyle magazines they never went away. This book helps to produce print objects in such a way that they become coveted things to touch.

About the authors

Franziska Morlok studied visual communication at the Universität der Künste Berlin and the Hochschule der bildenden Künste Saar. She has worked for Sagmeister inc, Leonardi.Wollein and Fons Hickmann, among others, founded the design studio Rimini in Berlin in 2007, and has taught at the Kunsthochschule Weissensee, the Universität der Künste Berlin and, currently, the FH Potsdam. Morlok's work has won national and international awards, including those of the Type Directors Club New York, the Stiftung Buchkunst and the Type Directors Club Tokyo, as well as the Red Dot Award, the Joseph Binder Award, 100 beste Plakate and the DDC Award.

That an urban attitude to life is not at home on the internet alone is something Miriam Waszelewski has been demonstrating since 2015 through the concept and art direction of me.Urban, a magazine spin-off of Musikexpress. Her book design for the Verlag Moderne Kunst earned her wide recognition. As an expert in magazine and book design she brings fresh, contemporary impulses to the printed matter she works on.

Franziska Morlok | Miriam Waszelewski
Vom Blatt zum Blättern: Falzen, Heften, Binden für Gestalter
420 pages with over 1000 infographics and illustrations
Format 18 x 24 cm, hardcover with embossing and edge trim on three sides,
with an integrated 20-page flatbook and an extensive glossary

Verlag Hermann Schmidt, October 2016
– – –

Book giveaway

We are raffling off the book presented here and two further publications from Verlag Hermann Schmidt, the volume Typografie 37 of the Type Directors Club New York and the CMYK calendar 2017, among our Twitter followers.

[Images: Type Directors Club Annual 2017 and CMYK calendar 2017 packshot]

And here's how it works:

Between 21 December 2016 and 13 January 2017, like and share #Schmidt-Buchverlosung @MonotypeDACH. Among all followers who have taken part we will raffle off the books and the calendar.

Terms of participation

1. This prize draw is organised by Monotype GmbH ("Monotype"), Werner-Reimers-Straße 2-4, 61352 Bad Homburg, Germany. It is subject to German law.
2. All persons aged 14 or over with a postal address in Germany, Switzerland or Austria are eligible to take part. Employees of Monotype and of companies affiliated with Monotype, as well as their relatives, are excluded from participation.
3. All persons who follow our Twitter profile @MonotypeDACH and like and share the hashtag #Schmidt-Buchverlosung between 21 December 2016, 12:00 noon, and 13 January 2017, 12:00 noon, become participants in the prize draw.
4. The winners will be determined by drawing lots and notified by Twitter direct message at the end of the prize draw. Each winner receives one of the three publications being raffled. The prize cannot be exchanged or paid out in cash.
5. The data collected from participants will be used by Monotype solely for running this prize draw.
6. Monotype reserves the right to change these terms of participation at any time or to end the prize draw, in whole or in part, early.
7. Legal recourse is excluded.


by Sabine Gruppe at Wednesday, 2016-12-21 11:00

2016-12-20

Some TeX Developments (by Joseph Wright)

Standard font loading in LaTeX2e with XeTeX and LuaTeX

The LaTeX Project have been making efforts over the past few years to update support in the LaTeX2e kernel for XeTeX and LuaTeX. Supporting these Unicode-enabled engines provides new features (and challenges) compared to the ‘classical’ 8-bit TeX engines (probably pdfTeX for most users). Over recent releases, the team have made the core of LaTeX ‘engine-aware’ and pulled a reasonable amount of basic Unicode data directly into the kernel. The next area we are addressing is font loading, or rather the question of what the out-of-the-box (text) font should be.

To date, the LaTeX kernel has loaded Knuth’s Computer Modern font in his original ‘OT1’ encoding for all engines. Whilst there are good reasons to load at least the T1-encoded version rather than the OT1 version, with an 8-bit engine using the OT1 version can be justified: it’s a question of stability, and nothing is actually out-and-out wrong.

Things are different with the Unicode engines: some of the basic assumptions change. In particular, there are some characters in the upper-half of the 8-bit range for T1 that are not in the same place in Unicode. That means that hyphenation will be wrong for words using some characters unless you load a Unicode font. At the same time, both LuaTeX and XeTeX have changed a lot over recent years: stability in the pdfTeX sense isn’t there. Finally, almost all ‘real’ documents using Unicode engines will be loading the excellent fontspec package to allow system font access. Under these circumstances, it’s appropriate to look again at the standard font loading.

After careful consideration, the team have therefore decided that as of the next (2017) LaTeX2e release, the standard text font loaded when XeTeX and LuaTeX are in use will be Latin Modern as a Unicode-encoded OpenType font. (This is the font chosen by fontspec so for almost all users there will no change in output.) No changes are being made to the macro interfaces for fonts, so users wanting anything other than Latin Modern will continue to be best served by loading fontspec. (Some adjustments are being made to the package to be ready for this.)
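A hedged sketch of what that means in practice: a plain document compiled with lualatex or xelatex under the 2017 kernel already gets Unicode Latin Modern, and anything else is still selected via fontspec (the font named below is only an example and is assumed to be installed):

\documentclass{article}
\usepackage{fontspec}          % only needed to deviate from the new default
\setmainfont{TeX Gyre Termes}  % example choice; Latin Modern remains the default
\begin{document}
Unicode text works out of the box.
\end{document}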

It’s important to add that no change is being made in math mode: the Unicode maths font situation is not anything like as clear as the text mode case.

There are still some details being finalised, but the general approach is clear and should make life easier for end users.

by josephwright at Tuesday, 2016-12-20 20:49

2016-12-19

STM Publishing: Tools, Technologies and Change (by Graham Douglas)

LuaTeX token library: simple example of scanners to mix TeX and MetaPost

ShareThis

Introduction

In this short article I share the results of using LuaTeX’s token library as one way to intermix TeX and MetaPost code. I used LuaTeX 1.0.1 which I compiled from the source code in the experimental branch of the LuaTeX SVN repository but, I believe, it should also work with LuaTeX 1.0 (may also work with some earlier versions too). In addition, I used luamplib version 2016/03/31 v2.11.3 (I downloaded my copy from github). Note that I do not have a “standard” TeX installation—I prefer to build and maintain my own custom setup (very small and compact).

The LuaTeX token library

I will not try to explain the underlying technical details but simply point you to read this article by Hans Hagen and the relevant section in the LuaTeX Reference Manual (section 9.6: The token library). Here, I’ll just provide an example of using the token library—it is not a sophisticated or “clever” example but one simply designed to demonstrate the idea.

Objective

The goal is to have TeX macros that contain a mixture of TeX and MetaPost code and find a way to expand those macros into a string of MetaPost code which can be passed to LuaTeX’s in-built MetaPost interpreter: mplib.  Suppose, just by way of a simple example, that we defined the following simple TeX macros:

\def\mpbf#1{beginfig(#1);\space}
\def\fc{fullcircle\space}
\def\pp#1#2{pickup pencircle xscaled #1mm yscaled #2mm;\space}
\def\mpef{endfig; end}
\def\draw#1{draw \fc scaled #1;\space}

and we’d like to use them to write TeX code that can build MetaPost graphics. Note that the definition of \draw also contains the command \fc.

The scanner

The following code uses LuaTeX’s token library function scan_string() to generate a string from a series of incoming TeX macros—by expanding them. It stores the resulting string into a toks register so that we can later use and obtain the result.

\newtoks\mpcode
\def\scanit{%
\directlua{
local p = token.scan_string()
tex.toks["mpcode"]=p
}}

We could use this macro like this:

 \scanit{...text and TeX macro in braces...}

but instead we’ll add just a little more functionality. Suppose we further define another TeX macro

\def\codetest{\mpbf{1} \pp{0.3}{0.75} \draw{12}\mpef }

which contains a sequence of commands that, once expanded, will generate MetaPost program to produce our graphic. To expand our TeX macro (\codetest) we can do the following:

\scanit{\codetest} and the output is \the\mpcode

Here, the braces "{" and "}" are needed (I think...) in order to make token.scan_string() work correctly (I may be wrong about that, so please run your own tests). Anyway, the \codetest macro is expanded and (due to \scanit) the resulting MetaPost code is stored in the toks register called mpcode. We can see what the toks register contains simply by typesetting the result: doing \the\mpcode—note you may get strange typesetting results, or an error, if your MetaPost contains characters with catcode oddities. You can use \directlua{print(tex.toks["mpcode"])} to dump the content of the mpcode toks register to the console (rather than typesetting it). What I see in my tests is that the mpcode toks register contains the following fully expanded MetaPost code:

beginfig(1); pickup pencircle xscaled 0.3mm yscaled 0.75mm; draw fullcircle scaled 12; endfig; end

And this is now ready for passing to MetaPost (via mplib). But how? Well, one option is to use the package luamplib. If you are using LaTeX (i.e., luaLaTeX format) then you can use the following environment provided by luamplib:

\begin{mplibcode}
...your MetaPost code goes here
\end{mplibcode}

However, our MetaPost code is currently contained in the mpcode toks register. So here’s my (rather ugly hack) that uses \expandafter to get the text out of mpcode and sandwiched between \begin{mplibcode} and \end{mplibcode}. I am sure that real TeX programmers can program something far more elegant!

\def\sa{\begin{mplibcode}}
\def\mcodex{\expandafter\sa\the\mpcode\relax\end{mplibcode}}

Consequently, we  just need to issue the command \mcodex to draw our MetaPost graphic. I hope this is an interesting suggestion/solution that others might find useful.

by Graham Douglas at Monday, 2016-12-19 18:20

2016-12-18

Some TeX Developments (by Joseph Wright)

TeX on Windows: TeX Live versus MiKTeX revisited

On Windows, users have two main choices of TeX system to install: TeX Live or MiKTeX. I’ve looked at this before a couple of times: first in 2009 then again in 2011. Over the past few years both systems have developed, so it seems like a good time to revisit this. (I know from my logs that this is one of the most popular topics I’ve covered!)

The first thing to say is that for almost all ‘end users’ (with a TeX system on their own PC just for them to use), both options are fine: they’ll probably notice no difference between the two in use. It’s also worth noting that there is a third option: W32TeX. I’ve mentioned this before: it’s popular in the far East and is where the Windows binaries for TeX Live come from. (There’s a close relationship between W32TeX and TeX Live, with W32TeX more ‘focussed’ and expecting more user decisions in installing.)

Assuming you are going for one of the ‘big two’, what is there to think about? For most people, it’s simply:

  • Both MiKTeX and TeX Live include a ‘full’ set of TeX-related binaries, including the engines pdfTeX, XeTeX, LuaTeX and support programs such as BibTeX, Biber, MakeIndex and Xindy.
  • The standard installer for MiKTeX installs ‘just the basics’ and uses on-the-fly installation for anything else you need; the standard install for TeX Live is ‘everything’ (about 4.5 Gb!). Which is right for you will depend on how much space you have: you can of course customise the installation of either system to include more or less of the ‘complete’ set up.
  • MiKTeX has a slightly more flexible approach to licensing than TeX Live does: there are a small number of LaTeX packages that MiKTeX includes that TeX Live does not. (Probably the most obvious example is thesis.)
  • TeX Live has a Unix background so the management GUI looks slightly less ‘standard’ than the MiKTeX one.
  • TeX Live has a strict once-a-year freeze, which means that to update you have to do a fresh install once a year. On the other hand, MiKTeX versions change only when there is a significant change and otherwise ‘roll onward’.

So the decision is likely to come down to whether you want auto-installation of packages. (If you do go for MiKTeX on a one-user PC, choose the ‘Just for me’ installation option: it makes life a lot simpler!)

For more advanced users there are a few more factors you probably want to consider

  • TeX Live was originally developed on Unix and so is available for Linux and on the Mac (and other systems) as well as Windows; MiKTeX is a Windows system so is (more-or-less) Windows-only. So if you want exactly the same set up on Windows and other operating systems, this of course means you need to use TeX Live.
  • Both systems have graphical management tools as well as command line interfaces (see the short sketch after this list). They have a lot in common, but they are not identical (in particular, MiKTeX tends to emulate TeX Live command line interfaces, but the reverse is not true).
  • The engine binaries in TeX Live are (almost) never updated other than in the yearly freeze period, meaning that for a given release you know which version of pdfTeX, etc., you’ll have: MiKTeX is more flexible with such updates. (At different times, one or other of the systems can be more ‘up to date’: this is not necessarily predictable! The W32TeX system often has very up-to-date testing binaries.)
  • The two systems differ slightly in handling how local trees are managed (places to add TeX files that are not controlled by the TeX system itself). TeX Live automatically expects <installation root>/texmf-local to hold system-wide ‘local’ additions and <user root>/texmf to hold per-user additions, whereas MiKTeX has no out-of-the-box locations, but does make it easier to add and remove them from the command line. MiKTeX also makes it easy to add multiple per-user trees, whereas for TeX Live there’s more of an assumption that all user additions will be added in one place. (This makes it easier in MiKTeX to add/remove local additions by altering a setting in the TeX system rather than deleting files.)
  • TeX Live has a team doing the work; MiKTeX is a one-man project. This cuts both ways: you know exactly who is doing everything in MiKTeX (Christian Schenk), and he’s very fast, but there is more ‘spread’ in TeX Live for the work.
  • For people wanting to step quickly between different versions of TeX system, the fact that TeX Live freezes once a year makes life convenient (I have TeX Live 2009, 2010, 2011, 2012, 2013, 2014, 2015 and 2016 installed at present, plus MiKTeX 2.9 of course!) You can switch installations by adjusting the PATH or by choosing the appropriate version from your editor, so you have a ‘fall back’ if there is an issue when you update.
  • TeX Live has built-in package backup during maintenance updates.
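As a small, hedged illustration of the command-line tools mentioned in the list above (option names are from memory and may differ between versions, particularly for MiKTeX):

tlmgr update --self --all    # TeX Live: update tlmgr itself and all installed packages
mpm --update                 # MiKTeX 2.9: update installed packages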

by josephwright at Sunday, 2016-12-18 11:02

2016-12-17

Some TeX Developments (by Joseph Wright)

Blog spring-cleaning

As I’ve just moved where the blog is hosted it seemed like a good opportunity to do a bit of tidying up. Regular readers will notice that the categories have been updated, hopefully making it easier to find things. Suggestions on any new arrangements are welcome. I’ve also fixed a few missing files in older posts (the odd re-install means that not 100% of the older content is right!). I’ve also revised the static pages (About, Packages and Contact) to make sure they are up to date: the package list and my PGP key are now right! At the ‘back end’, I’ve adjusted a few WordPress plugins and generally made sure everything is as organised as it should be. I’ve also tweaked a few parts of the layout, including adding the very common ‘quick share’ buttons. So hopefully the blog is tuned up for the future!

by josephwright at Saturday, 2016-12-17 06:50

2016-12-16

Some TeX Developments (by Joseph Wright)

texdev.net is now set up for https

A rare foray outside of the strictly TeX-related: as it’s about the blog itself I think it’s OK! As you might notice on visiting the site, I’ve enabled https for the site. Why have I done that? Well, if you read the WordPress News it’s clear that they are pushing toward more use of secure access. (As you might guess, the back-end for the blog is WordPress.) There are wider moves toward ‘https everywhere’, so it seemed like as good a time as any to do the work.

For readers, there should be no change at all in the site. For those interested in the detail, I’ve moved the hosting to SiteGround as they offer Let’s Encrypt: a free SSL certificate system suitable for small-scale sites like this one. (If anyone is feeling particularly generous, I’ve got a referral URL which will give me some free service in return for pointing to my new host!) The process was largely trouble-free: I’ve taken the opportunity to re-build the site from scratch (just using the dumps of the content) and to remove some older ‘back-end’ rubbish (old plugins and the like). Again, for readers this should be transparent!

by josephwright at Friday, 2016-12-16 17:56

2016-12-11

2016-12-10

2016-12-09

STM Publishing: Tools, Technologies and Change (by Graham Douglas)

Remote working: An opportunity to address publishing’s diversity problem?

Many areas of publishing are anxious to innovate and, quite rightly, want to increase the diversity of their workforce. Innovation usually focuses on solutions to improve publishing’s outputs—the revenue-generating products and services which pay the bills. But the tendrils of innovation should also reach out to address the needs of that most crucial of inputs—the people who make things happen. One way to “walk the walk” and enable diversity—not just hand-wring, produce endless virtue-signalling tweets and give nice talks—is to innovate your recruitment and working practices by offering many more remote-working opportunities. By doing so you will open up a route to employment which immediately reaches out into every corner of the country and into every community. Why fish from a pool of talent when you can trawl the deeper ocean? As someone who spent three years remote-working for a US publisher, I can say, from real experience, that it can work—and be very effective.

It does not require penetrating insight to realize that a great deal of UK publishing activity is clustered around a small number of key locations. To some, it may sound heretical to suggest that not everyone actually wants, or is able, to work in any of the handful of locations into which so much of our publishing activity has coalesced. Family commitments, disability or just the sheer expense mean that many highly employable and talented people simply cannot relocate or commute. The crippling cost of housing, or the UK’s grotesquely expensive rail fares, are, for many, huge barriers to employment within the centres of our publishing universe. But you have to “go where the work is,” right? Is that really true in this highly advanced economy—with (albeit not universal) access to high-speed communications, cloud-based software systems and mobile technology? Have our recruitment thinking, working patterns and management practices really failed to evolve at the speed of technology? Why can’t more work go to where the employees are? Yes, of course, most publishers use a lot of freelancers and contractors who work remotely, but not everyone wants to be self-employed—many just want a job with a regular income. Recruitment agencies can play a big part here by being pro-active and asking employers if they’ll consider remote working, and on what terms—you’ll almost certainly attract more candidates too.

Obviously, no-one could sensibly claim that remote working is possible for all publishing jobs in every publisher, or that remote working has no impact on teams who are office-bound. Equally, not everyone wants to work remotely or has the temperament to do so. Without question, there are organizational, technical—and management culture—issues to consider: no-one should pretend there’s a secret panacea. However, unless there’s a conscious effort to look into providing remote-working opportunities, to document and identify the challenges and pro-actively address them, then publishers will continue to limit their recruitment options and, perhaps, draw from an unnecessarily restricted subset of our national talent. Employers who enable their employees to work remotely may be surprised at the level of commitment and dedication received in return—if someone desperately wants that job but needs to work remotely, and is given the opportunity to do so, chances are they’ll move heaven and earth to do their very best work for that employer.

by Graham Douglas at Friday, 2016-12-09 16:34

2016-12-08

TUG

TUGboat 37:3 published

TUGboat volume 37, number 3 (a regular issue) has been mailed to TUG members. It is also available online and from the TUG store. Please consider joining or renewing your TUG membership if you haven't already.

Thursday, 2016-12-08 18:53

Internationale Gutenberg-Gesellschaft in Mainz e.V.

Weihnachtspause / Christmas Break

We are on Christmas break from 20.12.2016 up to and including 09.01.2017; the office of the Gutenberg-Gesellschaft will be there for you again as usual from 10.01.2017. We wish you and your loved ones a very merry Christmas and a happy New Year.

Thursday, 2016-12-08 13:34

2016-12-06

LaTeX Project

Videos from the memorial for Sebastian Rahtz (13.2.1955 - 15.3.2016)

Earlier this year our good friend and colleague Sebastian Rahtz passed away — with him the TeX community lost one of its very influential members.

On September 27th 2016 the Oxford e-Research Centre hosted a memorial session for Sebastian at Wolfson College, Oxford, titled “SPQR a digital legacy: what Sebastian Patrick Quintus Rahtz did for us”. Videos of the event have now been made available and they are certainly worth watching.

Phil Taylor talks about Sebastian’s contributions to the TeX world and Anne Trefethen reads a short statement from Don Knuth. I very much enjoyed listening to Joe Talbot’s talk about “What Sebastian Taught Us” and there are many more presentations that are all worth watching.

I wrote a short memorial on Sebastian for Die TeXnische Komödie (the newsletter of the German TeX Users Group, DANTE). It was also translated to English as a TUGboat article. Both articles can be found on the Publications page.

Sebastian and his legacy will stay with us.

Frank

Tuesday, 2016-12-06 23:00

Beautiful Type

Great details on that ambigram from @markcaneso. Full shot on...



Great details on that ambigram from @markcaneso. Full shot on @friendsoftype account. http://ift.tt/2gY4ll1

Tuesday, 2016-12-06 10:29

2016-12-05

Reden ist Silber ¶ Druckerey Blog (Martin Z. Schröder, Drucker)

An attentive visitor [3]

My four-year-old godson was visiting. But I had no time to play; I had to print. Then he would stay a little and watch, said the boy. And he climbed onto two wobbly ink tins and stretched his nose over the imposing surface.

“Are you printing green?” – “No, today I’m printing red.” – “When will you print green again?” – “I don’t know; I have nothing green to print at the moment, tomorrow I’m printing blue.” – “Do print green again sometime.” – “There are very many colours. Look, this is a colour fan from which you pick the colours that are to be printed.” The child studies the HKS fan and spots leftover gold on the table. “And gold?” – “I have gold pigments here.” I take the pot with the lump of gold from the shelf. “That is a lump of pigment. If you want to print gold, you take a bit of it and a little varnish and mix up the ink.” – “Then it becomes liquid.” I agree, and hang the forme, assembled during our conversation, into the press.

Now I find that I have no clean palette knife left. I pile the colourful knives up in front of me and reach for the solvent. “Careful, it’s going to smell now.” The child sniffs at the solvent. “I’d prefer you to move your nose a little away from this spot, because the stuff isn’t healthy.” – “But then I can’t see anything.” – “Wait!” I fetch a ladder and set it up in the middle of the press room. The boy takes his seat at the top and beams down. Now every move has to be explained. I don’t use any special language for the child; I don’t believe in that. The child, asking with interest, learns about forme wash and roller wash, fire hazard and poison (“Oh, that’s why you have a fire extinguisher!”). I put red ink into the inking unit of the Heidelberg. “Why does the ink spread?” I let the machine run slowly and explain: “Do you see this big shiny cylinder? It makes a sideways movement, right, left, right, left, back and forth all the time. And the small rubber rollers here turn against it on the cylinder. That squashes the ink and spreads it out wide.” He nods. After the first impression the forme is set to the middle of the format. I measure with typometer and loupe and calculate aloud and briskly in ciceros and points of the duodecimal system. The boy says: “I’m learning arithmetic from you.” – “You can probably do arithmetic already.” – “No.” We establish that he certainly knows what one and one makes, and we don’t need to go deeper into it just now. “I can learn everything from you.” The child, as one can see, has mastered the keyboard of compliments. Then the run is printed.

Playing with Matchbox cars later, the boy remarks that I own a great deal. A great many good things for making something with. The child is not short of toys, but in a workshop things are transformed. A piece of corrugated cardboard becomes a bridge. A wooden block becomes a barrier. Scraps of paper get a hole from the awl and become parts for the crane. If you glue two pieces of cardboard together and fold the edges upwards, you get a long trailer for the crane. It may be that everything here is merely raw material. That is an important lesson: to look at things and see in them what can be made of them.

Questions for society: what do we lose when we fence off work and the supposedly uninvolved from one another the way we do? Every workshop, whether ties are cleaned in it or shoes soled and polished or keys cut or bread baked or soup cooked or a sock darned or a picture painted or a stool repaired, is an officina, a place where something is made. Even a supermarket, as a place of trade, is in itself an interesting site. In the past, living rooms and kitchens were workshops, as were yards and cellars. With convenience food and throwaway clothing the home has lost a significant part of its function as an officina. In return there are screens on which you can watch funny films in which workshops or factories appear.

It is quite romantic when a child sits on a ladder, watching the bustle of the workshop, commenting and asking questions. As an adult you may think: the printing-shop uncle with his ties and bow ties, the blue apron, the reading glasses – a Meister Eder! And while I do indeed stomp happily through my picture-book workshop in every one of those minutes, I think to myself: families can have this at home, with cooking even every day. You just have to want to do something yourself, and ideally enjoy it; only then does it become interesting for children. Children come and go, join in a little, drop it again, and the time after next they can do something that astonishes us. In my experience they are all alike in this. – I know two nine-year-olds who can bake a cake. Without a baking mix or ready-made dough. Their mother is a magnificent cook. – It probably lies in us from the very beginning that we want to join in, that we want to understand how things come into being and how things are made. And unfortunately it is also down to us when we withhold this from our children, with the unspeakably mistaken justification that this way we can better prepare them for being grown up, which apparently means no longer making anything yourself. With long days in childcare institutions. With ready-made toys in children’s rooms. But such moments cannot be manufactured in a pedagogical setting: a child on a ladder in a machine room where someone goes about his trade and lets himself be watched, and the child wants to learn arithmetic.

by Martin Z. Schröder at Monday, 2016-12-05 07:00

2016-12-01

STM Publishing: Tools, Technologies and Change (by Graham Douglas)

The flavours of TeX—a guide for publishing staff: LaTeX, pdfTeX, pdfLaTeX, XeTeX, XeLaTeX, LuaTeX et al

Summary

This is not a technical article on using TeX (i.e., TeX installation or programming). Instead, it offers some background information for people who work in STM (scientific, technical and medical) publishing and aims to provide an easy-to-follow explanation by addressing the question “what is TeX?”—and, hopefully, to demystify some confusing terminology. My objective is, quite simply, to offer an introduction to TeX-based software for new, or early-career, STM publishing staff—especially those working in production (print or digital). Just by way of a very brief bio, as in “am I qualified to write this”: I’m writing this piece based on my 20+ years of experience of STM publishing, having worked in senior editorial positions through to technical production and programming roles. In addition, over the last few years I have spent a great deal of time building and compiling practically every TeX engine from its original source code, together with creating my own custom TeX installation to explore the potential of production automation through modern TeX-based software.

Introduction

If you work in STM (scientific, technical and medical) publishing, especially within mathematics and physics, chances are that you’ve heard of something called “TeX” (usually pronounced “tech”)—you might also have encountered, or read about, authors using tools called LaTeX, pdfTeX, pdfLaTeX, XeTeX, XeLaTeX, LuaTeX, LuaLaTeX etc. Unless you are a TeX user, or familiar with the peculiarities of the TeX ecosystem, you may be forgiven for feeling somewhat confused as to what those terms actually mean. If you are considering working in STM publishing and have never heard of TeX, then I should just note that it is software which excels at typesetting advanced mathematics and is widely used by mathematicians, physicists and computer scientists to write and prepare their journal articles, books, PhD theses and so forth. TeX’s roots date back to the late 1970s, but over the intervening decades new versions have evolved to provide considerable enhancements and additional functionality. Those new to STM publishing, or considering it as a career, may be surprised to learn that a piece of software dating back to the late 1970s is still in widespread use by technical authors—and publishing workflows.

NOTE: TeX is not just for mathematics. It is a common misconception that the use of TeX is restricted to scientific and technical disciplines—typesetting of complex mathematics. Whilst it finds most users in those domains, TeX is widely used for the production of non-mathematical content. In addition to typesetting mathematics, modern TeX engines (XeTeX and LuaTeX) provide exquisite handling of typeset text, support for OpenType font technologies, Unicode support, OpenType math fonts (as pioneered by Microsoft Word), multilingual typesetting (including Arabic and other complex scripts) and output directly to PDF. LuaTeX, in particular, is incredibly powerful because it also has the Lua scripting language built into its typesetting engine, offering (for example) almost unlimited scope for the automated production/typesetting of highly complex or bespoke documentation, books and so forth. LuaTeX also provides you with the ability to write plugins to extend its capabilities. Those plugins are usually written in C/C++ to perform specialist tasks—for example: graphics processing, parsing XML, specialist text manipulation, on-the-fly database queries or, indeed, pretty much anything you might need to do as part of your document production processes. If you don’t want the complexities of writing plugins, chances are you can simply use the Lua scripting language to perform many of your more complex processing tasks.
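
As a small, hedged illustration of my own (not from the article) of what “Lua built into the typesetting engine” means in practice, the following LuaLaTeX fragment calls Lua in the middle of a document and feeds the result back to the typesetter:

% A minimal sketch: compile with lualatex.
% \directlua runs Lua code during typesetting; tex.sprint() returns text to TeX.
\documentclass{article}
\begin{document}
Six times seven is \directlua{tex.sprint(6 * 7)}.
\end{document}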

Irrespective of the tools used by authors to write/prepare their work, the lingua franca of today’s digital publishing workflows—especially journals—is XML, which is generated from the collection of text and graphics files submitted by authors. Most publishers now outsource the generation of XML to offshore companies usually based in countries such as India, China or the Philippines. Many production staff usually do not have to worry (too much) about the messy details of conversion—provided the XML passes quality control procedures and is a correct and faithful representation of the authors’ work. The future is, of course, online authorship platforms which remove the need for this expensive conversion of authors’ work into XML—but we’re still some way from that being standard practice: old habits die hard, so Microsoft Word and TeX will be around for some time, as will the need for conversion into XML.

And so to TeX: A brief history in time

My all-time favourite quote comes from the American historian Daniel J. Boorstin who once noted that:

“Trying to plan for the future without a sense of the past is like trying to plant cut flowers.”

In keeping with the ethos of that quote I’ll start with a very brief history of TeX.

On 30 March 1977 the diary of Professor Donald Knuth, a computer scientist at Stanford University, recorded the following note:

“Galley proofs for vol. 2 finally arrive, they look awful (typographically)… I decide I have to solve the problem myself”.

That small entry in Professor Knuth’s diary was the catalyst for a programming journey which lasted several years, and the outcome of that epic project was a piece of typesetting software capable of producing exquisitely typeset mathematics and, of course, text: that program was called TeX. Along the way, Knuth and his colleagues designed new and sophisticated algorithms to solve some very complex typesetting problems, including automatic line breaking, hyphenation and, of course, mathematical typesetting. As part of the development, Knuth needed fonts to use with his typesetting software, so he also developed his own font technology called METAFONT, although we won’t discuss that in any detail here.

To cut short a very long story, TeX proved to be a huge success—in no small part because Knuth took the decision to make TeX’s source code (i.e., program code) freely available, meaning that it could be built/ported, for free, to work on a wide range of computer systems. TeX enabled mathematicians, physicists, computer scientists and authors from many other technical disciplines to have exquisite control over typesetting their own work, producing beautifully typeset material containing highly complex mathematical content. Authors could use TeX to write and prepare their books and papers, and submit their “TeX code” to publishers—usually assured of a greater degree of certainty that their final proofs would not suffer the same fate as Knuth’s.

TeX: Knuth maintains his version, but others have evolved

Even today, nearly 4 decades after that fateful genesis of TeX, Professor Knuth continues to make periodic bug fixes to the master source code of his version of TeX—which is archived at ftp://ftp.cs.stanford.edu/tex/ and available from other sources, such as CTAN (Comprehensive TeX Archive Network). Those updates take place every few years with the latest being “The TeX tuneup of 2014” as reported in the journal TUGboat 35:1, 2014. During those “tuneups” Knuth does not add any new features to TeX, they really are just bug fixes. In the 1980s Knuth decided that in the interest of achieving long-term stability he would freeze the development of TeX; i.e., that no new features would be added to his version of  TeX. I specifically mentioned “his version of TeX” because Knuth did not exclude or prevent others from using his code to create “new versions of TeX” which have additional features and functionality. Those “new versions” are usually given names to indicate that whilst they are based on Knuth’s original they have additional functionality—hence the addition of prefixes to give program names such as pdfTeX, XeTeX and LuaTeX.

Huh—what about LaTeX? At this point you might be wondering why I have not mentioned LaTeX, and it is a good question. Just to jump ahead slightly, the reason I am not mentioning LaTeX (at this point) is because LaTeX is not a version of the executable TeX typesetting program—it is a collection of TeX macros, a topic which I will discuss in more detail below.

At this point, I’ll just use the term “TeX” (in quotes) to refer to Knuth’s original version and all its later descendants (pdfTeX, XeTeX, LuaTeX).

So, what does “TeX” actually do?

As noted, “TeX” is a typesetting program—but if you have formed a mental image of a graphical user interface (GUI), such as Adobe InDesign, then think again. At the time of TeX’s genesis, in the late 1970s, today’s sophisticated graphical interfaces and operating systems were still some way into the future and TeX’s modus operandi still reflects its heritage—even for the new modern variants of TeX. Those accustomed to using modern page layout applications, such as Adobe InDesign, may be surprised to see how TeX works. Suppose someone gives you a copy of a “TeX” executable program and you want to use it to do something, how do you do that? “TeX” uses a so-called command-line interface: it has no fancy graphical screen into which you type your text to be typeset or point, click, tap to set options or configurations. If you run the “TeX” program you see a simple screen with a blinking cursor. Just by way of example, here’s the screen I see when I run LuaTeX (luatex.exe on Windows):

luatex

Clearly, if you want a piece of software to typeset something, you will need to provide some form of input (material to typeset) in order to get some form of output (your typeset material). Your input to the typesetting program will not only need to contain the material to be typeset but will also require some instructions to tell a typesetting program which fonts to use, the page size and a myriad of other details controlling the appearance of the typeset results. To typeset anything with “TeX” you provide it with an input text file containing your to-be-typeset material interspersed with “typesetting instructions” telling “TeX” how to format/typeset the material you have provided: i.e., what you want it to achieve. And here is where “TeX” achieves its legendary power and flexibility. The “typesetting instructions” that control “TeX’s” typesetting process are written using a very powerful programming language—one that Professor Knuth designed specifically to provide users with enormous flexibility and detailed control of “TeX’s” typesetting capabilities. So we can now start to see that “TeX” is, in fact, a piece of typesetting software that users can direct and control by providing it with instructions written in a programming language. You should think of “TeX” as an executable program (“typesetting engine”) which understands the TeX typesetting language.

A tiny example

Just to make it clear, here is a tiny example of some input to “TeX”—please do not worry about the meaning of the strange-looking markup (“TeX” commands that start with a “\”). The purpose here is simply to show you what input to “TeX” looks like:

$$\left| 4 x^3 + \left( x + {42 \over 1+x^4} \right) \right|.$$

And here is the output (as displayed in this WordPress blog using the MathJax-LaTeX plugin):

\[\left| 4 x^3 + \left( x + {42 \over 1+x^4} \right) \right|.\]

So, in order to produce your magnum opus you would write and prepare a text file containing your material interspersed with “TeX” commands and save that to a file called, say, myopus.tex and then tell your “TeX” engine to process that file. If all goes well, and there are no bugs in your “TeX” code (i.e., “TeX” programming instructions) then you should get an output myopus.pdf containing a beautifully typeset version of your work. I have, of course, omitted quite some detail here because, as I said at the start, this is not an article about running/using “TeX”.
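
Purely to make that workflow concrete, here is a hedged sketch of my own (re-using the formula shown above) of what such a myopus.tex could contain; myopus is, of course, just the placeholder name used above, and running, say, pdftex myopus.tex on it should produce myopus.pdf:

% myopus.tex -- a minimal plain-"TeX" input file: text interspersed with commands
\font\body=cmr12 \body   % load the 12pt Computer Modern font and switch to it
My magnum opus begins with a displayed formula:
$$\left| 4 x^3 + \left( x + {42 \over 1+x^4} \right) \right|.$$
\bye                     % tells "TeX" that the input has ended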

“TeX” the program (typesetting “engine”) and “TeX” the typesetting language

So, the word “TeX” refers both to an executable program (the “TeX” typesetting engine) and the set of typesetting instructions that the engine can process: instructions written in the “TeX” language. Understanding that the executable “TeX” engine is programmable is central to truly appreciating the differences between LaTeX, pdfTeX, pdfLaTeX, XeTeX, LuaTeX and so forth.

Each “TeX” engine (program) understands hundreds of so-called primitive commands. Primitive in this sense does not mean “simple” or “unsophisticated”, it means that they are the fundamental building blocks of the TeX language. A simple, though not wholly accurate, analogy is the alphabet of a particular language: the individual characters of the alphabet cannot be reduced to simpler entities; they are the fundamental building blocks from which words, sentences etc are constructed.

And finally: from TeX to pdfTeX, XeTeX and LuaTeX

Just to recap. When Knuth wrote the original version of “TeX” he defined it to have the features and capabilities that he thought were sufficient to meet the needs of sophisticated text and mathematical typesetting based, of course, on the technology environment of that time—including processing and memory of available computers, font technologies and output devices. Knuth’s specification of “TeX” included its internal/programming design (“TeX’s” typesetting algorithms) and, of course, defining the “TeX” language that people can use to “mark up” the material to be typeset. What I mean by “defining the TeX language” is defining the set of several hundred primitive commands that the “TeX” engine can understand, and the action taken by the “TeX” engine whenever it encounters one of those primitives during the processing of your input text.

Naturally, technology environments evolve: computers become faster and have more storage/memory, new font technologies are released (Type 1, TrueType, OpenType), file output formats evolve (e.g., the move from PostScript to PDF) and Unicode has become the dominant way to encode text. Naturally, “TeX” users wanted those new technologies to be supported by “TeX”—in addition to incorporating ideas for, and improvements to, the existing features and capabilities of Knuth’s original TeX program. As noted earlier, in the 1980s Knuth decided to freeze his development of TeX: no more new features in his version—bug fixes only. With the genuine need to update/modernize Knuth’s original software, TeX programming experts have taken Knuth’s original source code and enhanced it to add new features and provide support for modern typesetting technologies. The four-decade history of TeX’s evolution is quite complex but if you really want the full story then read this article by Frank Mittelbach: TUGboat, Volume 34 (2013), No. 1.

These new versions of TeX not only provide additional features (e.g., outputting direct to PDF, supporting OpenType fonts) they also extend and adapt the TeX language too: by adding new primitives to Knuth’s original set, thus providing users with greater programming power and flexibility to control the actions of the typesetting engine. Each new TeX engine is given its own name to distinguish it from Knuth’s original software: hence you now have pdfTeX, XeTeX and LuaTeX. These three TeX engines are not 100% compatible with each other and it is quite easy to prepare input that can be processed with one TeX engine but fail to work with others—simply because a particular TeX engine may support primitive commands that the others do not. But all is not lost: enter the world of TeX macros!

Primitives are not the whole story: macros and LaTeX

I have mentioned that each TeX engine supports a particular set of low-level commands called primitives—but this is not the full story. Of course, many of the same primitives are supported by all engines but some are specific to a particular engine. “TeX” achieves its true power and sophistication through so-called TeX macros. The primitive commands of an engine’s TeX language can be combined to define new commands, or macros, built from low-level primitive instructions—and/or other macros. TeX macros allow you to define new commands that are capable of performing complex typesetting operations, saving a great deal of time, typing and programming errors. In addition, TeX engines provide primitives that you can use to detect which TeX engine is being used to typeset a document—so that macro code can adapt its behaviour on the fly, depending on whether the engine in use supports a particular primitive it might encounter. If a particular primitive is not supported directly but can be “mimicked” (using combinations of other primitives) then all is usually well—but if the chosen TeX engine really cannot cope with a particular primitive then typesetting will fail and an error will be reported.
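
To make both ideas concrete, here is a hedged sketch of my own (not from the article): a new macro built from primitives, plus the common trick of testing for an engine-specific primitive (\directlua exists only in LuaTeX) so that the same input can adapt to the engine in use:

% engine-check.tex -- a minimal sketch; run it with pdftex, xetex or luatex
\def\gap{\hskip 2em\relax}   % a new macro built from the primitives \def and \hskip
\ifdefined\directlua         % \directlua is a primitive that only LuaTeX provides
  \message{This run uses LuaTeX}
\else
  \message{This run does not use LuaTeX}
\fi
Left\gap right.
\bye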

The TeX language is, after all, a programming language—albeit one designed to solve typesetting problems; but as a programming language TeX is extremely arcane and works very differently to most programming languages you are likely to encounter today.

So, finally, what is LaTeX?

We’ve talked about various versions of the TeX engine—from Knuth's original TeX to its descendants pdfTeX, XeTeX and LuaTeX—and briefly discussed TeX as a typesetting language: primitives, programming and the ability to write macros. Finally, we are in a position to discuss LaTeX. The logical extension to writing individual TeX macros for some specific task you want to solve, as an individual, is to prepare a collection of macros that others can also use—a package of macros that collectively provide some useful tools and commands that others can benefit from. And that is precisely what LaTeX is: it is a very large collection of complex and sophisticated macros designed to help you typeset books, journal papers and so forth. It provides a wealth of features to control things like page layout, fonts and a myriad of other typesetting details. Not only that, but LaTeX was designed to be extensible: you can plug in additional, more specialist, macro packages written to solve specific typesetting problems—e.g., producing nicely typeset tables or typesetting particularly complex forms of mathematics. If you visit the Comprehensive TeX Archive Network you can choose from hundreds, if not thousands, of macro packages that have been written and contributed by users worldwide.
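
To make that tangible, here is a deliberately tiny sketch of my own: the LaTeX macros provide the document structure, and one contributed package from CTAN (amsmath, used here purely as an example) is plugged in on top:

\documentclass{article}   % a LaTeX document class: page layout, sectioning, etc.
\usepackage{amsmath}      % a contributed macro package pulled in from CTAN
\begin{document}
\section{A tiny example}
LaTeX macros typeset the heading above and the equation below:
\begin{equation}
  \left| 4x^{3} + \left( x + \frac{42}{1+x^{4}} \right) \right|
\end{equation}
\end{document}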

So, if someone says they are typesetting their work with LaTeX then they are only telling you part of the story. What they really mean is that they are using the LaTeX macro package with a particular TeX engine—usually pdfTeX, but maybe XeTeX (for multilingual work) or LuaTeX (perhaps for advanced customized document production). Sometimes you will see terms such as pdfLaTeX, XeLaTeX or even LuaLaTeX: these are not actually the names of TeX engines; all they signify is which TeX engine is being used to run LaTeX. For example, if someone says “I am using pdfLaTeX”, what that really means is “I am preparing my typeset documents using the LaTeX macro package and processing it with the pdfTeX engine”. Equally, if anyone says to you that they are “using TeX” then, I hope, you now see that that statement does not actually tell you the whole story.

by Graham Douglas at Thursday, 2016-12-01 17:55

2016-11-30

Some TeX Developments (by Joseph Wright)

Dependencies

There’s been some recent discussion on the TeX Live mailing list about recording dependencies for (La)TeX packages. This is a good idea but means that package authors need to think about their dependency situation. So I thought a few words on this would be helpful, at least from the point of view of the most common case: LaTeX packages.

It’s pretty easy to accumulate \RequirePackage lines in your source, but if you are serious about giving a useful set of dependencies you need to know what each one is for. In many ways the rule is easy: require each package you use. What makes that more complicated is that you might use features which are available when you load package X but are actually provided by package Y. For example, if you load my siunitx package, it loads array, which means that you can, for example, write

\begin{tabular}{>{$}l<{$}}

So how do you tell what your ‘real’ dependencies are? The usual rule is that you check the documentation: does it say that package X itself provides the features you use? In the case above, siunitx doesn’t document that syntax extension for tabular: it’s documented by array. So if you wrote a package that uses siunitx but also needs to use features from array you should

\RequirePackage{array}
\RequirePackage{siunitx}

This means that even if at some future stage there’s a change in the internals of a package you load, things should still all work.

If you want to track down where stuff might be coming from, you can always use \listfiles to get a full overview of your current package use (starting from a small example).
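
For instance, a minimal test file of the kind meant here might look as follows (a sketch of my own); with \listfiles in place, the log ends with a ‘File List’ showing that loading siunitx has in fact pulled in array as well:

\listfiles                 % ask LaTeX to report every loaded file and its version
\documentclass{article}
\usepackage{siunitx}       % loads array internally; the file list makes that visible
\begin{document}
\SI{9.81}{\metre\per\second\squared}
\end{document}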

There are a few places where packages are so closely linked you might not have to list them both. The most obvious is TikZ/pgf: the two are different ‘layers’ of the same setup but are documented together, so if you load TikZ you can assume pgf. Of course, there is no harm in listing both!

by josephwright at Wednesday, 2016-11-30 10:53

2016-11-28

Beautiful Type

Really nice lettering by @markcaneso. Don’t forget to buy...



Really nice lettering by @markcaneso. Don’t forget to buy his t-shirt on @cottonbureau ;-) http://ift.tt/2gpor7q

Monday, 2016-11-28 20:31

2016-11-27

Some TeX Developments (by Joseph Wright)

Beamer moves to GitHub

There are lots of places one can host development of open source code. I’ve used a few over the years, but in recent times have mainly focussed on GitHub. That’s true not least because the LaTeX3 development code is held there. The one package I’m involved in that has to date been hosted elsewhere is beamer: there are lots of issues in its tracker which I didn’t want to lose, so for some time it’s been slightly orphaned on BitBucket. I’ve now (finally) been able to migrate everything using a very handy script maintained by Jeff Widman. I’m making final arrangements on the move, but the key point is that new issues should go to GitHub.

by josephwright at Sunday, 2016-11-27 10:14

2016-11-26

LaTeX Project

New procedure for reporting LaTeX bugs

The LaTeX Project Team maintains a bugs database for the core LaTeX software (LaTeX kernel + packages maintained by the team). Because we get more and more bug reports that we can’t help with (as they are for one of the many third-party packages out there), we have written a small package that helps with classifying issues and identifies the correct addressee for a bug report. This package should be used in every test file showing a bug, prior to reporting that bug to us.

The new procedure and some helpful information is now described in a dedicated bugs page that you find here.

Saturday, 2016-11-26 23:00

2016-11-24

Internationale Gutenberg-Gesellschaft in Mainz e.V.

24.11.16 Jour Fixe der Freunde Gutenbergs

The Jour Fixe of the Freunde Gutenbergs invites you to further talks for the last time this year. The topic will be announced.
Time: 12.00 – 14.00
Venue: Cuvée 2016, Gutenberg-Museum, Liebfrauenplatz 5, 55116 Mainz
Organizer: IGG
Contact: Hartmut Flothmann (h.g.flothmann@t-online.de)

Thursday, 2016-11-24 11:44

2016-11-21

LaTeX Project

Issue 10 of LaTeX3 news released

Tenth issue of LaTeX3 news released

It has been a while since we published an issue on LaTeX3 development topics, but that doesn’t mean nothing has happened in the meantime. On the contrary.

Issue number ten of the LaTeX3 news brings some info about testing LaTeX (or even non-LaTeX) packages using l3build, refinements to expl3, an experimental extension to xparse, and globally optimized pagination of documents.

Monday, 2016-11-21 00:00

2016-11-20

TeX Stack Exchange

TikZ people

Why do we make fancy things with TikZ? Because we can.

Nils Fleischhacker created tikzpeople, a package for drawing people shapes with TikZ. It was heavily inspired by Microsoft Visio’s people shapes. It provides real pgf shapes with anchors for convenient use in TikZ drawings:

tikz-anchors

The original purpose was using people shapes in drawings that illustrate parties in cryptographic protocols. You know, famous Alice and Bob.

You can use the shapes with simple commands:

\node[businessman, minimum size=1.5cm] at (0,0) {};

Since he was on a roll, he created many further shapes. Here are some samples; like all images here, they are taken from the package manual:

tikz-people-1

tikz-people-2

And there are options for customizing the shapes, such as colors and patterns for skin, hair, shirt, hat, trousers, buttons, and more details, plus some general additions. For example, a kind of frequently used standard symbol (or not) is the female, good but also evil, mirrored chef/cook in front of a monitor:

\node[chef, evil, female, good, mirrored, monitor, minimum size=1.5cm] {};

tikz-cook

Why did he? Because he can.

by stefan at Sunday, 2016-11-20 10:34

2016-11-16

Internationale Gutenberg-Gesellschaft in Mainz e.V.

16.11.2016 Longdrinks im Baron: Has Amazon finally arrived in the book trade?

In a relaxed get-together we will this time turn to the topic of “Amazon”: in recent years Amazon has founded the publishing house Amazon Publishing and opened brick-and-mortar bookshops under the name Amazon Books. This raises the question of why, after so many years, Amazon is now after all seeking its way into the traditional book trade. Everyone is warmly invited! We regularly publish current information and tips on the topic on our Facebook page https://www.facebook.com/Int.GutenbergGesellschaft/. Time: from 18:00 (until about 20:00). Venue: Baron (university campus, Johann-Joachim-Becher-Weg 3, 55128 Mainz)

Wednesday, 2016-11-16 12:12

2016-11-14

2016-11-12

2016-11-06

2016-11-03

LaTeX Project

Future releases of LaTeX will require an eTeX-enabled engine

LaTeX2e was released in 1994 and since then the LaTeX3 Project have been committed to keeping it working smoothly for users. That means balancing up keeping the code stable with fixing bugs and adding new features.

Back in 2003 the team announced that the eTeX extensions would be used by the kernel when they were available. The new primitives offered by eTeX make many parts of TeX programming easier and often there’s no way in ‘classical’ TeX to get the same effect. As eTeX was finalised in 1999, starting to use it seriously in around 2004 meant most people had access to them.

Since then, the availability and use of eTeX has spread, and almost all users have them available. Indeed, the standard format-building routines for LaTeX have included them for many years. There are also a lot of packages on CTAN that use eTeX, most obviously any using the expl3 programming language that the LaTeX3 Project have created.

The team had always meant to say at some stage that eTeX was now required, and indeed thought we had until we checked over the official newsletters! So as of the next LaTeX2e release, scheduled for the start of 2017, the kernel will only build if eTeX is enabled. For this release, we are likely to add a test for eTeX but no actual use directly in the kernel, though in the future there will probably be more use of the extensions.

Thursday, 2016-11-03 23:00

2016-11-01

TeX Stack Exchange

e-TeX and LaTeX

Our fellow moderator Joseph wrote on his blog www.texdev.net, that from the next release on LaTeX2e will require e-TeX.

Why? Wasn’t LaTeX2e intended to be stable, nowadays only getting bug fixes and new features? Could it break something? Well, e-TeX was finished in 1999. In December 2003, the LaTeX team wrote:

We expect that within the next two years, releases of LaTeX will change modestly in order to run best under an extended TeX engine that contains the e-TeX primitives, e.g., e-TeX or pdfTeX.

So we were warned a long time ago. 🙂 Well, e-TeX is in the major TeX distributions, and it is enabled by default. And note that the latexrelease package provides both forward and backward compatibility of the LaTeX kernel.
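
As a concrete sketch of my own (the date below is only an illustrative value), latexrelease is loaded before \documentclass with the date of the kernel release whose behaviour you want:

\RequirePackage[2016/03/31]{latexrelease}  % request the kernel as of this release date
\documentclass{article}
\begin{document}
This document is typeset with the kernel rolled back (or forward) to the requested date.
\end{document}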

Why use e-TeX? One important reason is that TeX originally has only 256 registers, such as count registers for counters and dimen registers for lengths (and skip, muskip, toks, insert and box registers), while e-TeX makes 32768 registers available. Since 2015, LaTeX has been using the extended range whenever e-TeX is detected.

But there’s more, such as a bunch of additional primitives. Let’s summarize a bit:

  • Extended register range
  • \unexpanded, which prevents expansion of a whole token list
  • \unless that lets you negate \if commands
  • \readline for reading in text with special characters, such as %, $, \, &, _, or ^
  • Bidirectional typesetting
  • \detokenize for converting tokens into simple text strings
  • The \middle delimiter that works like \left and \right

Just to mention what came first to my mind.
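
Two of those primitives in action, as a minimal sketch of my own (plus \numexpr, another e-TeX addition for expandable arithmetic):

\documentclass{article}
\begin{document}
% \unless negates a conditional without needing an \else branch;
% \ifdefined and \detokenize are further e-TeX primitives
\unless\ifdefined\foobar
  \typeout{\detokenize{\foobar}\space is not defined}%
\fi
% \numexpr performs expandable integer arithmetic
$6 \times 7 = \the\numexpr 6*7\relax$
\end{document}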

e-TeX and expl3 together are the programming layer that LaTeX3 is built on. Now LaTeX2e is committed to it as well.

by stefan at Tuesday, 2016-11-01 10:05

2016-10-31

Some TeX Developments (by Joseph Wright)

LaTeX2e and e-TeX

LaTeX2e was released in 1994 and since then the LaTeX3 Project have been committed to keeping it working smoothly for users. That means balancing up keeping the code stable with fixing bugs and adding new features.

Back in 2003 the team announced that the e-TeX extensions would be used by the kernel when they were available. The new primitives offered by e-TeX make many parts of TeX programming easier and  often there’s no way in ‘classical’ TeX to get the same effect. As e-TeX was finalised in 1999, starting to use it seriously in around 2004 meant most people had access to them.

Since then, the availability and use of e-TeX has spread, and almost all users have them available. Indeed, the standard format-building routines for LaTeX have included them for many years. There are also a lot of packages on CTAN that use e-TeX, most obviously any using the expl3 programming language that the LaTeX3 Project have created.

The team had always meant to say at some stage that e-TeX was now required, and indeed thought we had until I checked over the official newsletters! So as of the next LaTeX2e release, scheduled for the start of 2017, the kernel will only build if e-TeX is enabled. For this release, we are likely to add a test for e-TeX but no actual use directly in the kernel, though in the future there will probably be more use of the extensions.

by josephwright at Monday, 2016-10-31 15:38

2016-10-29

TeX Stack Exchange

xparse revisited

At the recent UK-TUG meeting, I spoke about one of the most popular parts of the code written by the LaTeX3 Project, xparse. TeX-sx regulars will doubtless have come across xparse, either in one of the tagged questions or from one of the over 2400 posts featuring \NewDocumentCommand!

At the UK-TUG meeting, I was revisiting xparse following a talk I gave back in 2009. Since then, the LaTeX team have worked on xparse, we’ve learned what it’s good at and we’ve had a lot of user feedback. So it seemed like a good opportunity to talk about it again.

I focussed on two areas: what xparse can do for the end user, and how that leads on to seeing its syntax as a language for describing LaTeX commands. For end users, xparse is a more-or-less complete way to describe document commands, and goes well beyond what \newcommand can do. My usual example is the syntax of \newcommand itself:

\DeclareDocumentCommand \newcommand
  { s +m O{0} +o +m }
  % #1 = Star: \BooleanTrue or \BooleanFalse
  % #2 = Command name
  % #3 = Number of arguments
  % #4 = Default for first argument if optional,
  %      or \NoValue otherwise
  % #5 = Code
  {
    ....
  }

What you’ll notice here is that we can collect up all of the arguments in one place with no need to use (La)TeX code to pick up stars, multiple optional arguments and so on.
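
To sketch how such collected arguments are then used in the body (my own hedged example, not from the talk), xparse also provides tests such as \IfBooleanTF and \IfNoValueTF for branching on the star and on a missing optional argument (with xparse loaded):

\DeclareDocumentCommand \demo { s o m }
  {% #1 = star, #2 = optional argument (or the marker \NoValue), #3 = mandatory
    \IfBooleanTF {#1} {Starred: } {Unstarred: }%
    \IfNoValueTF {#2} {#3} {#3~(#2)}%
  }
% usage: \demo{text}, \demo*{text} or \demo[note]{text}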

The xparse package offers several different types of argument, some of which are more common than others. Once I’d briefly outlined them, I talked about some things that xparse can do that ‘normal’ LaTeX commands can’t, for example correctly parsing nested arguments:

\DeclareDocumentCommand \foo { O{} m }
  {Code using #1 and #2}

\foo[\baz[arg1]]{arg2}

% #1 = \baz[arg1]
% #2 = arg2

I also talked about how we can get more helpful error messages from TeX thanks to some clever internal set-up in xparse.

The idea of abstracting different kinds of LaTeX argument took me to the concept of xparse as a language to describe LaTeX syntax. That’s not limited to being implemented in TeX, and it is also a very convenient shorthand (I use xparse-like descriptors whenever I’m talking about LaTeX commands nowadays, even if the underlying implementation is different). We had some lively discussion at the meeting about how that might fit in with efforts such as LaTeXML, and I’m sure I’ll be following that up in the future.

by Joseph Wright at Saturday, 2016-10-29 21:06

Weblog von Markus Kohm

Replacement urgently wanted!

When I tried to start my living-room notebook this morning, it refused to boot with the terse message “Fan Error”. A quick search showed that the fan is either heavily clogged or defective and therefore does not get up to speed. If you force it to boot anyway (press ESC in time), the main fan apparently spins up quite normally, so that is probably not the culprit. Since, despite the WLAN card, no second fan is installed, the problem must nevertheless be the main fan, which is in fact available on eBay for 10–15 €, sometimes including thermal paste. Unfortunately there are also various reports on the net of the notebook still refusing to boot with a replacement fan, i.e. the error remained. And removing the main fan (unlike the second fan) is a rather laborious affair.

A pity; I had grown fond of the notebook that was given to me a few years ago (thanks again to Uwe and his wife!). True, it did get rather hot lately (especially yesterday during the Linux upgrade), but it is handy and light and booted within seconds from the SSD I had fitted myself (also a gift!). With it I could comfortably do support on the sofa until deep into the night. It also liked to wander with me into the kitchen (yes, I know the air there is really poison for such a device, but it always stood as far away from the stove as possible) and even into the cellar when I had something to do there. So it is no wonder that in recent months 95% of my support work and at least 50% of my LaTeX development ran on that machine. It has truly grown on me.

But now a replacement is needed! If anyone has a notebook comparable to the ThinkPad x61s to spare, I would be very glad to be offered it! A few key figures: 1.6 GHz Intel Core 2 Duo L7500, 12" screen in 4:3 format with 1024x768 pixels, Intel Pro Wireless adapter with max. 300 Mbit/s (realistically only 150, though). RAM and HDD do not matter, since I have both DDR2 and DDR3 modules to upgrade to 4 GB if need be, and the SSD from the ThinkPad would of course be used again.

Comparing the old dual core with current Atom processors such as the Z3735F @ 1.33 GHz or the Z8230, it is striking that they promise almost as much performance. In theory that could mean that a 10–12" Windows tablet with a keyboard cover, or a convertible, with even less weight and even more handiness, could also get me there. Of course I would first have to install Linux on it, which by all accounts is quite possible on various Windows tablets with Intel processors. Storage could be a problem. 32 GB of flash should usually be enough (although in my experience updates then often become a nuisance), but does the same hold for the 2 GB of RAM that the (better, affordable) tablets have to make do with? So if anyone has experience with such a device, feel free to get in touch. That goes all the more, of course, if anyone has one to give away.

So should anyone wonder why I am considerably harder to reach in the near future, you now know the reason.

See you sometime
Markus

by Markus Kohm at Saturday, 2016-10-29 06:29

2016-10-28

2016-10-27

TeX Stack Exchange

LaTeX diary of a pandemic

Several virulent diseases have broken out simultaneously all over the world – that’s the board game Pandemic Legacy. Your team of players travel around the world as disease-fighting specialists, researching cures for the plagues and treating disease hotspots.

Dominik Wagenführ played that game with friends, and he had the idea to write a diary of all the events happening during the game. The publisher allowed him to use pictures of the game for it. He even used a publication service to print it on paper and to bind it.

Sample images by Dominik Wagenführ, http://www.deesaster.org:

pandemic_legacy_buch_cover

pandemic_buch

Very interesting idea. Perhaps you like role-playing games, and would like to create a book of your adventures? LaTeX is great for it!

by stefan at Thursday, 2016-10-27 12:25

Internationale Gutenberg-Gesellschaft in Mainz e.V.

27.10.16 Jour Fixe der Freunde Gutenbergs

The Jour Fixe of the Freunde Gutenbergs invites you to further talks. The topic will again be announced shortly beforehand.
Time: 12.00 – 14.00
Venue: Cuvée 2016, Gutenberg-Museum, Liebfrauenplatz 5, 55116 Mainz
Organizer: IGG
Contact: Hartmut Flothmann (h.g.flothmann@t-online.de)

Thursday, 2016-10-27 10:44

2016-10-26

TeX & Friends (by Jürgen Fenn)

New edition of mathmode.pdf

Herbert Voß has produced a new version of his e-book on mathematical typesetting with LaTeX. Unlike the previous version 2.37 of mathmode.pdf from 2009, which was distributed via CTAN, the updated edition can be downloaded as TUG version 0.23 under the title Mathematical Typesetting with LaTeX from the website of the TeX Users Group. The sources of all examples can also be found there. A few remaining minor errors will be corrected in the coming days; Herbert Voß asks for bug reports to be sent to his address if need be. The author makes all the material available for private use only; in particular, it must not be redistributed via other servers.

Voß, Herbert. Mathematical Typesetting with LaTeX. TUG version 0.23, 25 October 2016.


by schneeschmelze at Wednesday, 2016-10-26 19:50

TeX Stack Exchange

LyX in the cloud

Ever tried running LyX on a Blackberry phone? Or iOS? No problem. Scott told us at LaTeX-Community.org that there’s a cloud service for running LyX. He and I tested it, and it seems to work well. You can reach it at:

https://www.rollapp.com/app/lyx

And this is how it looks (in my cloud…):

lyx-cloud

by stefan at Wednesday, 2016-10-26 07:53

2016-10-25

TeX Stack Exchange

TeX shapes the landscape

Computer games and science fiction movies use procedural landscapes. Why not use TeX to generate some? The approach is:

  • Use the diamond-square algorithm to generate the height map
  • Use Lua to implement the algorithm
  • Plot with pgfplots, which provides a comprehensive interface for 3D plotting
  • Make a surface plot with mesh
  • Use a color map: blue below the water line, green for mountains, white for snow, with a color gradient according to height
  • Matrix size in powers of two, perhaps adding opacity, starting at zero level for islands, …

That’s an example of what we can get:

landschaft1

Another one:

landschaft2

The same with option shader=interp:

landschaft3

The code is:

\documentclass[border=10pt]{standalone}
\usepackage{pgfplots}
\usepackage{luacode}
\begin{luacode*}
function terrain(seed,dimension,options)
-- inner functions come from the Heightmap module
-- Module Copyright (C) 2011 Marc Lepage

local max, random = math.max, math.random

-- Find power of two sufficient for size
local function pot(size)
local pot = 2
while true do
if size <= pot then return pot end
pot = 2*pot
end
end

-- Create a table with 0 to n zero values
local function tcreate(n)
local t = {}
for i = 0, n do t[i] = 0 end
return t
end

-- Square step
-- Sets map[x][y] from square of radius d using height function f
local function square(map, x, y, d, f)
local sum, num = 0, 0
if 0 <= x-d then
if   0 <= y-d   then sum, num = sum + map[x-d][y-d], num + 1 end
if y+d <= map.h then sum, num = sum + map[x-d][y+d], num + 1 end
end
if x+d <= map.w then
if   0 <= y-d   then sum, num = sum + map[x+d][y-d], num + 1 end
if y+d <= map.h then sum, num = sum + map[x+d][y+d], num + 1 end
end
map[x][y] = f(map, x, y, d, sum/num)
end

-- Diamond step
-- Sets map[x][y] from diamond of radius d using height function f
local function diamond(map, x, y, d, f)
local sum, num = 0, 0
if   0 <= x-d   then sum, num = sum + map[x-d][y], num + 1 end
if x+d <= map.w then sum, num = sum + map[x+d][y], num + 1 end
if   0 <= y-d   then sum, num = sum + map[x][y-d], num + 1 end
if y+d <= map.h then sum, num = sum + map[x][y+d], num + 1 end
map[x][y] = f(map, x, y, d, sum/num)
end

-- Diamond square algorithm generates cloud/plasma fractal heightmap
-- http://en.wikipedia.org/wiki/Diamond-square_algorithm
-- Size must be power of two
-- Height function f must look like f(map, x, y, d, h) and return h'
local function diamondsquare(size, f)
-- create map
local map = { w = size, h = size }
for c = 0, size do map[c] = tcreate(size) end
-- seed four corners
local d = size
map[0][0] = f(map, 0, 0, d, 0)
map[0][d] = f(map, 0, d, d, 0)
map[d][0] = f(map, d, 0, d, 0)
map[d][d] = f(map, d, d, d, 0)
d = d/2
-- perform square and diamond steps
while 1 <= d do
for x = d, map.w-1, 2*d do
for y = d, map.h-1, 2*d do
square(map, x, y, d, f)
end
end
for x = d, map.w-1, 2*d do
for y = 0, map.h, 2*d do
diamond(map, x, y, d, f)
end
end
for x = 0, map.w, 2*d do
for y = d, map.h-1, 2*d do
diamond(map, x, y, d, f)
end
end
d = d/2
end
return map
end

-- Default height function
-- d is depth (from size to 1 by powers of two)
-- h is mean height at map[x][y] (from square/diamond of radius d)
-- returns h' which is used to set map[x][y]
function defaultf(map, x, y, d, h)
return h + (random()-0.5)*d
end

-- Create a heightmap using the specified height function (or default)
-- map[x][y] where x from 0 to map.w and y from 0 to map.h
function create(width, height, f)
f = f and f or defaultf
-- make heightmap
local map = diamondsquare(pot(max(width, height)), f)
-- clip heightmap to desired size
for x = 0, map.w do for y = height+1, map.h do map[x][y] = nil end end
for x = width+1, map.w do map[x] = nil end
map.w, map.h = width, height
return map
end

-- Initialize pseudo random number generator with seed, to be able to reproduce
math.randomseed(seed)
map = create(dimension, dimension)
if options ~= [[]] then
tex.sprint("&#92;addplot3["
.. options .. "
] coordinates{")
else
tex.sprint("&#92;addplot3 coordinates{")
end
for x = 0, map.w do
for y = 0, map.h do
tex.sprint("("..x..","..y..","..map[x][y]..")")
end
end
tex.sprint("};")
end
\end{luacode*}
\begin{document}
\begin{tikzpicture}
\begin{axis}[colormap={terrain}{color(0cm)=(blue!40!black);
color(1cm)=(blue); color(2cm)=(green!40!black);
color(4cm)=(green!60!white);color(4cm)=(white!95!black);
color(8cm)=(white); color(8cm)=(white)},
hide axis, view = {90}{10}
]
\directlua{terrain(14,128,[[surf,mesh/rows=129,mesh/check=false]])}
\end{axis}
\end{tikzpicture}
\end{document}

The first image had the seed 10 (first argument of the terrain function) and view={10}{55}.

This was done just for fun; I already posted it in German on TeXwelt.de. Bigger landscapes can be too hard on memory and processing power when using TeX for this, but that’s just today – tomorrow’s computers will be more capable.

by stefan at Tuesday, 2016-10-25 12:30

Easy ball mindmaps

Since TikZ provides a mindmap library, drawing them is pretty easy:

  • Define your styles
  • Use the tree syntax for nesting nodes and children

I wanted to produce this drawing:

mindmap

And that’s the short code:

\documentclass{article}
\usepackage[landscape]{geometry}
\usepackage{tikz}
\usetikzlibrary{mindmap}
\usepackage{dtklogos}
\begin{document}
\begin{tikzpicture}
  \path [
    mindmap,
    text = white,
    level 1 concept/.append style =
      {font = \Large\bfseries, sibling angle = 90},
    level 2 concept/.append style =
      {font = \normalsize\bfseries},
    level 3 concept/.append style =
      {font = \small\bfseries},
    tex/.style     = {concept, ball color = blue,
      font = \Huge\bfseries},
    engines/.style = {concept, ball color = green!50!black},
    formats/.style = {concept, ball color = blue!50!black},
    systems/.style = {concept, ball color = red!90!black},
    editors/.style = {concept, ball color = orange!90!black}
  ]
  node [tex] {\TeX} [clockwise from = 0]
    child[concept color = green!50!black, nodes = {engines}] {
      node {Engines} [clockwise from = 90]
        child { node {\TeX} }
        child { node {pdf\TeX} }
        child { node {\XeTeX} }
        child { node {Lua\TeX} }}
    child [concept color = blue, nodes = {formats}] {
      node {Formats} [clockwise from = 300]
        child { node {\LaTeX} }
        child { node {\ConTeXt} }}
    child [concept color = red, nodes = {systems}] {
      node {Systems} [clockwise from = 210]
        child { node {\TeX Live} [clockwise from = 300]
          child { node {Mac \TeX} }}
        child { node {MiK\TeX} [clockwise from = 60]
          child { node {Pro \TeX t} }}}
    child [concept color = orange, nodes = {editors}] {
      node {Editors} };
\end{tikzpicture}
\end{document}

If you get errors, ensure that the dtklogos and hologo packages are installed; otherwise, replace commands such as \ConTeXt with the plain word ConTeXt.
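
If you prefer not to install them, a possible fallback (my suggestion, not from the post) is to provide plain-text substitutes in the preamble; \providecommand only defines a command if it does not exist yet:

\providecommand{\XeTeX}{Xe\TeX}
\providecommand{\ConTeXt}{Con\TeX t}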

by stefan at Tuesday, 2016-10-25 11:44

Internationale Gutenberg-Gesellschaft in Mainz e.V.

25.10.2016 Event announcement

The Internationale Gutenberg-Gesellschaft would like to draw its members' attention to the following event: "Die Grundlagen der Reformation. Frühe Luther-Schriften im Weltdokumentenerbe der UNESCO" ("The Foundations of the Reformation. Early Luther writings in the UNESCO Memory of the World register") on 25 October 2016 at 7 p.m. in the Christuskirche, Kaiserstraße 56, 55116 Mainz. Prof. Dr. Füssel will give a lecture at the event.

Tuesday, 2016-10-25 11:36

TeX Stack Exchange

Calculating IP addresses in TeX

Just testing this blog – why not write a post as an exercise. 🙂

In big projects I have hundreds of subnets. There are a lot of small transport networks with small subnet masks that partition a summarizing net, and there are gateways. All this data is part of the documentation and drawings. When I have a second project of the same kind, I can re-use the documentation and drawings, just based on another IP address range. Will I calculate everything again, typing the new IP addresses into every place in the drawings and docs? No, I’d rather automate it. TeX can do it.

You all know what an IP address is:

[Figure: IPv4 address structure (Wikipedia diagram)]

I chose Lua for calculating things, and LuaLaTeX compiles my documents. I will post a simplified example here.

On the Internet I found a nice snippet, ipv4.lua by Nick Barret, so I did not have to program everything myself. How to use it? Here are the basic steps for a possible implementation:

  • Add \usepackage{luacode} to your preamble
  • Get some Lua code (google for <keyword> and Lua) or write it yourself
  • Use the code as argument of a \luaexec command
  • Modify the Lua code a bit (a small sketch of these rules follows this list):
    • remove Lua comments or change them (incompatible syntax)
    • quote % by a backslash: \%
    • quote backslashes, such as here: tex.sprint( "\\def\\ipDevice{#1}" )
  • Instead of print, use tex.sprint for handing over things to TeX
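
As a tiny sketch of those rules (my own toy macro, not part of the post), the following uses \% as the Lua modulo operator and doubled backslashes inside the string handed to tex.sprint:

\newcommand{\parity}[1]{%
\luaexec{
  local r = #1 \% 2
  if r == 0 then
    tex.sprint("\\textbf{even}")
  else
    tex.sprint("\\textbf{odd}")
  end
}}

In the document body, \parity{42} should then print the word “even” in bold.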

A compilable demo:

\documentclass{article}
\usepackage{luacode}
\newcommand{\ip}[2]{%
\luaexec{
local ip1= string.gsub("#1", "\%.(\%d+).(\%d+).(\%d+)","")
local ip2= string.gsub("#1", "(\%d+)\%.","",1)
local ip2= string.gsub(ip2, "\%.(\%d+).(\%d+)","")
local ip3= string.gsub("#1", "(\%d+)\%.(\%d+)\%.","",1)
local ip3= string.gsub(ip3, "\%.(\%d+)","")
local ip4= string.gsub("#1", "(\%d+)\%.(\%d+)\%.(\%d+)\%.","",1)
local mask = #2
local ip = { tonumber( ip1 ), tonumber( ip2 ), tonumber( ip3 ), tonumber( ip4 ) }
local masks = {
    [1] = { 127, 255, 255, 255 },
    [2] = { 63, 255, 255, 255 },
    [3] = { 31, 255, 255, 255 },
    [4] = { 15, 255, 255, 255 },
    [5] = { 7, 255, 255, 255 },
    [6] = { 3, 255, 255, 255 },
    [7] = { 1, 255, 255, 255 },
    [8] = { 0, 255, 255, 255 },
    [9] = { 0, 127, 255, 255 },
    [10] = { 0, 63, 255, 255 },
    [11] = { 0, 31, 255, 255 },
    [12] = { 0, 15, 255, 255 },
    [13] = { 0, 7, 255, 255 },
    [14] = { 0, 3, 255, 255 },
    [15] = { 0, 1, 255, 255 },
    [16] = { 0, 0, 255, 255 },
    [17] = { 0, 0, 127, 255 },
    [18] = { 0, 0, 63, 255 },
    [19] = { 0, 0, 31, 255 },
    [20] = { 0, 0, 15, 255 },
    [21] = { 0, 0, 7, 255 },
    [22] = { 0, 0, 3, 255 },
    [23] = { 0, 0, 1, 255 },
    [24] = { 0, 0, 0, 255 },
    [25] = { 0, 0, 0, 127 },
    [26] = { 0, 0, 0, 63 },
    [27] = { 0, 0, 0, 31 },
    [28] = { 0, 0, 0, 15 },
    [29] = { 0, 0, 0, 7 },
    [30] = { 0, 0, 0, 3 },
    [31] = { 0, 0, 0, 1 }
}
local wildcard = masks[tonumber( mask )]
local ipcount = math.pow( 2, ( 32 - mask ) )
local bottomip = {}
for k, v in pairs( ip ) do
    if wildcard[k] == 0 then
        bottomip[k] = v
    elseif wildcard[k] == 255 then
        bottomip[k] = 0
    else
        local mod = v \% ( wildcard[k] + 1 )
        bottomip[k] = v - mod
    end
end
local topip = {}
for k, v in pairs( bottomip ) do
    topip[k] = v + wildcard[k]
end
tex.sprint( "\\def\\ipMask{" .. 255-wildcard[1] .. "." .. 255-wildcard[2]
  .. "." .. 255-wildcard[3] .. "." .. 255-wildcard[4].. "}" )
local isnetworkip = ( ip[1] == bottomip[1] and ip[2] == bottomip[2]
  and ip[3] == bottomip[3] and ip[4] == bottomip[4] )
local isbroadcastip = ( ip[1] == topip[1] and ip[2] == topip[2]
  and ip[3] == topip[3] and ip[4] == topip[4] )
  tex.sprint( "\\def\\ipDevice{#1}" )
  tex.sprint( "\\def\\ipNetwork{" .. bottomip[1] .. "." .. bottomip[2]
      .. "." .. bottomip[3] .. "." .. bottomip[4] .. "/" .. mask .. "}" )
  tex.sprint( "\\def\\ipGateway{" .. bottomip[1] .. "." .. bottomip[2]
      .. "." .. bottomip[3] .. "." .. bottomip[4]+1 .. "}" )
  tex.sprint( "\\def\\ipBroadcast{" .. topip[1] .. "." .. topip[2]
      .. "." .. topip[3] .. "." .. topip[4] .. "}" )
  tex.sprint( "\\def\\ipHostRange{" .. bottomip[1] .. "." .. bottomip[2]
      .. "." .. bottomip[3] .. "." .. bottomip[4] + 2 .. " - " .. topip[1]
      .. "." .. topip[2] .. "." .. topip[3] .. "." .. topip[4] - 1 .. "}")
}}
\begin{document}
\ip{10.240.3.38}{29}
Network data for device \ipDevice:
\bigskip

\begin{tabular}{rl}
Subnet:      & \ipNetwork   \\
Subnet mask: & \ipMask      \\
Gateway:     & \ipGateway   \\
Broadcast:   & \ipBroadcast \\
Host range:  & \ipHostRange \\
\end{tabular}
\end{document}

In this demo, I specified a device IP and its net mask, and got these values in return:

Network data for device 10.240.3.38:

Subnet: 10.240.3.32/29
Subnet mask: 255.255.255.248
Gateway: 10.240.3.33
Broadcast: 10.240.3.39
Host range: 10.240.3.34 – 10.240.3.38

In the real world, I use loops to fill tables with such data or use it in labels on edges in TikZ drawings, such as here:

[Figure: routing diagram with IP address labels on the edges]
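
A minimal sketch of calling the macro repeatedly for several devices (my own example; the loops mentioned above are not shown in the post and would need some care inside tabular cells):

\ip{10.240.3.38}{29}%
Device \ipDevice: network \ipNetwork, gateway \ipGateway.

\ip{192.168.1.17}{24}%
Device \ipDevice: network \ipNetwork, gateway \ipGateway.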

by stefan at Tuesday, 2016-10-25 11:19

2016-10-23

TeX Stack Exchange

Hello!

This whole web site is just a test page for the original: tex.blogoverflow.com. Currently we are discussing on meta.TeX.SE how to continue our community blog. Here, it’s just a quick test for now. Much still needs to be done, such as checking images and links, supporting Markdown syntax in posts, proper syntax highlighting, and reviewing every single post to ensure that it is displayed properly.

by stefan at Sunday, 2016-10-23 19:24

2016-10-22

Dominik Wagenführ

Pandemic Legacy – Tagebuch

Introduction

Pandemic is a cooperative board game designed by Matt Leacock in 2008, in which the players try to contain and cure four diseases that have broken out around the world. That alone is already fun; in 2015 Leacock teamed up with Rob Daviau and developed a successor, Pandemic Legacy.

What makes the Legacy game special is that the board, the cards, and the rules change over the course of play. This creates a story that sometimes takes very surprising turns and thrilled us as players. It is no coincidence that the game is ranked number 1 on BGG.

The diary in LaTeX

We played our 18 sessions from April to July 2016 and had a lot of fun. I had the idea of recording the events in written form. My fellow player said that pictures would enhance the text even further. After asking the publisher “Z-Man Games”, I received permission to use them.

Since I have been working with text design and layout for ten years, my idea was to break out of the typical blog format and turn it into a book instead. As usual, I typeset it with LaTeX, which greatly simplified many tasks, such as keeping the text on a common baseline grid or using different background images.
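
The post does not name the packages used for this; purely as an illustration, a full-page background image could be placed with the eso-pic package (the image file name here is made up):

\documentclass{scrbook}
\usepackage{graphicx}
\usepackage{eso-pic}
\AddToShipoutPictureBG{%
  \AtPageLowerLeft{%
    \includegraphics[width=\paperwidth,height=\paperheight]{background-may}%
  }%
}
\begin{document}
Diary text on top of the background image.
\end{document}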

The result is available here as a PDF download:

Pandemic Legacy – Tagebuch

Of course the book contains spoilers! But so that you are not taken by surprise, each month starts on a new page, and there is always a warning before you turn the page! So you can read exactly as far as you have played yourself.

The LaTeX source is also available for download; I left out the images, except for the background images and the page-number icon:

Printing with epubli

A PDF is nice, but hard to hold in your hands, so I also had the 44-page booklet printed via epubli. Of the various services I looked at, this one offered a very good price/performance ratio (10.40 € for a single copy), and its input form for the print data was very simple and easy to understand (unlike some other services).

The result can be seen in the following picture (careful: the visible text spoils the events of May!):

[Figure: printed copy of the Pandemic Legacy diary]

by Dee (nospam@example.com) at Saturday, 2016-10-22 12:12