Today is another day of the TUG 2015 meeting. See my earlier posts about the previous days.
Pavneet Arora started the first session with a talk about FLUSS, a flow leak monitoring system. I was curious how this would relate to TeX. The working title, FLUSS, is an acronym standing for “Flow Leaks Unearthed ss”, where “ss” means 2 × sigma, referring to double-sigma testing.
His talk actually had little to do with typesetting, but it did have to do with TeX. Pavneet Arora considers TeX part of the core stack of embedded systems. He used TeX as a sophisticated documentation and reporting backend, so in this case not for publishing. So to say, he used the “TeX of Things” to detect water leaks. Why is this important? As with fire, water damage can be limited by catching it early on. Similarly, you don’t have fire detectors at home but smoke detectors, to get warned at an early stage. He focused on the water supply at its source instead of on all possible breaks and leaks along the entire supply path. His application suite monitors the flow at the source side, near the water meter, learns water consumption patterns over time, and produces ConTeXt-generated reports. Pattern recognition makes it possible to raise alarms. The hardware is an embedded system based on a Raspberry Pi. There’s a bunch of tools to install – TeX was the easiest part – a great sign of its maturity and its reliable packaging. We saw ConTeXt-generated diagrams and how to spot a water leak in them. This topic is of high importance to insurance companies, as a lot of money is involved. In this way, Pavneet Arora showed an interesting and unexpected use of TeX in industry.
In the next talk, Tom Hejda spoke about preparing LaTeX document classes and templates for the Czech Technical University in Prague (CTU). He spoke about the differences between creating classes for journal articles and for university theses. He considered the user’s point of view, stated some obvious facts and gave examples. He started with typical usage. The procedures differ:
- Journal article: the author typesets, the article gets reviewed, there is a final author’s version, and it is then copyedited and typeset by the journal
- Thesis: the student typesets, a supervisor comments, and the final version is submitted by the student
A journal has its style and decides which packages can be used; the journal has full control of the output. In contrast, with a thesis, the university imposes style restrictions, but students decide how to actually typeset the thesis. Journal articles and theses also differ in sectioning depth, in the packages used, and in the variety of topics, which is rather narrow in journals.
Tom Hejda compared examples:
- actapoly, a class for journal articles in Acta Polytechnica, written in a mixture of TeX and LaTeX2e
- ctuthesis, written using LaTeX3 as much as possible, with a rich key=value interface
He discussed the differences in their approaches.
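To give an impression of what such a key=value interface looks like from a student’s point of view, here is a minimal sketch. The setup macro and key names below are my own illustrative assumptions, not necessarily the actual ctuthesis interface, so please check the class documentation.

```latex
% Minimal sketch of a key=value thesis setup. The \thesissetup macro and
% its keys are illustrative assumptions, not the actual ctuthesis API.
\documentclass{ctuthesis}

\thesissetup{
  doctype    = master,               % assumed key: type of thesis
  faculty    = F3,                   % assumed key: faculty code
  title      = {My Thesis Title},
  author     = {Jane Student},
  supervisor = {Prof.\ A.\ Example}, % hypothetical key for the supervisor
  year       = 2015,
}

\begin{document}
\maketitle
\chapter{Introduction}
Thesis text goes here.
\end{document}
```

The point of such an interface is that all formal data is entered declaratively in one place, while the class takes care of the layout mandated by the university.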
Boris Veytsman followed with a presentation of a new multibibliography package. There is actually an existing package with this name; he reworked it, and his new package is called nmbib.
Generally, a bibliography is not only a technical list; it describes the state of the field. So, not only an alphabetical listing, but also a chronological list shows the development and progress in the field. With nmbib, you can choose ordering by name, by order of appearance, and chronologically, all in the same document. Each cite command produces entries for all lists. The old multibibliography package had some limitations, such as support for only fixed BibTeX styles, and Perl was required. Now, with the new nmbib, you still get a look and feel similar to multibibliography: three lists with hyperref links. But nmbib is now compatible with the natbib package and supports its commands, and any natbib style may be used for the alphabetical or sequential bibliography lists. You don’t need Perl any more; instead of a Perl script, BibTeX is simply run three times for the three orderings. nmbib is much more flexible than multibibliography, since all natbib customizations can be used and citation styles can be customized. The new and more flexible nmbib package has also been developed with use in ebooks in mind.
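From the talk, a document using nmbib would look roughly like the sketch below. I haven’t used the package yet, so take the nmbib-specific parts as assumptions; only the natbib citation commands and the standard bibliography macros are certain.

```latex
% Rough sketch of an nmbib document, based on the talk. Only the natbib
% citation commands are standard; how nmbib wants the lists to be printed
% is an assumption here -- see the package documentation.
\documentclass{article}
\usepackage{nmbib}  % assumed to provide natbib compatibility and hyperref links

\begin{document}
As shown by \citet{knuth1984} and later work \citep{lamport1994},
the familiar natbib commands are used; each citation feeds all three
lists (alphabetical, in order of appearance, chronological).

\bibliographystyle{plainnat}  % any natbib style may be used
\bibliography{references}
\end{document}
```

According to the talk, instead of running a Perl script you then simply run BibTeX three times, once per ordering.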
Leila Akhmadeeva joined Boris Veytsman for a presentation about trilingual templates for an educational institute in Bashkortostan, Russia. That’s actually a challenge, also because Bashkir Cyrillic differs from Russian Cyrillic. A formal document is already a challenge for a style designer, but a consistent multilingual style is even more challenging. TeX is a good tool for such tasks. By the way, they used Paratype fonts for consistency.
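As an illustration of the font side of this (my own minimal example, not their actual template): the freely available Paratype PT fonts cover Latin as well as the additional Bashkir Cyrillic letters, so one family can serve all three languages. As far as I know there is no dedicated Bashkir module for polyglossia, so the Bashkir letters are simply set in the same Cyrillic typeface here.

```latex
% Minimal illustration of a consistent trilingual font setup with PT Serif;
% compile with XeLaTeX or LuaLaTeX. This is my own example, not their template.
\documentclass{article}
\usepackage{fontspec}
\usepackage{polyglossia}
\setdefaultlanguage{english}
\setotherlanguage{russian}
\setmainfont{PT Serif}

\begin{document}
English, \textrussian{русский} and Bashkir text (for example the letters
ҙ, ҫ, ү and һ) can all be typeset in one consistent typeface.
\end{document}
```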
Paul Gessler followed with a talk about printing Git commit history graphs. Git is a popular version control system. Based on the gitinfo2 package, Paul Gessler wrote an experimental package called gittree, which generates such graphs for use in LaTeX on the basis of TikZ and provides a convenient interface. He showed use and creative abuse, such as the GitHub project MetroGit, where each commit is a metro station, a branch is a metro line, and a merge is a connection between lines; together this produces a map of the Paris metro. Paul Gessler’s code will be on GitHub at the end of summer, and he expects to put it on CTAN in early 2016.
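Since the gittree code isn’t published yet, I can’t show its interface; the following is just a hand-made TikZ sketch of the kind of commit graph such a package might produce, with a feature branch merged back into the main line.

```latex
% Hand-drawn TikZ sketch of a small commit graph (a branch and a merge).
% This is NOT the gittree interface, only an illustration of the output style.
\documentclass[tikz]{standalone}
\begin{document}
\begin{tikzpicture}[commit/.style={circle, draw, fill=white, inner sep=2pt}]
  % main line with three commits, the last one being the merge commit
  \draw[thick] (0,0) -- (4,0);
  \node[commit] at (0,0) {};
  \node[commit] at (1,0) {};
  \node[commit, label=below:{merge}] at (4,0) {};
  % feature branch with two commits, merged back into the main line
  \draw[thick] (1,0) .. controls (1.5,1) .. (2,1) -- (3,1)
               .. controls (3.5,1) .. (4,0);
  \node[commit] at (2,1) {};
  \node[commit] at (3,1) {};
\end{tikzpicture}
\end{document}
```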
Steve Peter then performed his final task as president: he introduced the new president, Kaveh Bazargan.
Kaveh Bazargan said that it’s an honor to hopefully contribute in this way to this great community. He has worked with TeX since 1983, and his first TUG meeting was 1986 in Strasbourg. He said TeX deserves far more visibility. So he hopes that we keep old friends but also get more youngsters to come in.
We can show there’s a hell of a lot TeX can do that can hardly be done with other technology, even though other applications have pretty much caught up. He thanked everyone for the votes in the election and announced, with a smile, that he would talk afterwards with the people who voted differently.
The new TUG board then gathered before the audience. Time for questions and answers!
A first suggestion was the founding of an accessibility team. Klaus Hoeppner said we might join forces with institutes working on tools for blind people. A lot of people are working on things such as tagged PDF; those teams should be brought together, and the different groups should meet each other.
It was said and agreed that we should do more to demonstrate how TeX is used in academia, at work and in industry. But where to show it? At TUG meetings, we are among ourselves. The TUG web site is probably also visited mostly by TeX users, not by not-yet-users. There is a TeX showcase.
My silent thought was that I could adapt and expand the TeXample gallery, a tagged and categorized gallery built by Kjell Magne Fauske. It’s currently focused on TikZ, but could be extended to TeX in general. There are sophisticated scripts for automated workflows, including compiling, adding to file shares and the database, tagging, and generating PNG and JPG output via Ghostscript for the gallery view and thumbnail previews.
Frank Mittelbach said that we should support the entry level at universities. Some opinions went further: we should promote the very early use of TeX, such as in schools. The potential TeX entry point today is often when people start writing their thesis. But by that time, they have already used Word and the like for ten years. Why switch? TeX comes late here. Who switches away from Word after being used to it for six or ten years?
Regarding publishers and TeX: few publishers use TeX – and they were all here in the room, such as River Valley Technologies and VTeX. Many publishers also reject TeX because there’s still old bad stuff around, and more and more other applications catch up. 95 percent of the files coming from authors are in Word, so the industry has developed clever things around Word; expensive tools work with Word, for example for extracting references for processing. The industry standard is converting from Word, period. TeX is in the minority. Still, it’s a big industry to tackle, if you know how.
We should pass on the message that the TeX distribution is actively maintained, and will stay that way. That’s an important criterion. How many people go to conferences in other fields to tell them about TeX? Not so many. OK, other user groups also mostly stay among themselves.
Boris Veytsman started another discussion: whatever we all think, where is TeX going to be in the future? And where are we as the TUG? In the old days, there was a lot of meaning and reason in being a user group. Without today’s Internet, we promoted TeX and helped users. But today? We may have made ourselves redundant, because we did a great job. There’s CTAN; a user doesn’t need to be a TUG member anymore to get all the benefits, the software and online help in forums. What reason is left to join? Thanking and sponsoring, what else? Many members are in just out of sympathy. We should find a justification for the TeX user groups as such to exist, and find some way to convince people to join. It was an open discussion with many people contributing. Why do we need us? How can we tell anybody that we are relevant? Should and could we find a new identity? Why is a user group necessary?
But even though millions of users rely on TeX, things can easily break. As wonderful as they are, the teams are small. The CTAN maintainers are 4 people; the LaTeX team consists of 5 people. If somebody gets ill, everything stops, and rarely do new people pick things up. It’s not only about users or money – a very important issue is getting users to contribute and to turn into developers. We are seriously short of developers: we need users to become developers. So user groups are essential for activating people to start contributing.
That was a serious discussion, and it’s good to bring up such points; to raise questions in order to find answers. But be assured, the people here are positive and in a great mood.
In the afternoon, there was an excursion to the Messel Pit, a UNESCO World Heritage site thanks to its abundance of fossils. In the evening we met at the Herrengarten and talked until about 1 am.
That was a lot to write up; I will write about the third day soon.