SumatraPDF integration in LaTeX-Emacs-make tool chain

Steps:

1. Upgrade to the most recent SumatraPDF: http://code.google.com/p/sumatrapdf/downloads/list

2. Make sure sumatrapdf is on the search path.

3. Update the makefile for the make integration:

MAIN=maintex

EPS=img/*.eps

TEX=tomo_spectra.tex 

BIB=da.bib 

REBUILDABLES =  \
*.log   \
*.blg   \
*.bbl   \
*.aux   \
*.lof   \
*.lot

LATEXOPTIONS = -src-specials -interaction=nonstopmode

PDFVIEWER = sumatrapdf -reuse-instance -inverse-search "c:\emacs\bin\emacsclientw.exe +%l \"%f\""

BIBDIR = "C:/H/Bib"

BIBOPTIONS = -include-directory=$(BIBDIR)

vpath %.bib /cygdrive/c/H/Bib

viewpdf : $(MAIN).pdf
    $(PDFVIEWER) $(MAIN).pdf &

dvi : $(MAIN).dvi
    yap -1  $(MAIN).dvi

$(MAIN).pdf : $(MAIN).dvi
    pdfclosem $(MAIN).pdf ; dvipdfm $(MAIN).dvi

$(MAIN).dvi : $(MAIN).bbl $(EPS)
    latex $(LATEXOPTIONS) $(MAIN)
        if ( grep 'Rerun' $(MAIN).log > /dev/null ) ; then\
            latex $(LATEXOPTIONS) $(MAIN) ; \
        else :; fi

$(MAIN).bbl : $(TEX)
    rm -f $(MAIN).aux *.bbl; latex $(LATEXOPTIONS) $(MAIN); bibtex $(BIBOPTIONS) $(MAIN) 

clean :
    rm -f $(REBUILDABLES)

4. Update .emacs for AUCTeX integration:

(setq TeX-view-program-list
      '(("sumatra" "sumatrapdf -reuse-instance \"%f\"")
        ("Yap" ("yap -1" (mode-io-correlate " -s %n%b") " %o"))
        ("dvips and start" "dvips %d -o && start \"\" %f")
        ("start" "pdfviewm %o")))

(setq TeX-view-program-selection
      '(((output-dvi style-pstricks)
         "dvips and start")
        (output-dvi "Yap")
        (output-pdf "sumatra")
        (output-html "start")))

If I need inverse search, I put \usepackage{pdfsync} before \begin{document}, or use pdflatex instead of latex.
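
Note that the -inverse-search command in the PDFVIEWER variable above calls emacsclientw.exe, which can only reach Emacs if the Emacs server is running. A minimal sketch for .emacs (assuming the stock server.el that ships with Emacs 23 and later):

;; start the Emacs server so that SumatraPDF's -inverse-search call
;; (emacsclientw.exe +%l "%f") can open the clicked location in this session
(require 'server)
(unless (server-running-p)
  (server-start))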

Why SumatraPDF? It does not lock the PDF file, so when restarting the make process I don't need to close the PDF viewer manually.

Date: 2012-02-16

Author: Da Zhang


tried org2blog mode successfully

Following Gabriel Saldana's blog post "Post to WordPress blogs with Emacs & Org-mode" (http://blog.nethazard.net/post-to-wordpress-blogs-with-emacs-org-mode/), I tested the org2blog mode.

The setup is a little different from what was described in that post; I put the following code in my .emacs file:

(require 'org2blog-autoloads)
(setq org2blog/wp-blog-alist
      '(("wordpress"
         :url "http://username.wordpress.com/xmlrpc.php"
         :username "username"
         :tags-as-categories nil)))

You have to substitute "username" with your actual WordPress username.

To use org2blog, run M-x org2blog/wp-login first, and then M-x org2blog/wp-new-entry.
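
If you post often, these two commands can also be put on keys; a minimal sketch (the key choices are arbitrary, not part of org2blog):

;; convenience bindings for the two org2blog commands mentioned above
(global-set-key (kbd "C-c b l") 'org2blog/wp-login)
(global-set-key (kbd "C-c b n") 'org2blog/wp-new-entry)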

automatically formatting bib entries and generating proper pdf file names

This is a function I use frequently to reformat bib entries downloaded from journal websites. It does the following:

1. Rename the bib entry key as FirstAuthorLastName-Year-Title.
2. Remove the lengthy abstract and keywords fields.
3. Generate a file name for saving the PDF copy, in the format FirstAuthorLastName_Year_Title_Journal.pdf.
4. Move the url field to the end of the entry and comment it out.

Here is the code:

;; Filename: fbib.el
;; Author: Da Zhang
;; Usage:
;; Compile:
;; System:
;; Bugs:
;; Created: Thu Apr 29 23:38:36 2010
;; Last-Updated: Fri Oct 15 14:22:05 2010 (-14400 -0400)
;; Update #: 40
;; Description:
;;;;;;;;;;;;;;;;;;;;;;;;;;; -*- Mode: Emacs-Lisp -*- ;;;;;;;;;;;;;;;;;;;;;;;;;;
;;
;;; Code:

(defun fbib ()
  "Format the bib entry copied from websites, and generate the file name for saving the pdf files systematically."
  (interactive)
  (goto-char (point-max))
  (re-search-backward "@" nil t)
  (beginning-of-line)
  (setq beg-pos (point))   ; start of the last entry; used by the sections below

  ;; remove the original bib entry name
  (re-search-forward "{" nil t)
  (re-search-forward ",")
  (backward-char)
  (let ((beg (point)))
    (re-search-backward "{" nil t)
    (forward-char)
    (delete-region beg (point)))

  ;; search for the author name and copy it to the bib entry name
  (let ((tmp (point)))
    (re-search-forward "author" nil t)
    (re-search-forward "{" nil t)
    (let ((start (point)))
      (re-search-forward "}" nil t)
      (let ((end (point)))
        (if (re-search-in-region "," start end)
            (backward-word 1)
          (if (re-search-in-region "and" start end)
              (backward-word 2)
            (re-search-forward "}" nil t)
            (backward-word 1)))))
    (let ((beg (point)))
      (forward-word)
      (copy-region-as-kill beg (point)))
    (goto-char tmp)
    (yank)
    (insert "-"))

  ;; search for the year and copy it to the bib entry name
  (let ((tmp (point)))
    (re-search-forward "year" nil t)
    (re-search-forward "{" nil t)
    (let ((beg (point)))
      (forward-word)
      (copy-region-as-kill beg (point)))
    (goto-char tmp)
    (yank)
    (insert "-"))

  ;; search for the article title and copy it to the bib entry name
  (let ((tmp (point)))
    (re-search-forward "title" nil t)
    (re-search-forward "{" nil t)
    (let ((beg (point)))
      (re-search-forward "}" nil t)
      (backward-char)
      (copy-region-as-kill beg (point)))
    (goto-char tmp)
    (yank)
    (let ((bib-name-end (point)))
      (replace-in-region " " "-" tmp bib-name-end)
      (replace-in-region ":" "-" tmp bib-name-end)))

  ;; optional: search for keywords, and kill the field
  (goto-char beg-pos)
  (if (re-search-forward "keywords" nil t)
      (progn
        (beginning-of-line)
        (let ((beg (point)))
          (re-search-forward "},")
          (forward-char)
          (kill-region beg (point)))))

  ;; optional: search for url, move it to the end of the entry, and comment it out
  (goto-char beg-pos)
  (if (re-search-forward "url" nil t)
      (progn
        (beginning-of-line)
        (kill-line)
        (re-search-forward "^}" nil t)
        (forward-char)
        (yank)
        (re-search-backward "url" nil t)
        (beginning-of-line)
        (let ((beg (point)))
          (end-of-line)
          (comment-region beg (point)))))

  ;; form the pdf file name and add it to the end of the buffer
  (goto-char (point-max))
  (let ((tmp (point)))
    (goto-char beg-pos)
    (re-search-forward "author" nil t)
    (re-search-forward "{" nil t)
    (let ((start (point)))
      (re-search-forward "}" nil t)
      (let ((end (point)))
        (if (re-search-in-region "," start end)
            (backward-word 1)
          (if (re-search-in-region "and" start end)
              (backward-word 2)
            (re-search-forward "}" nil t)
            (backward-word 1)))))
    (let ((beg (point)))
      (forward-word)
      (copy-region-as-kill beg (point)))
    (goto-char tmp)
    (yank)
    (insert "_"))
  (let ((tmp (point)))
    (re-search-backward "year" nil t)
    (re-search-forward "{" nil t)
    (let ((beg (point)))
      (forward-word)
      (copy-region-as-kill beg (point)))
    (goto-char tmp)
    (yank)
    (insert "_"))
  (let ((tmp (point)))
    (re-search-backward "title" nil t)
    (re-search-forward "{" nil t)
    (let ((beg (point)))
      (re-search-forward "}" nil t)
      (backward-char)
      (copy-region-as-kill beg (point)))
    (goto-char tmp)
    (yank)
    (insert "_"))
  (let ((tmp (point)))
    (re-search-backward "journal" nil t)
    (re-search-forward "{" nil t)
    (let ((beg (point)))
      (re-search-forward "}" nil t)
      (backward-char)
      (copy-region-as-kill beg (point)))
    (goto-char tmp)
    (yank)
    (insert ".pdf"))
  (beginning-of-line)
  (let ((pdf-name-beg (point)))
    (end-of-line)
    (replace-in-region ":" "_" pdf-name-beg (point)))

  ;; optional: search for the abstract, and delete it
  (goto-char beg-pos)
  (if (re-search-forward "abstract" nil t)
      (progn
        (beginning-of-line)
        (let ((beg (point)))
          (re-search-forward "}" nil t)
          (end-of-line)
          (kill-region beg (point)))
        (kill-line))))

(defun re-search-in-region (pat start end)
  "Regexp search forward for PAT in the region between START and END."
  (save-restriction
    (narrow-to-region start end)
    (goto-char (point-min))
    (re-search-forward pat nil t)))

(defun replace-in-region (from-string to-string start end)
  "Replace FROM-STRING with TO-STRING in the region between START and END."
  (save-restriction
    (narrow-to-region start end)
    (goto-char (point-min))
    (while (search-forward from-string nil t)
      (replace-match to-string nil t))))

(provide 'fbib)
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;; fbib.el ends here
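
To make the command available, fbib.el just needs to be on the load-path; a minimal sketch for .emacs (the directory below is only a placeholder for wherever fbib.el is saved):

;; load fbib.el; the directory here is hypothetical -- adjust as needed
(add-to-list 'load-path "~/elisp")
(require 'fbib)

Then, with a freshly downloaded entry pasted at the end of the .bib buffer, M-x fbib reformats that last entry and appends the suggested PDF file name at the end of the buffer.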

Let Emacs be your file processing engine, and use shell to drive it for batch processing!

Emacs has many wonderful text-processing functions that are hard to find elsewhere, such as align-regexp. It is therefore very attractive to write an elisp script and use the shell to call Emacs to apply that script to many files.

Here is an example of this approach.

In ~/my_elisp.el:

;; my_elisp.el starts here

;; define the function for text processing
(defun format-rpt ()
  "A function to format the OCR-processed reports for further excel import."
  (interactive)
  ;; align the table of data --> aligned, with the numbers comma-separated
  (goto-char (point-min))
  ;; find the landmark in the file at which to start the alignment
  (re-search-forward "land-mark-regexp-pattern" nil t)
  (beginning-of-line)
  ;; align the first time
  (let ((beg (point)))
    (align-regexp beg (point-max) "\\(\\s-*\\) " 1 1 nil))
  ;; add commas for excel importing ("comma separated file")
  (while (re-search-forward " \\{2,\\}\\b" nil t)
    (insert ","))
  (goto-char (point-min))
  ;; find the landmark again and align the second time
  (re-search-forward "land-mark-regexp-pattern" nil t)
  (beginning-of-line)
  (let ((beg (point)))
    (align-regexp beg (point-max) "\\(\\s-*\\)," 1 4 nil))
  ;; save the result
  (write-file "./save_to.txt" nil))

(format-rpt) ;; call the function for text processing

;; my_elisp.el ends here

Then in bash (I use Cygwin), use find to drive Emacs:

find . -name "ocr.txt" -printf '%h\n' | while read dir; do (cd "$dir"; emacs --no-site-file -nw --batch ocr.txt -l ~/my_elisp.el); done

This find/while-read pattern copes with spaces in Windows paths and makes the batch loop easy to run under Cygwin.
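
If you prefer to stay entirely inside Emacs, roughly the same batch run can be sketched in elisp instead of the shell loop. This is only a sketch: it assumes Emacs 25+ (for directory-files-recursively) and that the format-rpt defun from my_elisp.el has been loaded without its trailing (format-rpt) call:

;; visit every ocr.txt under the current directory and run format-rpt on it;
;; format-rpt itself writes the result to save_to.txt next to each file
(dolist (file (directory-files-recursively "." "ocr\\.txt\\'"))
  (with-current-buffer (find-file-noselect file)
    (format-rpt)))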

mintty parameters

When starting mintty, it is better to include the following parameters:
-e /bin/bash --login

Otherwise mintty inherits the Windows search order (system PATH first, user PATH second, then the Cygwin path), so Windows versions of commands that Cygwin also provides, such as find, get executed instead of the Cygwin ones.

Other tip: /etc/profile (under the Cygwin installation directory) and ~/.bashrc are the two common files controlling the behavior of bash shells.

hack on match-paren

I have long used a short piece of code, "match-paren" (http://grok2.tripod.com/), when programming, especially in Lisp, where parentheses are everywhere. I like it for its simplicity and usefulness: bind it to something like M-[, and when you press M-[ on a "(", the cursor goes to the matching ")" automatically. It also works when the mark is active, so you can easily highlight the region between two matching "(" and ")". The original code is as follows:

(defun match-paren (arg)
  "Go to the matching paren if on a paren."
  (interactive "p")
  (cond ((looking-at "\\s\(") (forward-list 1) (backward-char 1))
        ((looking-at "\\s\)") (forward-char 1) (backward-list 1))))

Sometimes I found that the code does not work intuitively, especially when I want to highlight a region bounded by matching "(" and ")", so I made the following hack:
(1) when you keep hitting the key binding, e.g. M-[, the cursor jumps back and forth between its original location and the matching parenthesis, unlike the original code;
(2) when the mark is active, the cursor jumps to the matching parenthesis and then moves one step past the ")" (or one step before the "("), so the highlighted region contains everything between, and including, the matching "(" and ")". I find this especially useful for cutting out a whole list when programming in Lisp.

Here is my hack.

(defun da-match-paren (arg)
  "Go to the matching paren if on a paren."
  (interactive "p")
  (cond ((and mark-active (looking-at "\\s\(")) (forward-list 1))
        ((and mark-active (looking-back "\\s\)")) (backward-list 1))
        ((looking-at "\\s\(") (forward-list 1) (backward-char 1))
        ((looking-at "\\s\)") (forward-char 1) (backward-list 1))))

(global-set-key (kbd "M-[") 'da-match-paren)

another way to track literature

I found that using an NIH or NSF grant number, such as NIH R01-EB002123, is another good way to track a series of papers from a particular research group. Corresponding authors usually pay attention to which grants a paper should mention in the Acknowledgments section, so papers carrying the same grant number are, in effect, carefully classified by the larger project they belong to. I think this is worth mentioning in my web notes.