# Mac brew pip is configured with locations that require TLS/SSL

After upgrading macOS and Python 3, pip reported this error every time I ran it. The message claims there is no SSL support, but OpenSSL was clearly installed. Unlinking and reinstalling openssl solved the issue:

        brew unlink openssl
        brew reinstall openssl
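To confirm the fix, you can check that Python's `ssl` module imports and see which OpenSSL it links against (a quick sanity check, not part of the fix itself):

```shell
# If this prints a version string instead of an ImportError, pip's TLS support works
python3 -c "import ssl; print(ssl.OPENSSL_VERSION)"
```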


Weird.

2018/3/23 posted in  Python

# DyNet Debug

When you see an assertion failure like this:

        Assertion failed: (dimensions_match(m_leftImpl.dimensions(), m_rightImpl.dimensions())), function TensorEvaluator, file include/eigen3/unsupported/Eigen/CXX11/src/Tensor/TensorEvaluator.h, line 392.

check whether you passed an invalid axis permutation when transposing a matrix, e.g. by repeating an axis index:

        t_rel_logits = dy.transpose(rel_logits, [1, 0, 0])  # bug: axis 0 appears twice
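The axis list passed to a transpose should be a permutation of the matrix's dimensions. A plain-Python sanity check (a hypothetical helper of my own, not part of the DyNet API) makes the bug above obvious:

```python
def is_valid_permutation(dims, ndim):
    """A transpose axis list must use each of 0..ndim-1 exactly once."""
    return sorted(dims) == list(range(ndim))

# The buggy call above repeats axis 0:
assert not is_valid_permutation([1, 0, 0], ndim=2)
# A valid 2-D transpose uses each axis exactly once:
assert is_valid_permutation([1, 0], ndim=2)
```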

2018/1/8 posted in  ML

# Stack Long Short-Term Memories

Dyer et al. (2015)1 added a stack pointer $$TOP$$ to a conventional LSTM. The trick is to use the cell that $$TOP$$ points to as $$h_{t-1}$$ and $$c_{t-1}$$ when computing the next state.

Not only are the two standard structures in transition-based dependency parsing (the stack and the buffer) implemented as stack LSTMs, but a third stack, storing the history of actions, is introduced and implemented in the same way. The authors seem to favor the stack structure and hope it encodes parser configurations more thoroughly.
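The push/pop discipline can be sketched in plain Python. This is only a sketch of the pointer mechanics: the `cell` here is a toy stand-in, not a real LSTM, and all names are mine:

```python
class StackLSTM:
    """Sketch of a stack LSTM: states follow a stack discipline, and the
    state on top of the stack supplies (h_{t-1}, c_{t-1}) for the next push."""

    def __init__(self, cell, h0=0.0, c0=0.0):
        self.cell = cell                 # maps (x, h_prev, c_prev) -> (h, c)
        self.states = [(h0, c0)]         # bottom-of-stack initial state

    def push(self, x):
        h_prev, c_prev = self.states[-1]  # TOP supplies h_{t-1} and c_{t-1}
        self.states.append(self.cell(x, h_prev, c_prev))

    def pop(self):
        return self.states.pop()          # TOP falls back to the previous state

    @property
    def top_h(self):
        return self.states[-1][0]


# Toy cell just to show the mechanics (not real LSTM arithmetic).
toy_cell = lambda x, h, c: (h + x, c + x)
s = StackLSTM(toy_cell)
s.push(1.0)
s.push(2.0)
s.pop()                  # popping restores the state from before the last push
assert s.top_h == 1.0
```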

1. C. Dyer, M. Ballesteros, W. Ling, A. Matthews, and N. A. Smith, “Transition-Based Dependency Parsing with Stack Long Short-Term Memory,” arXiv:1505.08075, 2015.

2017/11/22 posted in  NLP

# Recurrent Neural Network Grammars

Dyer et al. (2016)1 adopted RNNs to do both parsing and language modeling.

RNNs have been argued to be inappropriate models of natural language, since relations between words lie in compositional nested structures rather than in sequential surface order2.

The authors introduced a new generative probabilistic model of sentences that lets RNNs model the nested, hierarchical structures of natural language. Parsing operates bottom-up, while generation makes use of top-down grammar information.

RNNG defines a joint probability distribution over string terminals (words in a language) and phrase-structure nonterminals. It is motivated by the conventional transition system, an abstract state machine, but with one big difference: RNNG is a generative model (although it can be modified for discriminative parsing). Formally, an RNNG is defined by a triple $$\langle N,\Sigma,\Theta \rangle$$, where $$N$$ denotes the nonterminal symbols (NP, VP, etc.), $$\Sigma$$ denotes the terminal symbols ($$N \cap \Sigma = \emptyset$$), and $$\Theta$$ denotes the model parameters. In terms of implementation, RNNG consists of a stack storing partially completed constituents, a buffer storing already-generated terminals, and a list of past actions. It generates a sentence $$x$$ and its phrase-structure tree $$y$$ simultaneously. The action sequence $$\boldsymbol{a} = \langle a_1,\ldots,a_n \rangle$$ that generates $$(x, y)$$ is called the oracle.
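The generator's state machine can be sketched in plain Python using the paper's three action types (NT opens a nonterminal, GEN emits a terminal, REDUCE closes the most recent open nonterminal). The string-based tree building below is my own simplification; the real model scores each action with RNNs:

```python
def run_oracle(actions):
    """Replay an RNNG generation oracle: a list of (action, argument) pairs.
    Returns the generated terminals and the completed bracketed tree."""
    stack, terminals = [], []
    for act, arg in actions:
        if act == "NT":                      # open a nonterminal, e.g. (NP
            stack.append(("open", arg))
        elif act == "GEN":                   # generate a terminal word
            stack.append(("done", arg))
            terminals.append(arg)
        elif act == "REDUCE":                # close the most recent open NT
            children = []
            while stack[-1][0] != "open":
                children.append(stack.pop()[1])
            label = stack.pop()[1]
            subtree = "(%s %s)" % (label, " ".join(reversed(children)))
            stack.append(("done", subtree))
    return terminals, stack[0][1]


words, tree = run_oracle([
    ("NT", "S"), ("NT", "NP"), ("GEN", "the"), ("GEN", "hungry"),
    ("GEN", "cat"), ("REDUCE", None), ("NT", "VP"), ("GEN", "meows"),
    ("REDUCE", None), ("REDUCE", None),
])
assert words == ["the", "hungry", "cat", "meows"]
assert tree == "(S (NP the hungry cat) (VP meows))"
```

The oracle produces the sentence $$x$$ (the terminals) and its tree $$y$$ in a single pass, which is exactly why the model can define a joint distribution over both.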

1. C. Dyer, A. Kuncoro, M. Ballesteros, and N. A. Smith, “Recurrent Neural Network Grammars,” arXiv:1602.07776, 2016.

2. N. Chomsky, Syntactic Structures. The Hague: Mouton and Company, 1957.

2017/11/4 posted in  NLP

# Markdown syntax guide and writing on Markdown

## Philosophy

Markdown is intended to be as easy-to-read and easy-to-write as is feasible.
Readability, however, is emphasized above all else. A Markdown-formatted document should be publishable as-is, as plain text, without looking like it's been marked up with tags or formatting instructions.
Markdown's syntax is intended for one purpose: to be used as a format for writing for the web.

2017/4/18 posted in  Other