cntk.layers.models.attention module

Standard attention model.

AttentionModel(attention_dim, attention_span=None, attention_axis=None, init=glorot_uniform(), go_backwards=False, enable_self_stabilization=True, name='')

Layer factory function to create a function object that implements an attention model as described in Bahdanau et al., “Neural Machine Translation by Jointly Learning to Align and Translate” (ICLR 2015).
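The additive (Bahdanau-style) attention that this layer implements can be sketched in plain NumPy. This is an illustrative sketch of the underlying computation, not the CNTK implementation itself; the weight matrices `W_h`, `W_s` and the vector `v` (which the layer learns internally, with `attention_dim` as the projection size) are supplied explicitly here for clarity:

```python
import numpy as np

def additive_attention(encoder_states, decoder_state, W_h, W_s, v):
    """Bahdanau-style additive attention.

    encoder_states: (T, enc_dim) hidden states of the encoder
    decoder_state:  (dec_dim,)   current decoder state
    W_h, W_s, v:    learned parameters projecting into attention_dim
    Returns the context vector and the attention weights.
    """
    # score_i = v . tanh(W_h @ h_i + W_s @ s)  -- a small MLP per position
    proj = np.tanh(encoder_states @ W_h.T + decoder_state @ W_s.T)  # (T, attn_dim)
    scores = proj @ v                                               # (T,)
    # softmax over encoder positions (numerically stabilized)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # context vector: attention-weighted sum of encoder states
    context = weights @ encoder_states                              # (enc_dim,)
    return context, weights
```

In CNTK, the returned function object takes the encoder hidden states and the current decoder state and produces the context vector directly, computing the equivalent of the above at each decoding step.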