
2018-01-16 testing more complex trajectories

A standard MotionCloud may be considered as a control stimulus; it seems more interesting to explore more complex trajectories. Let's start with the classical Motion Cloud:

In [1]:
name = 'trajectory'
import os
import numpy as np
import MotionClouds as mc
fx, fy, ft = mc.get_grids(mc.N_X, mc.N_Y, mc.N_frame)
In [2]:
name_ = name + '_dense'
seed = 42
mc1 = mc.envelope_gabor(fx, fy, ft)
mc.figures(mc1, name_, seed=seed)
mc.in_show_video(name_)

The information is distributed densely in space and time.
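
As a quick sanity check of that density (a sketch reusing the envelope defined above; the exact array shape returned by mc.random_cloud is assumed here), one can generate the movie directly and verify that the RMS contrast is roughly constant across frames:

movie = mc.rectif(mc.random_cloud(mc1, seed=seed))
print(movie.shape)                 # expected: (mc.N_X, mc.N_Y, mc.N_frame)
print(movie.std(axis=(0, 1))[:5])  # per-frame contrast, roughly constant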

one definition of a trajectory

It is also possible to show the impulse response ("texton") corresponding to this particular texture (be patient, as you need to watch a full period to see it):

In [3]:
name_ = name + '_impulse'
seed = 42
mc1 = mc.envelope_gabor(fx, fy, ft)
mc.figures(mc1, name_, seed=seed, impulse=True)
mc.in_show_video(name_)
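
Assuming that impulse=True amounts to replacing the random phase spectrum by a flat (zero) phase, the same texton can also be computed by hand as the inverse Fourier transform of the envelope (a sketch with plain numpy, not the library's own code path):

texton = np.fft.fftshift(np.fft.ifftn(np.fft.ifftshift(mc1))).real
texton /= np.abs(texton).max()  # normalize for display
print(texton.shape)             # the texton is centered in space and time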

To generate a trajectory, we simply convolve this impulse response with a trajectory defined as a binary profile in space and time:

In [4]:
name_ = name + '_straight'
seed = 42
x, y, t = fx+.5, fy+.5, ft+.5
width_y, width_x = 0.01, 0.005
events = 1. * (np.abs(y - .5) < width_y) * (np.abs(x - t) < width_x)
mc1 = mc.envelope_gabor(fx, fy, ft)
mc.figures(mc1, name_, seed=seed, events=events)
mc.in_show_video(name_)
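
Under the hood, this convolution can equivalently be read as a product in the Fourier domain. A hand-rolled sketch with plain numpy (assuming that events= simply filters the binary spike train by the envelope, which may differ from the library's exact normalization):

F_events = np.fft.fftn(events)              # spectrum of the binary spike train
F_movie = np.fft.ifftshift(mc1) * F_events  # filter it by the (de-centered) envelope
movie_by_hand = mc.rectif(np.fft.ifftn(F_movie).real)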

It is possible to make this trajectory noisy:

In [5]:
name_ = name + '_noisy'
noise_x = 0.02
noise = noise_x * np.random.randn(1, 1, mc.N_frame)
events = 1. * (np.abs(y - .5) < width_y) * (np.abs(x + noise - t) < width_x)
mc1 = mc.envelope_gabor(fx, fy, ft)
mc.figures(mc1, name_, seed=seed, events=events)
mc.in_show_video(name_)
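
Note that the seed argument presumably only fixes the random phase of the cloud, while the jitter above comes from an unseeded call to np.random.randn and thus changes between runs. A minimal sketch to make the trajectory reproducible as well:

np.random.seed(seed)  # also seed numpy so that the jitter is reproducible
noise = noise_x * np.random.randn(1, 1, mc.N_frame)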

Finally, it is possible to make the amplitude (and phase) of the texton change as a function of time:

In [6]:
name_ = name + '_noisier'
noise = noise_x * np.random.randn(1, 1, mc.N_frame)
events = 1. * (np.abs(y - .5) < width_y) * (np.abs(x + noise - t) < width_x)
A_noise_x = 0.02
A_noise = A_noise_x * np.random.randn(1, 1, mc.N_frame)
phase_noise = 2 * np.pi * np.random.rand(1, 1, mc.N_frame)
A_noise = np.cumsum(A_noise, axis=-1) / np.sqrt(t+1)
phase_noise = np.cumsum(phase_noise, axis=-1)
mc1 = mc.envelope_gabor(fx, fy, ft)
mc.figures(mc1, name_, seed=seed, events=A_noise*np.exp(phase_noise*1j)*events)
mc.in_show_video(name_)
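
Both cumulative sums turn the white noise into random walks, whose excursions grow over time: the standard deviation of the sum of n independent unit-variance samples grows as the square root of n, which is presumably what the division by np.sqrt(t+1) compensates for. A quick numerical sketch of that scaling:

walk = np.cumsum(np.random.randn(10000, mc.N_frame), axis=-1)
sd = walk.std(axis=0)
print(sd[0], sd[-1], np.sqrt(mc.N_frame))  # the last two values are close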

addition of the trajectory to the incoherent noise

It is now possible to add this trajectory to any kind of background, such as a background texture made of the same "texton" but with a null average motion (V_X=0):

In [7]:
name_ = name + '_overlay'
movie_coh = mc.rectif(mc.random_cloud(mc1, seed=seed, events=A_noise*np.exp(phase_noise*1j)*events))
mc0 = mc.envelope_gabor(fx, fy, ft, V_X=0)
movie_unc = mc.rectif(mc.random_cloud(mc0, seed=seed+1))
rho_coh = .9
mc.anim_save(rho_coh*movie_coh+(1-rho_coh)*movie_unc, os.path.join(mc.figpath, name_))
mc.in_show_video(name_)
In [8]:
name_ = name + '_overlay_difficult'
rho_coh = .5
mc.anim_save(rho_coh*movie_coh+(1-rho_coh)*movie_unc, os.path.join(mc.figpath, name_))
mc.in_show_video(name_)
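
The same recipe can be swept over a range of coherence levels to build a graded series of stimuli, e.g. to titrate detection difficulty (a sketch; the naming scheme is ours):

for rho_coh in [.9, .7, .5, .3]:
    name_ = name + '_overlay_rho%.2f' % rho_coh
    mc.anim_save(rho_coh*movie_coh + (1-rho_coh)*movie_unc, os.path.join(mc.figpath, name_))
    mc.in_show_video(name_)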

Though it is difficult to spot the coherent pattern in any single frame, it is readily detected thanks to its coherent motion (see the work of Watamaniuk, McKee and colleagues).