# DistrictDataLabs/yellowbrick :: yellowbrick/model_selection/validation_curve.py
# yellowbrick.model_selection.validation_curve
# Implements a visual validation curve for a hyperparameter.
#
# Author: Benjamin Bengfort
# Created: Sat Mar 31 06:27:28 2018 -0400
#
# Copyright (C) 2018 The scikit-yb developers
# For license information, see LICENSE.txt
#
# ID: validation_curve.py [c5355ee] benjamin@bengfort.com $
"""
Implements a visual validation curve for a hyperparameter.
"""
##########################################################################
# Imports
##########################################################################
import numpy as np
from yellowbrick.base import ModelVisualizer
from yellowbrick.style import resolve_colors
from yellowbrick.exceptions import YellowbrickValueError
from sklearn.model_selection import validation_curve as sk_validation_curve
##########################################################################
# ValidationCurve visualizer
##########################################################################
class ValidationCurve(ModelVisualizer):
"""
Visualizes the validation curve for both test and training data for a
range of values for a single hyperparameter of the model. Adjusting the
value of a hyperparameter adjusts the complexity of a model. Less complex
models suffer from increased error due to bias, while more complex models
suffer from increased error due to variance. By inspecting the training
and cross-validated test score error, it is possible to estimate a good
value for a hyperparameter that balances the bias/variance trade-off.
The visualizer evaluates cross-validated training and test scores for the
different hyperparameters supplied. The curve is plotted so that the
x-axis is the value of the hyperparameter and the y-axis is the model
score. This is similar to a grid search with a single hyperparameter.
The cross-validation generator splits the dataset k times, and scores are
averaged over all k runs for the training and test subsets. The curve
plots the mean score, and the filled in area suggests the variability of
cross-validation by plotting one standard deviation above and below the
mean for each split.
Parameters
----------
estimator : a scikit-learn estimator
An object that implements ``fit`` and ``predict``, can be a
classifier, regressor, or clusterer so long as there is also a valid
associated scoring metric.
Note that the object is cloned for each validation.
param_name : string
Name of the parameter that will be varied.
param_range : array-like, shape (n_values,)
The values of the parameter that will be evaluated.
ax : matplotlib.Axes object, optional
The axes object to plot the figure on.
logx : boolean, optional
If True, plots the x-axis with a logarithmic scale.
groups : array-like, with shape (n_samples,)
Optional group labels for the samples used while splitting the dataset
into train/test sets.
cv : int, cross-validation generator or an iterable, optional
Determines the cross-validation splitting strategy.
Possible inputs for cv are:
- None, to use the default 3-fold cross-validation,
- integer, to specify the number of folds,
- an object to be used as a cross-validation generator,
- an iterable yielding train/test splits.
see the scikit-learn
`cross-validation guide <https://bit.ly/2MMQAI7>`_
for more information on the possible strategies that can be used here.
scoring : string, callable or None, optional, default: None
A string or scorer callable object / function with signature
``scorer(estimator, X, y)``. See scikit-learn model evaluation
documentation for names of possible metrics.
n_jobs : integer, optional
Number of jobs to run in parallel (default 1).
pre_dispatch : integer or string, optional
Number of predispatched jobs for parallel execution (default is
all). The option can reduce the allocated memory. The string can
be an expression like '2*n_jobs'.
kwargs : dict
Keyword arguments that are passed to the base class and may influence
the visualization as defined in other Visualizers.
Attributes
----------
train_scores_ : array, shape (n_ticks, n_cv_folds)
Scores on training sets.
train_scores_mean_ : array, shape (n_ticks,)
Mean training data scores for each training split
train_scores_std_ : array, shape (n_ticks,)
Standard deviation of training data scores for each training split
test_scores_ : array, shape (n_ticks, n_cv_folds)
Scores on test set.
test_scores_mean_ : array, shape (n_ticks,)
Mean test data scores for each test split
test_scores_std_ : array, shape (n_ticks,)
Standard deviation of test data scores for each test split
Examples
--------
>>> import numpy as np
>>> from yellowbrick.model_selection import ValidationCurve
>>> from sklearn.svm import SVC
>>> pr = np.logspace(-6,-1,5)
>>> model = ValidationCurve(SVC(), param_name="gamma", param_range=pr)
>>> model.fit(X, y)
>>> model.show()
Notes
-----
This visualizer is essentially a wrapper for the
``sklearn.model_selection.validation_curve`` utility, discussed in the
`validation curves <https://bit.ly/2KlumeB>`__
documentation.
.. seealso:: The documentation for the
`validation_curve <https://bit.ly/2Yz9sBB>`__
function, which this visualizer wraps.
"""
def __init__(
self,
estimator,
param_name,
param_range,
ax=None,
logx=False,
groups=None,
cv=None,
scoring=None,
n_jobs=1,
pre_dispatch="all",
**kwargs
):
# Initialize the model visualizer
super(ValidationCurve, self).__init__(estimator, ax=ax, **kwargs)
# Validate the param_range
param_range = np.asarray(param_range)
if param_range.ndim != 1:
raise YellowbrickValueError(
"must specify array of param values, '{}' is not valid".format(
repr(param_range)
)
)
# Set the visual and validation curve parameters on the estimator
self.param_name = param_name
self.param_range = param_range
self.logx = logx
self.groups = groups
self.cv = cv
self.scoring = scoring
self.n_jobs = n_jobs
self.pre_dispatch = pre_dispatch
def fit(self, X, y=None):
"""
Fits the validation curve with the wrapped estimator and parameter
array to the specified data. Draws training and test score curves and
saves the scores to the visualizer.
Parameters
----------
X : array-like, shape (n_samples, n_features)
Training vector, where n_samples is the number of samples and
n_features is the number of features.
y : array-like, shape (n_samples) or (n_samples, n_features), optional
Target relative to X for classification or regression;
None for unsupervised learning.
Returns
-------
self : instance
Returns the instance of the validation curve visualizer for use in
pipelines and other sequential transformers.
"""
# arguments to pass to sk_validation_curve
skvc_kwargs = {
key: self.get_params()[key]
for key in (
"param_name",
"param_range",
"groups",
"cv",
"scoring",
"n_jobs",
"pre_dispatch",
)
}
# compute the validation curve and store scores
curve = sk_validation_curve(self.estimator, X, y, **skvc_kwargs)
self.train_scores_, self.test_scores_ = curve
# compute the mean and standard deviation of the training data
self.train_scores_mean_ = np.mean(self.train_scores_, axis=1)
self.train_scores_std_ = np.std(self.train_scores_, axis=1)
# compute the mean and standard deviation of the test data
self.test_scores_mean_ = np.mean(self.test_scores_, axis=1)
self.test_scores_std_ = np.std(self.test_scores_, axis=1)
# draw the curves on the current axes
self.draw()
return self
def draw(self, **kwargs):
"""
Renders the training and test curves.
"""
# Specify the curves to draw and their labels
labels = ("Training Score", "Cross Validation Score")
curves = (
(self.train_scores_mean_, self.train_scores_std_),
(self.test_scores_mean_, self.test_scores_std_),
)
# Get the colors for the train and test curves
colors = resolve_colors(n_colors=2)
# Plot the fill betweens first so they are behind the curves.
for idx, (mean, std) in enumerate(curves):
# Plot one standard deviation above and below the mean
self.ax.fill_between(
self.param_range, mean - std, mean + std, alpha=0.25, color=colors[idx]
)
# Plot the mean curves so they are in front of the variance fill
for idx, (mean, _) in enumerate(curves):
self.ax.plot(
self.param_range, mean, "d-", color=colors[idx], label=labels[idx]
)
if self.logx:
self.ax.set_xscale("log")
return self.ax
def finalize(self, **kwargs):
"""
Add the title, legend, and other visual final touches to the plot.
"""
# Set the title of the figure
self.set_title("Validation Curve for {}".format(self.name))
# Add the legend
self.ax.legend(frameon=True, loc="best")
# Set the axis labels
self.ax.set_xlabel(self.param_name)
self.ax.set_ylabel("score")
##########################################################################
# Quick Method
##########################################################################
def validation_curve(
estimator,
X,
y,
param_name,
param_range,
ax=None,
logx=False,
groups=None,
cv=None,
scoring=None,
n_jobs=1,
pre_dispatch="all",
show=True,
**kwargs
):
"""
Displays a validation curve for the specified param and values, plotting
both the train and cross-validated test scores. The validation curve is a
visual, single-parameter grid search used to tune a model to find the best
balance between error due to bias and error due to variance.
This helper function is a wrapper to use the ValidationCurve in a fast,
visual analysis.
Parameters
----------
estimator : a scikit-learn estimator
An object that implements ``fit`` and ``predict``, can be a
classifier, regressor, or clusterer so long as there is also a valid
associated scoring metric.
Note that the object is cloned for each validation.
X : array-like, shape (n_samples, n_features)
Training vector, where n_samples is the number of samples and
n_features is the number of features.
y : array-like, shape (n_samples) or (n_samples, n_features), optional
Target relative to X for classification or regression;
None for unsupervised learning.
param_name : string
Name of the parameter that will be varied.
param_range : array-like, shape (n_values,)
The values of the parameter that will be evaluated.
ax : matplotlib.Axes object, optional
The axes object to plot the figure on.
logx : boolean, optional
If True, plots the x-axis with a logarithmic scale.
groups : array-like, with shape (n_samples,)
Optional group labels for the samples used while splitting the dataset
into train/test sets.
cv : int, cross-validation generator or an iterable, optional
Determines the cross-validation splitting strategy.
Possible inputs for cv are:
- None, to use the default 3-fold cross-validation,
- integer, to specify the number of folds,
- an object to be used as a cross-validation generator,
- an iterable yielding train/test splits.
see the scikit-learn
`cross-validation guide <https://bit.ly/2MMQAI7>`_
for more information on the possible strategies that can be used here.
scoring : string, callable or None, optional, default: None
A string or scorer callable object / function with signature
``scorer(estimator, X, y)``. See scikit-learn model evaluation
documentation for names of possible metrics.
n_jobs : integer, optional
Number of jobs to run in parallel (default 1).
pre_dispatch : integer or string, optional
Number of predispatched jobs for parallel execution (default is
all). The option can reduce the allocated memory. The string can
be an expression like '2*n_jobs'.
show : bool, default: True
If True, calls ``show()``, which in turn calls ``plt.show()``; however,
you cannot call ``plt.savefig`` from this signature, nor
``clear_figure``. If False, simply calls ``finalize()``.
kwargs : dict
Keyword arguments that are passed to the base class and may influence
the visualization as defined in other Visualizers. These arguments are
also passed to the ``show()`` method, e.g. can pass a path to save the
figure to.
Returns
-------
visualizer : ValidationCurve
The fitted visualizer
"""
# Initialize the visualizer
oz = ValidationCurve(
estimator,
param_name,
param_range,
ax=ax,
logx=logx,
groups=groups,
cv=cv,
scoring=scoring,
n_jobs=n_jobs,
pre_dispatch=pre_dispatch,
)
# Fit the visualizer
oz.fit(X, y)
# Draw final visualization
if show:
oz.show(**kwargs)
else:
oz.finalize()
# Return the visualizer object
return oz
# license: apache-2.0
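The fold-wise aggregation that ``fit`` performs above (a mean and one standard deviation per hyperparameter tick, which ``draw`` then plots as a curve with a filled band) can be sketched in isolation. The score matrix below is synthetic and stands in for the ``train_scores_``/``test_scores_`` attributes:

```python
import numpy as np

# Synthetic CV scores: 5 hyperparameter ticks x 3 folds,
# standing in for the train_scores_ / test_scores_ attributes.
rng = np.random.RandomState(42)
scores = rng.uniform(0.6, 0.9, size=(5, 3))

# Per-tick mean and standard deviation, as computed in fit()
mean = scores.mean(axis=1)
std = scores.std(axis=1)

# draw() plots `mean` and fills between one std above and below it
lower, upper = mean - std, mean + std
assert mean.shape == (5,)
```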
# rzzzwilson/morse :: morse/gui.py
# class taken from the SciPy 2015 Vispy talk opening example
# see https://github.com/vispy/vispy/pull/928
import sys
import threading
import atexit
import numpy as np
import pyaudio
from matplotlib.backends.backend_qt5agg import FigureCanvasQTAgg as FigureCanvas
from matplotlib.backends.backend_qt5agg import NavigationToolbar2QT as NavigationToolbar
import matplotlib.pyplot as plt
from PyQt5 import QtCore
from PyQt5.QtWidgets import QApplication, QWidget, QHBoxLayout, QLabel, QCheckBox, QSlider, QVBoxLayout
def movingaverage(interval, window_size):
window = np.ones(int(window_size))/float(window_size)
return np.convolve(interval, window, 'same')
class MplFigure(object):
def __init__(self, parent):
self.figure = plt.figure(facecolor='white')
self.canvas = FigureCanvas(self.figure)
self.toolbar = NavigationToolbar(self.canvas, parent)
class MicrophoneRecorder(object):
def __init__(self, rate=4000, chunksize=1024):
self.rate = rate
self.chunksize = chunksize
self.p = pyaudio.PyAudio()
self.stream = self.p.open(format=pyaudio.paInt16,
channels=1,
rate=self.rate,
input=True,
frames_per_buffer=self.chunksize,
stream_callback=self.new_frame)
self.lock = threading.Lock()
self.stop = False
self.frames = []
atexit.register(self.close)
def new_frame(self, data, frame_count, time_info, status):
data = np.frombuffer(data, dtype='int16')
with self.lock:
self.frames.append(data)
if self.stop:
return None, pyaudio.paComplete
return None, pyaudio.paContinue
def get_frames(self):
with self.lock:
frames = self.frames
self.frames = []
return frames
def start(self):
self.stream.start_stream()
def close(self):
with self.lock:
self.stop = True
self.stream.close()
self.p.terminate()
class LiveFFTWidget(QWidget):
def __init__(self):
QWidget.__init__(self)
# customize the UI
self.initUI()
# init class data
self.initData()
# connect slots
self.connectSlots()
# init MPL widget
self.initMplWidget()
def initUI(self):
hbox_gain = QHBoxLayout()
autoGain = QLabel('Auto gain for frequency spectrum')
autoGainCheckBox = QCheckBox(checked=False)
hbox_gain.addWidget(autoGain)
hbox_gain.addWidget(autoGainCheckBox)
# reference to checkbox
self.autoGainCheckBox = autoGainCheckBox
hbox_fixedGain = QHBoxLayout()
fixedGain = QLabel('Manual gain level for frequency spectrum')
fixedGainSlider = QSlider(QtCore.Qt.Horizontal)
fixedGainSlider.setValue(100)
hbox_fixedGain.addWidget(fixedGain)
hbox_fixedGain.addWidget(fixedGainSlider)
self.fixedGainSlider = fixedGainSlider
vbox = QVBoxLayout()
vbox.addLayout(hbox_gain)
vbox.addLayout(hbox_fixedGain)
# mpl figure
self.main_figure = MplFigure(self)
vbox.addWidget(self.main_figure.toolbar)
vbox.addWidget(self.main_figure.canvas)
self.setLayout(vbox)
self.setGeometry(300, 300, 350, 300)
self.setWindowTitle('LiveFFT')
self.show()
# timer for callbacks, taken from:
# http://ralsina.me/weblog/posts/BB974.html
timer = QtCore.QTimer()
timer.timeout.connect(self.handleNewData)
timer.start(100)
# keep reference to timer
self.timer = timer
def initData(self):
# mic = MicrophoneRecorder()
mic = MicrophoneRecorder(rate=8000, chunksize=4096)
mic.start()
# keeps reference to mic
self.mic = mic
# computes the parameters that will be used during plotting
self.freq_vect = np.fft.rfftfreq(mic.chunksize, 1./mic.rate)
self.time_vect = np.arange(mic.chunksize, dtype=np.float32) / mic.rate * 1000
def connectSlots(self):
pass
def initMplWidget(self):
"""creates initial matplotlib plots in the main window and keeps
references for further use"""
# top plot
self.ax_top = self.main_figure.figure.add_subplot(211)
# self.ax_top.set_ylim(-32768, 32768)
self.ax_top.set_ylim(0, 32768)
self.ax_top.set_xlim(0, self.time_vect.max())
self.ax_top.set_xlabel(u'time (ms)', fontsize=6)
# bottom plot
self.ax_bottom = self.main_figure.figure.add_subplot(212)
self.ax_bottom.set_ylim(0, 1)
self.ax_bottom.set_xlim(0, self.freq_vect.max())
self.ax_bottom.set_xlabel(u'frequency (Hz)', fontsize=6)
# line objects
self.line_top, = self.ax_top.plot(self.time_vect, np.ones_like(self.time_vect))
self.line_bottom, = self.ax_bottom.plot(self.freq_vect, np.ones_like(self.freq_vect))
def handleNewData(self):
""" handles the asynchroneously collected sound chunks """
# gets the latest frames
frames = self.mic.get_frames()
if len(frames) > 0:
# keeps only the last frame
current_frame = frames[-1]
# smooth frame
current_frame = np.absolute(current_frame)
current_frame = movingaverage(current_frame, 32)
# plots the time signal
self.line_top.set_data(self.time_vect, current_frame)
# computes and plots the fft signal
fft_frame = np.fft.rfft(current_frame)
if self.autoGainCheckBox.checkState() == QtCore.Qt.Checked:
fft_frame /= np.abs(fft_frame).max()
else:
fft_frame *= (1 + self.fixedGainSlider.value()) / 5000000.
#print(np.abs(fft_frame).max())
self.line_bottom.set_data(self.freq_vect, np.abs(fft_frame))
# refreshes the plots
self.main_figure.canvas.draw()
if __name__ == "__main__":
app = QApplication(sys.argv)
window = LiveFFTWidget()
sys.exit(app.exec_())
# license: mit
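``handleNewData`` above smooths each rectified audio chunk with ``movingaverage`` before plotting. A quick check of that helper (restated so the snippet stands alone) shows how a spike is spread across the window while ``'same'`` mode preserves the input length:

```python
import numpy as np

def movingaverage(interval, window_size):
    # Uniform window of the given size, normalized to sum to 1
    window = np.ones(int(window_size)) / float(window_size)
    return np.convolve(interval, window, 'same')

x = np.array([0.0, 0.0, 3.0, 0.0, 0.0])
y = movingaverage(x, 3)
assert y.shape == x.shape  # 'same' keeps the input length
assert np.allclose(y, [0.0, 1.0, 1.0, 1.0, 0.0])  # spike spread over 3 taps
```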
# alancucki/multipoint_tsne :: multipoint_tsne.py
import datetime as dt
import logging
import sys
from collections import defaultdict
import numpy as np
import theano
import theano.tensor as tt
from scipy.spatial.distance import squareform
from sklearn import utils
from sklearn.decomposition import PCA
from sklearn.manifold import t_sne#, _barnes_hut_tsne
from sklearn.metrics.pairwise import pairwise_distances
from sklearn.neighbors import BallTree
import opt
import pyximport
pyximport.install(reload_support=True)
import _barnes_hut_tsne
import _barnes_hut_mptsne
logger = logging.getLogger(__name__)
logging.basicConfig(format="[%(asctime)s] %(message)s", datefmt="%H:%M",
level=logging.INFO, stream=sys.stdout)
def pdist2(X, Y=None, lib=np):
"""Computes a matrix of squared distances.
Args:
X: Matrix, each row is a datapoint.
Y: (optional) Matrix of datapoints.
lib: (optional) Module computing the expression (numpy or Theano)
"""
sum_X = lib.sum(lib.square(X), 1)
if Y is None:
return (-2 * lib.dot(X, X.T) + sum_X).T + sum_X
sum_Y = lib.sum(lib.square(Y), 1)
return (-2 * lib.dot(Y, X.T) + sum_X).T + sum_Y
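A small sanity check of ``pdist2`` (restated here so the snippet is self-contained): with a single 3-4-5 pair the squared distance should be 25, and the two-argument form should return an ``(n_X, n_Y)`` matrix of squared cross distances:

```python
import numpy as np

def pdist2(X, Y=None, lib=np):
    # ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b, vectorized
    sum_X = lib.sum(lib.square(X), 1)
    if Y is None:
        return (-2 * lib.dot(X, X.T) + sum_X).T + sum_X
    sum_Y = lib.sum(lib.square(Y), 1)
    return (-2 * lib.dot(Y, X.T) + sum_X).T + sum_Y

X = np.array([[0.0, 0.0], [3.0, 4.0]])
D = pdist2(X)
assert np.allclose(D, [[0.0, 25.0], [25.0, 0.0]])  # ||(3,4)||^2 = 25

Y = np.array([[1.0, 0.0]])
C = pdist2(X, Y)  # rows index X, columns index Y
assert C.shape == (2, 1)
assert np.allclose(C, [[1.0], [20.0]])  # (3-1)^2 + 4^2 = 20
```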
def pdist2_with_copies(n_in, Y, full_copy_mask):
"""Computes a matrix of squared distances.
Points in Y might have copies. If y_i and y_j have copies, then the computed
distance should be the closest distance of any of their copies.
If Y has N+C points, where C is the number of copies, returned matrix
should be N x N.
Args:
Y: Matrix of datapoints.
full_copy_mask: List of N+C indices describing which datapoint has been
copied.
"""
if full_copy_mask is None:
return pdist2(Y)
count_dict = defaultdict(int)
layer_specs = defaultdict(list)
for ind,proto in enumerate(full_copy_mask):
count_dict[proto] += 1
l = count_dict[proto]
layer_specs[l].append((ind, proto))
# Sort the last layer. Each preceding layer should have elements
# from the previous layer in the same order at the end,
# the rest at the beginning, e.g.:
# layer#1: [0, 1, 4, 3, 5, 2]
# layer#2: [4, 3, 5, 2]
# layer#3: [3, 5, 2]
# layer#4: [2]
nlayers = len(layer_specs.keys())
for l in range(1, nlayers + 1):
layer_specs[l] = sorted(
layer_specs[l],
key=lambda (ind, proto): count_dict[proto] * 1000000 + proto)
# Compute pdist between original pts
inds = list(zip(*layer_specs[1])[0])
assert len(inds) == n_in
Layer = Y[inds]
YY = pdist2(Layer, lib=tt)
# Update it with pdists from copies using tt.minimum
for i in xrange(1, nlayers + 1):
for j in xrange(i, nlayers + 1):
if i == j == 1:
continue
layer1_inds = list(zip(*layer_specs[i])[0])
layer2_inds = list(zip(*layer_specs[j])[0])
Layer1 = Y[layer1_inds]
Layer2 = Y[layer2_inds]
sz_layer1 = Layer1.shape[0]
sz_layer2 = Layer2.shape[0]
YK = pdist2(Layer1, Layer2, lib=tt)
YY = tt.set_subtensor(YY[-sz_layer1:, -sz_layer2:],
tt.minimum(YY[-sz_layer1:, -sz_layer2:], YK))
if i != j:
YY = tt.set_subtensor(YY[-sz_layer2:, -sz_layer1:],
tt.minimum(YY[-sz_layer2:, -sz_layer1:], YK.T))
pdist_Y = YY
# Shuffle it back, so the order would match P.
inv_perm = np.argsort(inds)
pdist_Y = pdist_Y[inv_perm,:][:,inv_perm]
return pdist_Y
def Q_graph(pdist2):
Q = (1.0 / (1.0 + pdist2))
Q = tt.set_subtensor(
Q[tt.arange(Q.shape[0]), tt.arange(Q.shape[0])], 0.0)
Q /= Q.sum()
Q = tt.maximum(Q, 1e-12)
return Q
def full_copy_mask__to__next_repr_pos_idx(fcm, N):
ret = np.ones(fcm.shape, dtype=np.int64) * -1
num_reprs = np.ones(N, dtype=np.int32)
next_idx_for_label = np.arange(N)
for idx in range(N, fcm.shape[0]):
label = fcm[idx]
num_reprs[label] += 1
ret[next_idx_for_label[label]] = idx
next_idx_for_label[label] = idx
return ret, num_reprs
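The encoding built above can be illustrated on a tiny mask (the function is restated so the snippet runs standalone): each original point's chain of representative positions is threaded through ``ret`` as a singly linked list terminated by ``-1``, with ``num_reprs`` counting representatives per original point:

```python
import numpy as np

def full_copy_mask__to__next_repr_pos_idx(fcm, N):
    # -1 marks "no further representative" for each position
    ret = np.ones(fcm.shape, dtype=np.int64) * -1
    num_reprs = np.ones(N, dtype=np.int32)
    next_idx_for_label = np.arange(N)
    for idx in range(N, fcm.shape[0]):
        label = fcm[idx]
        num_reprs[label] += 1
        # Link the previous representative of `label` to this position
        ret[next_idx_for_label[label]] = idx
        next_idx_for_label[label] = idx
    return ret, num_reprs

# 3 original points; point 1 has two extra copies at positions 3 and 4
fcm = np.array([0, 1, 2, 1, 1])
nxt, num = full_copy_mask__to__next_repr_pos_idx(fcm, 3)
# Chain for point 1: position 1 -> 3 -> 4 -> end (-1)
assert nxt.tolist() == [-1, 3, -1, 4, -1]
assert num.tolist() == [1, 3, 1]
```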
def _prepare_data(x, p, perplexity, initial_dims, compute_neighbors_nn=False):
neighbors_nn = None
if x is None and p is None:
raise ValueError('Both supplied X and P are None.')
if not x is None and not p is None:
raise ValueError('Supplied both X and P. Please supply one.')
if p is None:
logger.info('Applying PCA')
time0 = dt.datetime.now()
if x.shape[1] > initial_dims:
x = PCA(n_components=initial_dims).fit_transform(x)
time1 = dt.datetime.now()
if (time1 - time0).seconds > 10:
logging.info('Took %d seconds', (time1 - time0).seconds)
logging.info('Computing pairwise dists...')
time0 = dt.datetime.now()
distances = pairwise_distances(x, metric='euclidean', squared=True)
n_samples = x.shape[0]
k = min(n_samples - 1, int(3. * perplexity + 1))
time1 = dt.datetime.now()
if (time1 - time0).seconds > 10:
logging.info('Took %d seconds', (time1 - time0).seconds)
logging.info('Computing NNs...')
time0 = dt.datetime.now()
bt = BallTree(x)
distances_nn, neighbors_nn = bt.query(x, k=k + 1)
neighbors_nn = neighbors_nn[:, 1:]
time1 = dt.datetime.now()
if (time1 - time0).seconds > 10:
logging.info('Took %d seconds', (time1 - time0).seconds)
logging.info('Computing P...')
time0 = dt.datetime.now()
P_sparse = t_sne._joint_probabilities_nn(distances, neighbors_nn,
perplexity, verbose=False)
time1 = dt.datetime.now()
if (time1 - time0).seconds > 10:
logging.info('Took %d seconds', (time1 - time0).seconds)
logging.info('Computing squareform P...')
time0 = dt.datetime.now()
p = squareform(P_sparse).astype(np.float32)
time1 = dt.datetime.now()
if (time1 - time0).seconds > 10:
logger.info('Took %d seconds', (time1 - time0).seconds)
if compute_neighbors_nn and neighbors_nn is None:
n_in = x.shape[0] if p is None else p.shape[0]
neighbors_nn = k_neighbors(perplexity, n_in, x=x, p=p)
return np.ascontiguousarray(p), np.ascontiguousarray(neighbors_nn)
def copy_potential(p, q, y, copy_mask, sne_grad=True):
pdist = pdist2(y)
n_in = p.shape[0]
n_out = y.shape[0]
# Iterate over nonnegative values and check,
# if the corresponding pts are the closest (copies).
if not copy_mask is None:
cm = copy_mask # CopyMask(n_in)
# cm.update(copy_mask)
p_ = p[cm.full_copy_mask, :][:, cm.full_copy_mask]
q_ = q[cm.full_copy_mask, :][:, cm.full_copy_mask]
const_terms = (p_ > q_) * (p_ - q_)
if not sne_grad:
const_terms *= (1.0 / (1.0 + pdist))
for i in xrange(n_out):
for j in xrange(n_out):
if i == j or const_terms[i,j] < 1e-14:
continue
# Check if pts for (i,j) are the closest ones.
if not cm.are_closest(y, i, j):
const_terms[i,j] = 0.0
else:
const_terms = (p > q) * (p - q)
if not sne_grad:
const_terms *= (1.0 / (1.0 + pdist))
# Now only Fattr for valid copy-copy pairs are left in const terms.
# All that's left is to multiply by y_i - y_j parts and sum up.
copy_potential_grads = np.zeros(y.shape)
for i in xrange(n_out):
g = const_terms[i][:, None] * (y[i] - y)
copy_potential_grads[i] = np.sum(g, axis=0)
# Now g holds all Fattr forces working on point i.
return -1.0 * copy_potential_grads
# Either use cython or pass neighbors; check it inside the function
def cleanup_low_pbb_mass(targets, p, y, cm, neighbors=None, mass_threshold=0.05):
before = len(cm._copy_mask)
if before == 0:
return targets
assert not neighbors is None, 'Neighbors cannot be None'
next_repr_pos_idx, num_reprs = full_copy_mask__to__next_repr_pos_idx(
cm.full_copy_mask, p.shape[0])
mass = np.zeros(y.shape[0], dtype=np.float32)
_barnes_hut_mptsne.compute_pbb_mass(p, y, neighbors, num_reprs,
next_repr_pos_idx, mass, verbose=1)
inds2del = []
for r in cm.copy_group_iter():
if len(r) == 1:
continue
m = np.asarray(map(lambda t: mass[t], r))
m /= m.sum()
inds = np.asarray(r)[m < mass_threshold]
# Note: Inds should be deleted starting from the highest!
# Deletion of protos is carried by swapping with one of copies.
# It may mess up, if a proto would be deleted first.
for i in np.sort(inds)[::-1]:
inds2del.append(i)
for i in sorted(inds2del)[::-1]:
targets = del_pt(i, p, cm, targets)
after = len(cm._copy_mask)
logger.info('Cleanup of copies: {} -> {}'.format(before, after))
return targets
def probability_mass(p, y, cm):
n = p.shape[0]
mass = np.zeros((y.shape[0]))
pd = pdist2(y)
for i in xrange(y.shape[0]):
pts_i = cm.copies_of(i)
proto_i = i if i < n else cm._copy_mask[i - n]
for j in xrange(y.shape[0]):
proto_j = j if j < n else cm._copy_mask[j - n]
if p[proto_i, proto_j] < 1e-8:
continue
pts_j = cm.copies_of(j)
# Determine if d(i,j) is the closest one for this pair
min_dist = np.min(pd[pts_i, :][:, pts_j])
if np.isclose(pd[i][j], min_dist):
mass[i] += p[proto_i][proto_j]
return mass
def del_pt(ind, p, cm, targets=[]):
"""Attempt to delete a copy of a point (or proto and replace with copy)."""
n = p.shape[0]
if ind < n:
group = cm.copies_of(ind)
if len(group) == 1:
raise ValueError('Point does not have copies to delete from')
assert(ind == group[0])
for y in targets:
swap_pt(y, ind, group[1])
ind = group[1]
# Now point 'ind' can be safely deleted.
assert(ind >= n)
cm.overwrite(np.delete(cm._copy_mask, [ind - n], axis=0))
return [np.delete(y, [ind], axis=0) for y in targets]
def swap_pt(y, ind1, ind2):
tmp_row = np.copy(y[ind1])
y[ind1] = np.copy(y[ind2])
y[ind2] = np.copy(tmp_row)
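``del_pt`` above relies on ``swap_pt`` to move a proto's row into one of its copy slots before deletion. Restated standalone, the in-place row swap behaves as:

```python
import numpy as np

def swap_pt(y, ind1, ind2):
    # Swap two rows of y in place via a temporary copy
    tmp_row = np.copy(y[ind1])
    y[ind1] = np.copy(y[ind2])
    y[ind2] = np.copy(tmp_row)

y = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
swap_pt(y, 0, 2)
assert y.tolist() == [[2.0, 2.0], [1.0, 1.0], [0.0, 0.0]]
```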
def k_neighbors(perplexity, n_in, x=None, p=None, verbose=True):
k = min(n_in - 1, int(3. * perplexity + 1))
neighbors_nn = None
if verbose:
logging.info("Computing %i nearest neighbors..." % k)
if x is None:
# Use the precomputed distances to find
# the k nearest neighbors and their distances
neighbors_nn = np.argsort(p, axis=1)[:,::-1][:, :k]
else:
raise NotImplementedError
# Find the nearest neighbors for every point
bt = BallTree(x)
# LvdM uses 3 * perplexity as the number of neighbors
# And we add one to not count the data point itself
# In the event that we have very small # of points
# set the neighbors to n - 1
distances_nn, neighbors_nn = bt.query(x, k=k + 1)
neighbors_nn = neighbors_nn[:, 1:]
return np.ascontiguousarray(neighbors_nn)
class CopyMask(object):
"""Represents a mapping of datapoints to their protos (progenitors).
Args:
n: int, initial number of datapoints (without any copies)
"""
def __init__(self, n):
self.n = n
self._copy_mask = np.ndarray((0,)).astype(np.int32)
self.proto_to_copies = defaultdict(list)
self.layer_specs = defaultdict(list)
self.update(self._copy_mask)
@property
def full_copy_mask(self):
return np.concatenate([np.arange(self.n), self._copy_mask])
@property
def num_copies(self):
return self._copy_mask.shape[0]
def max_ncopies(self):
if len(self._copy_mask) == 0:
return 1
return np.max(np.bincount(self._copy_mask)) + 1
def update(self, new_copy_mask):
self._copy_mask = np.hstack([self._copy_mask, new_copy_mask])
self._copy_mask = self.proto_only(self._copy_mask)
self.proto_to_copies = defaultdict(list)
self.layer_specs = defaultdict(list)
for ind,proto in enumerate(self.full_copy_mask):
self.proto_to_copies[proto].append(ind)
l = len(self.proto_to_copies[proto])
self.layer_specs[l].append((ind,proto))
def overwrite(self, new_copy_mask):
self._copy_mask = np.ndarray((0,)).astype(np.int32)
self.update(new_copy_mask)
def proto_only(self, new_copy_mask):
ret = np.copy(new_copy_mask)
for i,v in enumerate(ret):
ret[i] = self.full_copy_mask[v]
return ret
def copies_of(self, k):
"""All copies of pt with index k"""
proto = self.full_copy_mask[k]
ret = self.proto_to_copies[proto]
assert (k in ret)
return ret
def copy_group_iter(self):
for i in xrange(self.n):
yield self.copies_of(i)
def copy_pair_iter(self):
copy_dict = defaultdict(list)
# Set pbbs among copies.
for ind,proto in enumerate(self._copy_mask):
copy_dict[proto].append(ind + self.n)
for proto,copy_list in copy_dict.items():
copy_list.append(proto)
for c1 in copy_list:
for c2 in copy_list:
yield (c1, c2)
def are_closest(self, y, i, j):
"""Checks if datapoints y_i and y_j are the closest ones
of their corresponding copy groups.
"""
ci = self.copies_of(i)
cj = self.copies_of(j)
if len(ci) == len(cj) == 1:
return True
if i in cj:
return False
pdist = pdist2(y[ci], y[cj])
min_pt = np.unravel_index(np.argmin(pdist), (len(ci), len(cj)))
return ci[min_pt[0]] == i and cj[min_pt[1]] == j
class MultipointTSNE(object):
def __init__(self,
n_components=2,
perplexity=30.0,
early_exaggeration=4.0,
learning_rate=200.0,
optimizer='gd',
optimizer_kwargs={},
initial_dims=50,
train_schedule=[(True, 250, 0.0), (False, 750, 0.0)],
init='random',
verbose=0,
n_iter_check=100,
random_state=None,
method='barnes_hut',
angle=0.5,
num_cleanups=2,
cleanup_thresh=0.15,
cleanup_thresh_after_clonning=None):
self.ndims = n_components
self.perplexity = perplexity
self.early_exagg = early_exaggeration
optimizer_kwargs.update({'lrate': learning_rate, 'verbose': verbose,
'n_iter_check': n_iter_check})
opt_protos = {'gd': opt.GD, 'adam': opt.Adam, 'lbfgsb': opt.LBFGSB}
self.optimizer = opt_protos[optimizer](**optimizer_kwargs)
self.initial_dims = initial_dims
self.train_schedule = train_schedule
self.init_method = init
self.verbose = verbose
self.random_state = utils.check_random_state(random_state)
self.method = method
if method != 'barnes_hut':
raise NotImplementedError
self.angle = angle
self.cleanup_thresh = cleanup_thresh
if cleanup_thresh_after_clonning is None:
cleanup_thresh_after_clonning = cleanup_thresh
self.cleanup_thresh_after_clonning = cleanup_thresh_after_clonning
self.num_cleanups = num_cleanups
self.history = []
self.copy_history = [] # Iterations, in which points were clonned.
def _initial_solution(self, n, method, x=None):
if method == 'random':
return self.random_state.randn(n, self.ndims)
elif method == 'svd':
from sklearn.decomposition import TruncatedSVD
return TruncatedSVD(n_components=self.ndims).fit_transform(x)
else:
raise ValueError('Unknown initialization method ' + method)
def compute_Q(self, y=None, copy_mask=None):
if y is None:
y = self.y
if copy_mask is None and self.cm is None:
pdist2_y = pdist2(y)
else:
if copy_mask is None:
copy_mask = self.cm._copy_mask
n = y.shape[0] - len(copy_mask)
pdist2_y = pdist2_with_copies(
self.n_in, y, self.cm.full_copy_mask)
Pdist = tt.dmatrix('Pdist')
return Q_graph(Pdist).eval({Pdist: pdist2_y})
def copy_pts(self, copy_perc=None, ncopies=None, copy_potential_grads=None,
method='percent'):
"""Sums forces coming from points, for which Fattr > Frep.
j-th point is similar in original space (P[i,j] > 0 so Fattr exists).
The point does not meet similarity yet (attr stornger than rep).
"""
self.copy_history.append(len(self.history))
logger.info('Picking pts to copy')
if copy_potential_grads is None:
if self.method == 'barnes_hut':
next_repr_pos_idx, num_reprs = full_copy_mask__to__next_repr_pos_idx(
self.cm.full_copy_mask, self._P.shape[0]
)
grad = np.zeros(self.y.shape, dtype=np.float32)
copy_potential_grads = np.zeros(self.y.shape, dtype=np.float32)
_barnes_hut_mptsne.gradient_mptsne(
self._P, self.y, self.neighbors_nn, grad, copy_potential_grads,
num_reprs, next_repr_pos_idx, self.angle, self.ndims, False,
correct_cell_counts=True, dof=1.0, skip_num_points=0
)
elif method == 'exact':
q = self.compute_Q(y=self.y)
copy_potential_grads = copy_potential(self._P, q, self.y, self.cm)
else:
raise ValueError
self.copy_potential_grads = copy_potential_grads
if method == 'auto':
potential = np.sum(copy_potential_grads ** 2, axis=1)
t = 0.0005
ncopies = np.sum(potential > t)
elif method == 'percent':
# Select the top copy_perc percent of points.
ncopies = int(round(copy_perc * self._P.shape[0] / 100))
else:
raise ValueError('Unknown copy method ' + method)
init_ncopies = ncopies
if ncopies > 0:
# Reverse and select first ncopies.
potential = np.sum(self.copy_potential_grads ** 2, axis=1)
cm_update = np.argsort(potential)[::-1][:ncopies]
if method == 'percent':
method += '%d ' % copy_perc
(y_update, cm_update) = self.initialize_copies(self.y, cm_update)
self.y = np.vstack([self.y, y_update])
self.cm.update(cm_update)
ncopies = len(cm_update)
def update_param_(param_, with_avg=False):
if with_avg:
new_rows = np.tile(
np.mean(param_.reshape(-1, self.ndims), axis=0),
(ncopies, 1))
else:
new_rows = np.zeros((ncopies, self.ndims), dtype=param_.dtype)
return np.vstack([
param_.reshape(-1, self.ndims),
new_rows,
]).ravel()
if hasattr(self.optimizer, 'momentum'):
self.optimizer.update = update_param_(self.optimizer.update)
if hasattr(self.optimizer, 'm'):
self.optimizer.m = update_param_(self.optimizer.m, with_avg=True)
self.optimizer.v = update_param_(self.optimizer.v, with_avg=True)
logger.info('Initialized %d new copies (%d -> %d -> %d total, %s)' % (
ncopies, self.y.shape[0] - ncopies,
self.y.shape[0] - ncopies + init_ncopies,
self.y.shape[0], method))
def initialize_copies(self, y, cm_update):
p = self._P
n = p.shape[0]
newcopies = len(cm_update)
old_copy_mask = self.cm._copy_mask
cm_update_proto = self.cm.proto_only(cm_update)
tmp_fcm = np.concatenate([self.cm.full_copy_mask, cm_update_proto])
# Cost function for a single point
Y = tt.dmatrix('Y')
pdist_Y = pdist2_with_copies(self.n_in, Y, tmp_fcm)
Q = Q_graph(pdist_Y)
# Compute for selected points and do not sum up
KL = tt.where(abs(p) > 1e-8, p * tt.log(p / Q), 0)
cost_fun = theano.function([Y], [KL.sum(axis=1)])
y_new = np.copy(y[cm_update])
yy = np.vstack([y, y_new])
# Pick the appropriate gradients
g = self.copy_potential_grads[cm_update]
assert(g.shape[0] == newcopies)
# Normalize the gradients
g /= np.sqrt(np.sum(g ** 2, axis=1))[:,None]
std = np.std(y, axis=0)
nsteps = 100
step_size = 5.0 * std / nsteps
costs = np.zeros((nsteps, newcopies))
for i in range(nsteps):
yy[-newcopies:] += g * step_size
# Pick the ones matching these protos
costs[i] = cost_fun(yy)[0][cm_update_proto]
steps = np.argmin(costs, axis=0)
y_new += g * (steps[:, None]+1) * step_size
return (y_new, cm_update_proto)
def continue_run(self, new_train_schedule):
self.run_schedule(new_train_schedule)
self.train_schedule += new_train_schedule
def run_schedule(self, schedule):
niters = sum([stage[1] for stage in schedule])
logger.info('Running %d iterations' % niters)
def cleanup(threshold=None):
if threshold is None:
threshold = self.cleanup_thresh
if not threshold or len(self.cm.full_copy_mask) <= self.n_in:
return
for j in range(self.num_cleanups):
targets = [self.y]
if hasattr(self.optimizer, 'momentum'):
targets.append(self.optimizer.update.reshape(-1, self.ndims))
if hasattr(self.optimizer, 'm'):
targets.append(self.optimizer.m.reshape(-1, self.ndims))
targets.append(self.optimizer.v.reshape(-1, self.ndims))
targets = cleanup_low_pbb_mass(
targets, self._P, self.y, self.cm,
neighbors=self.neighbors_nn, mass_threshold=threshold
)
self.y = targets.pop(0)
if hasattr(self.optimizer, 'momentum'):
self.optimizer.update = targets.pop(0).ravel()
if hasattr(self.optimizer, 'm'):
self.optimizer.m = targets.pop(0).ravel()
self.optimizer.v = targets.pop(0).ravel()
for (exaggerate, num_iters, copy_perc) in schedule:
if copy_perc:
method = 'auto' if self.autocopy and not exaggerate else 'percent'
self.copy_pts(copy_perc=copy_perc, method=method)
cleanup(self.cleanup_thresh_after_clonning)
if exaggerate:
self._P *= self.early_exagg
self.y, err = self.optimizer.optimize(
self.y, obj_fun=self._kl_divergence_bh,
obj_kwargs=self.obj_kwargs, num_iters=num_iters)
self.y = self.y.reshape(-1, self.ndims)
print_precise_error = False
if print_precise_error:
# Log the precise (non-approximated) Theano error
Y = tt.fmatrix('Y')
obj = self.tt_cost_copy(self._P, Y, self.cm)
logger.info('Precise error: %s' % obj(self.y)[0])
if exaggerate:
self._P /= self.early_exagg
cleanup()
def run(self, x=None, p=None):
p, self.neighbors_nn = _prepare_data(
x, p, self.perplexity, self.initial_dims, compute_neighbors_nn=True)
self._P = np.copy(p).astype(np.float32)
self.n_in = p.shape[0]
self.cm = CopyMask(self.n_in)
self.obj_kwargs = {'P': self._P,
'neighbors': self.neighbors_nn,
'degrees_of_freedom': 1.0, # XXX
'n_components': self.ndims,
'angle': self.angle,
'skip_num_points': 0, # XXX
'verbose': self.verbose,
'copy_mask': self.cm}
self.autocopy = False
self.y = self._initial_solution(
self.n_in, self.init_method).astype(np.float32)
logger.info('Optimizer: %s' % type(self.optimizer))
self.run_schedule(self.train_schedule)
return self.y
def initialize_copies_seq(self, y, cm_update):
'''Initialize copies sequentially.
`cm_update` holds the indices of the points to be initialized,
ordered by importance; a line search is attempted for each of
them in turn, from the first to the last.
'''
p = self._P
n_in = p.shape[0]
n_out_prev = y.shape[0]
cm_update_proto = self.cm.proto_only(cm_update)
# Precompute stuff about y
std = np.std(y, axis=0)
nsteps = 50
step_size = 5.0 * std / nsteps
costs = np.zeros(nsteps)
yy = y
# For each pt in copy_mask:
sys.stdout.write('[c-SNE] Initializing copies ') ; sys.stdout.flush()
tmp_fcm = np.copy(self.cm.full_copy_mask)
for ind, (copy, proto) in enumerate(zip(cm_update, cm_update_proto)):
# Add point to y
y_new = np.copy(y[copy])
yy = np.vstack([yy, y_new])
# Perform line search
tmp_fcm = np.append(tmp_fcm, proto)
next_repr_pos_idx, num_reprs = \
full_copy_mask__to__next_repr_pos_idx(tmp_fcm, n_in)
# Pick the appropriate gradients
g = self.copy_potential_grads[copy]
# Normalize the gradients
g /= np.sqrt(np.sum(g ** 2))
for i in range(nsteps):
yy[-1] += g * step_size
costs[i] = _barnes_hut_mptsne.estimate_error(
p, yy, self.neighbors_nn, num_reprs, next_repr_pos_idx,
self.sum_Q, self.ndims, verbose=0)
step = np.argmin(costs, axis=0)
if 1 < step + 1 < nsteps:
sys.stdout.write('.') ; sys.stdout.flush()
yy[-1] = y[copy] + g * (step + 1) * step_size
else:
sys.stdout.write('x') ; sys.stdout.flush()
yy = yy[:-1]
tmp_fcm = tmp_fcm[:-1]
print('')
return (yy[n_out_prev:], tmp_fcm[n_out_prev:])
def _kl_divergence_bh(self, Y, P, neighbors, degrees_of_freedom,
n_components, angle=0.5, skip_num_points=0,
verbose=False, copy_mask=None):
if not np.all(np.isfinite(Y)):
print(np.where(np.invert(np.isfinite(Y))))
assert np.all(np.isfinite(Y)), 'Non-finite elements in Y'
Y = Y.reshape(Y.shape[0] // n_components, n_components)
if copy_mask is None or copy_mask.num_copies == 0:
grad = np.zeros(Y.shape, dtype=np.float32)
error = _barnes_hut_tsne.gradient(
P, Y, neighbors, grad, angle, n_components, verbose=False,
dof=degrees_of_freedom, skip_num_points=skip_num_points)
else:
# Prepare the 'num_reprs' list
# Prepare the 'next_repr_pos_idx' list
next_repr_pos_idx, num_reprs = full_copy_mask__to__next_repr_pos_idx(
copy_mask.full_copy_mask, P.shape[0])
grad = np.zeros(Y.shape, dtype=np.float32)
potential_grads = np.zeros(Y.shape, dtype=np.float32)
sum_Q = np.zeros(1, dtype=np.float32)
error = _barnes_hut_mptsne.gradient_mptsne(
P, Y, neighbors, grad, potential_grads, num_reprs,
next_repr_pos_idx, angle, n_components, verbose=False,
correct_cell_counts=True, dof=degrees_of_freedom,
skip_num_points=skip_num_points, store_sum_Q=sum_Q)
self.potential_grads = potential_grads
self.sum_Q = sum_Q[0]
c = 2.0 * (degrees_of_freedom + 1.0) / degrees_of_freedom
grad = grad.ravel()
grad *= c
return error, grad
| bsd-3-clause |
leferrad/learninspy | test/test_metrics.py | 1 | 3016 | #!/usr/bin/env python
# -*- coding: utf-8 -*-
__author__ = 'leferrad'
# For Travis CI compatibility on plots
import matplotlib
matplotlib.use('agg')
from learninspy.utils.evaluation import ClassificationMetrics, RegressionMetrics
from learninspy.utils.fileio import get_logger
from learninspy.utils.plots import plot_confusion_matrix
import numpy as np
logger = get_logger(name=__name__)
def test_classification_metrics():
logger.info("Testeando métricas de evaluación para clasificación...")
# Ejemplos para testear evaluación sobre 3 clases
predict = [0, 1, 0, 2, 2, 1]
labels = [0, 1, 1, 2, 1, 0]
metrics = ClassificationMetrics(zip(predict, labels), 3)
# Overall metrics
assert metrics.accuracy() == 0.5
assert (metrics.confusion_matrix() == np.array([[1, 1, 0], [1, 1, 1], [0, 0, 1]])).all()
# Per-label metrics
assert metrics.precision(label=0) == metrics.precision(label=1) == metrics.precision(label=2) == 0.5
assert (metrics.recall(label=0) == 0.5 and
metrics.recall(label=1) == 0.3333333333333333 and
metrics.recall(label=2) == 1.0)
# Micro and macro
assert metrics.precision(macro=True) == metrics.precision(macro=False) == 0.5
assert (metrics.recall(macro=True) == 0.611111111111111 and
metrics.recall(macro=False) == 0.5)
# F-measure for varying beta
assert metrics.f_measure(beta=1) == 0.5499999999999999  # F1 score: equal weighting
assert metrics.f_measure(beta=0.5) == 0.5188679245283019  # F0.5 score: favors precision over recall
assert metrics.f_measure(beta=2) == 0.5851063829787233  # F2 score: favors recall over precision
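The beta trade-off asserted above follows directly from the F-beta definition. A minimal, self-contained sketch (illustrative only; it hard-codes the macro-averaged precision and recall from this example instead of calling learninspy, under the assumption that `f_measure` is macro-averaged):

```python
def f_beta(p, r, beta):
    # F-beta = (1 + beta^2) * P * R / (beta^2 * P + R)
    return (1.0 + beta ** 2) * p * r / (beta ** 2 * p + r)

# Macro-averaged precision/recall from the assertions above.
p, r = 0.5, 0.611111111111111
assert abs(f_beta(p, r, 1.0) - 0.5499999999999999) < 1e-9  # F1
assert abs(f_beta(p, r, 0.5) - 0.5188679245283019) < 1e-9  # F0.5
assert abs(f_beta(p, r, 2.0) - 0.5851063829787233) < 1e-9  # F2
```

Raising beta above 1 shifts weight toward recall; lowering it below 1 shifts weight toward precision, which is exactly the pattern in the three assertions.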
logger.info("OK")
def test_plotting():
# Test of plotting confusion matrix
logger.info("Testeando plot de matriz de confusion...")
predict = [0, 1, 0, 2, 2, 1]
labels = [0, 1, 1, 2, 1, 0]
metrics = ClassificationMetrics(zip(predict, labels), 3)
plot_confusion_matrix(metrics.confusion_matrix(), show=False)
logger.info("OK")
return
def test_regression_metrics():
logger.info("Testeando métricas de evaluación para regresión...")
# Ejemplos para testear regresión
dim = 100 # Tamaño del muestreo
x = np.linspace(-4, 4, dim) # Eje x
clean = np.sinc(x) # Función sinc (labels)
np.random.seed(123)
noise = np.random.uniform(0, 0.1, dim) # Ruido para ensuciar la sinc (error en la predicción)
signal = clean + noise # Señal resultante del ruido aditivo (predict)
metrics = RegressionMetrics(zip(signal, clean))
assert np.allclose(metrics.r2(), 0.9708194315829859, rtol=1e-4)
assert np.allclose(metrics.explained_variance(), 0.9943620888461356, rtol=1e-4)
assert np.allclose(metrics.mse(), 0.0031164269743473839, rtol=1e-4)
assert np.allclose(metrics.rmse(), 0.0558249673027, rtol=1e-4)
assert np.allclose(metrics.mae(), 0.0501428880, rtol=1e-4)
assert np.allclose(metrics.rmae(), 0.2239260770, rtol=1e-4)
logger.info("OK")
| isc |
mne-tools/mne-tools.github.io | 0.18/_downloads/84edbf21b0a4d2c809f9a980df68abb5/plot_define_target_events.py | 29 | 3376 | """
============================================================
Define target events based on time lag, plot evoked response
============================================================
This script shows how to define higher order events based on
time lag between reference and target events. For
illustration, we will split presented face stimuli into two
classes, that is 1) those followed by an early button press
(within 590 milliseconds) and 2) those followed by a late button
press (later than 590 milliseconds). Finally, we will
visualize the evoked responses to both 'quickly-processed'
and 'slowly-processed' face stimuli.
"""
# Authors: Denis Engemann <denis.engemann@gmail.com>
#
# License: BSD (3-clause)
import mne
from mne import io
from mne.event import define_target_events
from mne.datasets import sample
import matplotlib.pyplot as plt
print(__doc__)
data_path = sample.data_path()
###############################################################################
# Set parameters
raw_fname = data_path + '/MEG/sample/sample_audvis_filt-0-40_raw.fif'
event_fname = data_path + '/MEG/sample/sample_audvis_filt-0-40_raw-eve.fif'
# Setup for reading the raw data
raw = io.read_raw_fif(raw_fname)
events = mne.read_events(event_fname)
# Set up pick list: EEG + STI 014 - bad channels (modify to your needs)
include = [] # or stim channels ['STI 014']
raw.info['bads'] += ['EEG 053'] # bads
# pick MEG channels
picks = mne.pick_types(raw.info, meg='mag', eeg=False, stim=False, eog=True,
include=include, exclude='bads')
###############################################################################
# Find stimulus event followed by quick button presses
reference_id = 5 # presentation of a smiley face
target_id = 32 # button press
sfreq = raw.info['sfreq'] # sampling rate
tmin = 0.1 # trials leading to very early responses will be rejected
tmax = 0.59 # ignore face stimuli followed by button press later than 590 ms
new_id = 42 # the new event id for a hit. If None, reference_id is used.
fill_na = 99 # the fill value for misses
events_, lag = define_target_events(events, reference_id, target_id,
sfreq, tmin, tmax, new_id, fill_na)
print(events_) # The 99 indicates missing or too late button presses
# Besides the events, the lag between target and reference is also returned;
# it could, e.g., be used as a parametric regressor in subsequent analyses.
print(lag[lag != fill_na]) # lag in milliseconds
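As a sketch of that idea (illustrative only; `lag_design_matrix` is a hypothetical helper, not part of MNE), the valid lags could be z-scored and paired with an intercept column:

```python
import numpy as np

def lag_design_matrix(lag, fill_na):
    """Build an [intercept, z-scored lag] design matrix from valid trials."""
    valid_lag = lag[lag != fill_na]  # drop misses / too-late presses
    z_lag = (valid_lag - valid_lag.mean()) / valid_lag.std()
    return np.column_stack([np.ones_like(z_lag), z_lag])
```

For example, `lag_design_matrix(lag, fill_na)` would produce a two-column predictor matrix from the events computed above.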
# #############################################################################
# Construct epochs
tmin_ = -0.2
tmax_ = 0.4
event_id = dict(early=new_id, late=fill_na)
epochs = mne.Epochs(raw, events_, event_id, tmin_,
tmax_, picks=picks, baseline=(None, 0),
reject=dict(mag=4e-12))
# average epochs and get an Evoked dataset.
early, late = [epochs[k].average() for k in event_id]
###############################################################################
# View evoked response
times = 1e3 * epochs.times # time in milliseconds
title = 'Evoked response followed by %s button press'
fig, axes = plt.subplots(2, 1)
early.plot(axes=axes[0], time_unit='s')
axes[0].set(title=title % 'early', ylabel='Evoked field (fT)')
late.plot(axes=axes[1], time_unit='s')
axes[1].set(title=title % 'late', ylabel='Evoked field (fT)')
plt.show()
| bsd-3-clause |
raghavrv/scikit-learn | sklearn/datasets/tests/test_svmlight_format.py | 9 | 17289 | from __future__ import division
from bz2 import BZ2File
import gzip
from io import BytesIO
import numpy as np
import scipy.sparse as sp
import os
import shutil
from tempfile import NamedTemporaryFile
from sklearn.externals.six import b
from sklearn.utils.testing import assert_equal
from sklearn.utils.testing import assert_array_equal
from sklearn.utils.testing import assert_array_almost_equal
from sklearn.utils.testing import assert_raises
from sklearn.utils.testing import assert_raises_regex
from sklearn.utils.testing import raises
from sklearn.utils.testing import assert_in
from sklearn.utils.fixes import sp_version
import sklearn
from sklearn.datasets import (load_svmlight_file, load_svmlight_files,
dump_svmlight_file)
currdir = os.path.dirname(os.path.abspath(__file__))
datafile = os.path.join(currdir, "data", "svmlight_classification.txt")
multifile = os.path.join(currdir, "data", "svmlight_multilabel.txt")
invalidfile = os.path.join(currdir, "data", "svmlight_invalid.txt")
invalidfile2 = os.path.join(currdir, "data", "svmlight_invalid_order.txt")
def test_load_svmlight_file():
X, y = load_svmlight_file(datafile)
# test X's shape
assert_equal(X.indptr.shape[0], 7)
assert_equal(X.shape[0], 6)
assert_equal(X.shape[1], 21)
assert_equal(y.shape[0], 6)
# test X's non-zero values
for i, j, val in ((0, 2, 2.5), (0, 10, -5.2), (0, 15, 1.5),
(1, 5, 1.0), (1, 12, -3),
(2, 20, 27)):
assert_equal(X[i, j], val)
# tests X's zero values
assert_equal(X[0, 3], 0)
assert_equal(X[0, 5], 0)
assert_equal(X[1, 8], 0)
assert_equal(X[1, 16], 0)
assert_equal(X[2, 18], 0)
# test can change X's values
X[0, 2] *= 2
assert_equal(X[0, 2], 5)
# test y
assert_array_equal(y, [1, 2, 3, 4, 1, 2])
def test_load_svmlight_file_fd():
# test loading from file descriptor
X1, y1 = load_svmlight_file(datafile)
fd = os.open(datafile, os.O_RDONLY)
try:
X2, y2 = load_svmlight_file(fd)
assert_array_equal(X1.data, X2.data)
assert_array_equal(y1, y2)
finally:
os.close(fd)
def test_load_svmlight_file_multilabel():
X, y = load_svmlight_file(multifile, multilabel=True)
assert_equal(y, [(0, 1), (2,), (), (1, 2)])
def test_load_svmlight_files():
X_train, y_train, X_test, y_test = load_svmlight_files([datafile] * 2,
dtype=np.float32)
assert_array_equal(X_train.toarray(), X_test.toarray())
assert_array_equal(y_train, y_test)
assert_equal(X_train.dtype, np.float32)
assert_equal(X_test.dtype, np.float32)
X1, y1, X2, y2, X3, y3 = load_svmlight_files([datafile] * 3,
dtype=np.float64)
assert_equal(X1.dtype, X2.dtype)
assert_equal(X2.dtype, X3.dtype)
assert_equal(X3.dtype, np.float64)
def test_load_svmlight_file_n_features():
X, y = load_svmlight_file(datafile, n_features=22)
# test X's shape
assert_equal(X.indptr.shape[0], 7)
assert_equal(X.shape[0], 6)
assert_equal(X.shape[1], 22)
# test X's non-zero values
for i, j, val in ((0, 2, 2.5), (0, 10, -5.2),
(1, 5, 1.0), (1, 12, -3)):
assert_equal(X[i, j], val)
# 21 features in file
assert_raises(ValueError, load_svmlight_file, datafile, n_features=20)
def test_load_compressed():
X, y = load_svmlight_file(datafile)
with NamedTemporaryFile(prefix="sklearn-test", suffix=".gz") as tmp:
tmp.close() # necessary under windows
with open(datafile, "rb") as f:
shutil.copyfileobj(f, gzip.open(tmp.name, "wb"))
Xgz, ygz = load_svmlight_file(tmp.name)
# because we "close" it manually and write to it,
# we need to remove it manually.
os.remove(tmp.name)
assert_array_equal(X.toarray(), Xgz.toarray())
assert_array_equal(y, ygz)
with NamedTemporaryFile(prefix="sklearn-test", suffix=".bz2") as tmp:
tmp.close() # necessary under windows
with open(datafile, "rb") as f:
shutil.copyfileobj(f, BZ2File(tmp.name, "wb"))
Xbz, ybz = load_svmlight_file(tmp.name)
# because we "close" it manually and write to it,
# we need to remove it manually.
os.remove(tmp.name)
assert_array_equal(X.toarray(), Xbz.toarray())
assert_array_equal(y, ybz)
@raises(ValueError)
def test_load_invalid_file():
load_svmlight_file(invalidfile)
@raises(ValueError)
def test_load_invalid_order_file():
load_svmlight_file(invalidfile2)
@raises(ValueError)
def test_load_zero_based():
f = BytesIO(b("-1 4:1.\n1 0:1\n"))
load_svmlight_file(f, zero_based=False)
def test_load_zero_based_auto():
data1 = b("-1 1:1 2:2 3:3\n")
data2 = b("-1 0:0 1:1\n")
f1 = BytesIO(data1)
X, y = load_svmlight_file(f1, zero_based="auto")
assert_equal(X.shape, (1, 3))
f1 = BytesIO(data1)
f2 = BytesIO(data2)
X1, y1, X2, y2 = load_svmlight_files([f1, f2], zero_based="auto")
assert_equal(X1.shape, (1, 4))
assert_equal(X2.shape, (1, 4))
def test_load_with_qid():
# load svmfile with qid attribute
data = b("""
3 qid:1 1:0.53 2:0.12
2 qid:1 1:0.13 2:0.1
7 qid:2 1:0.87 2:0.12""")
X, y = load_svmlight_file(BytesIO(data), query_id=False)
assert_array_equal(y, [3, 2, 7])
assert_array_equal(X.toarray(), [[.53, .12], [.13, .1], [.87, .12]])
res1 = load_svmlight_files([BytesIO(data)], query_id=True)
res2 = load_svmlight_file(BytesIO(data), query_id=True)
for X, y, qid in (res1, res2):
assert_array_equal(y, [3, 2, 7])
assert_array_equal(qid, [1, 1, 2])
assert_array_equal(X.toarray(), [[.53, .12], [.13, .1], [.87, .12]])
@raises(ValueError)
def test_load_invalid_file2():
load_svmlight_files([datafile, invalidfile, datafile])
@raises(TypeError)
def test_not_a_filename():
# in python 3 integers are valid file opening arguments (taken as unix
# file descriptors)
load_svmlight_file(.42)
@raises(IOError)
def test_invalid_filename():
load_svmlight_file("trou pic nic douille")
def test_dump():
X_sparse, y_dense = load_svmlight_file(datafile)
X_dense = X_sparse.toarray()
y_sparse = sp.csr_matrix(y_dense)
# slicing a csr_matrix can unsort its .indices, so test that we sort
# those correctly
X_sliced = X_sparse[np.arange(X_sparse.shape[0])]
y_sliced = y_sparse[np.arange(y_sparse.shape[0])]
for X in (X_sparse, X_dense, X_sliced):
for y in (y_sparse, y_dense, y_sliced):
for zero_based in (True, False):
for dtype in [np.float32, np.float64, np.int32]:
f = BytesIO()
# we need to pass a comment to get the version info in;
# LibSVM doesn't grok comments so they're not put in by
# default anymore.
if (sp.issparse(y) and y.shape[0] == 1):
# make sure y's shape is: (n_samples, n_labels)
# when it is sparse
y = y.T
dump_svmlight_file(X.astype(dtype), y, f, comment="test",
zero_based=zero_based)
f.seek(0)
comment = f.readline()
try:
comment = str(comment, "utf-8")
except TypeError: # fails in Python 2.x
pass
assert_in("scikit-learn %s" % sklearn.__version__, comment)
comment = f.readline()
try:
comment = str(comment, "utf-8")
except TypeError: # fails in Python 2.x
pass
assert_in(["one", "zero"][zero_based] + "-based", comment)
X2, y2 = load_svmlight_file(f, dtype=dtype,
zero_based=zero_based)
assert_equal(X2.dtype, dtype)
assert_array_equal(X2.sorted_indices().indices, X2.indices)
X2_dense = X2.toarray()
if dtype == np.float32:
# allow a rounding error at the last decimal place
assert_array_almost_equal(
X_dense.astype(dtype), X2_dense, 4)
assert_array_almost_equal(
y_dense.astype(dtype), y2, 4)
else:
# allow a rounding error at the last decimal place
assert_array_almost_equal(
X_dense.astype(dtype), X2_dense, 15)
assert_array_almost_equal(
y_dense.astype(dtype), y2, 15)
def test_dump_multilabel():
X = [[1, 0, 3, 0, 5],
[0, 0, 0, 0, 0],
[0, 5, 0, 1, 0]]
y_dense = [[0, 1, 0], [1, 0, 1], [1, 1, 0]]
y_sparse = sp.csr_matrix(y_dense)
for y in [y_dense, y_sparse]:
f = BytesIO()
dump_svmlight_file(X, y, f, multilabel=True)
f.seek(0)
# make sure it dumps multilabel correctly
assert_equal(f.readline(), b("1 0:1 2:3 4:5\n"))
assert_equal(f.readline(), b("0,2 \n"))
assert_equal(f.readline(), b("0,1 1:5 3:1\n"))
def test_dump_concise():
one = 1
two = 2.1
three = 3.01
exact = 1.000000000000001
# loses the last decimal place
almost = 1.0000000000000001
X = [[one, two, three, exact, almost],
[1e9, 2e18, 3e27, 0, 0],
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0]]
y = [one, two, three, exact, almost]
f = BytesIO()
dump_svmlight_file(X, y, f)
f.seek(0)
# make sure it's using the most concise format possible
assert_equal(f.readline(),
b("1 0:1 1:2.1 2:3.01 3:1.000000000000001 4:1\n"))
assert_equal(f.readline(), b("2.1 0:1000000000 1:2e+18 2:3e+27\n"))
assert_equal(f.readline(), b("3.01 \n"))
assert_equal(f.readline(), b("1.000000000000001 \n"))
assert_equal(f.readline(), b("1 \n"))
f.seek(0)
# make sure it's correct too :)
X2, y2 = load_svmlight_file(f)
assert_array_almost_equal(X, X2.toarray())
assert_array_equal(y, y2)
def test_dump_comment():
X, y = load_svmlight_file(datafile)
X = X.toarray()
f = BytesIO()
ascii_comment = "This is a comment\nspanning multiple lines."
dump_svmlight_file(X, y, f, comment=ascii_comment, zero_based=False)
f.seek(0)
X2, y2 = load_svmlight_file(f, zero_based=False)
assert_array_almost_equal(X, X2.toarray())
assert_array_equal(y, y2)
# XXX we have to update this to support Python 3.x
utf8_comment = b("It is true that\n\xc2\xbd\xc2\xb2 = \xc2\xbc")
f = BytesIO()
assert_raises(UnicodeDecodeError,
dump_svmlight_file, X, y, f, comment=utf8_comment)
unicode_comment = utf8_comment.decode("utf-8")
f = BytesIO()
dump_svmlight_file(X, y, f, comment=unicode_comment, zero_based=False)
f.seek(0)
X2, y2 = load_svmlight_file(f, zero_based=False)
assert_array_almost_equal(X, X2.toarray())
assert_array_equal(y, y2)
f = BytesIO()
assert_raises(ValueError,
dump_svmlight_file, X, y, f, comment="I've got a \0.")
def test_dump_invalid():
X, y = load_svmlight_file(datafile)
f = BytesIO()
y2d = [y]
assert_raises(ValueError, dump_svmlight_file, X, y2d, f)
f = BytesIO()
assert_raises(ValueError, dump_svmlight_file, X, y[:-1], f)
def test_dump_query_id():
# test dumping a file with query_id
X, y = load_svmlight_file(datafile)
X = X.toarray()
query_id = np.arange(X.shape[0]) // 2
f = BytesIO()
dump_svmlight_file(X, y, f, query_id=query_id, zero_based=True)
f.seek(0)
X1, y1, query_id1 = load_svmlight_file(f, query_id=True, zero_based=True)
assert_array_almost_equal(X, X1.toarray())
assert_array_almost_equal(y, y1)
assert_array_almost_equal(query_id, query_id1)
def test_load_with_long_qid():
# load svmfile with longint qid attribute
data = b("""
1 qid:0 0:1 1:2 2:3
0 qid:72048431380967004 0:1440446648 1:72048431380967004 2:236784985
0 qid:-9223372036854775807 0:1440446648 1:72048431380967004 2:236784985
3 qid:9223372036854775807 0:1440446648 1:72048431380967004 2:236784985""")
X, y, qid = load_svmlight_file(BytesIO(data), query_id=True)
true_X = [[1, 2, 3],
[1440446648, 72048431380967004, 236784985],
[1440446648, 72048431380967004, 236784985],
[1440446648, 72048431380967004, 236784985]]
true_y = [1, 0, 0, 3]
trueQID = [0, 72048431380967004, -9223372036854775807, 9223372036854775807]
assert_array_equal(y, true_y)
assert_array_equal(X.toarray(), true_X)
assert_array_equal(qid, trueQID)
f = BytesIO()
dump_svmlight_file(X, y, f, query_id=qid, zero_based=True)
f.seek(0)
X, y, qid = load_svmlight_file(f, query_id=True, zero_based=True)
assert_array_equal(y, true_y)
assert_array_equal(X.toarray(), true_X)
assert_array_equal(qid, trueQID)
f.seek(0)
X, y = load_svmlight_file(f, query_id=False, zero_based=True)
assert_array_equal(y, true_y)
assert_array_equal(X.toarray(), true_X)
def test_load_zeros():
f = BytesIO()
true_X = sp.csr_matrix(np.zeros(shape=(3, 4)))
true_y = np.array([0, 1, 0])
dump_svmlight_file(true_X, true_y, f)
for zero_based in ['auto', True, False]:
f.seek(0)
X, y = load_svmlight_file(f, n_features=4, zero_based=zero_based)
assert_array_equal(y, true_y)
assert_array_equal(X.toarray(), true_X.toarray())
def test_load_with_offsets():
def check_load_with_offsets(sparsity, n_samples, n_features):
rng = np.random.RandomState(0)
X = rng.uniform(low=0.0, high=1.0, size=(n_samples, n_features))
if sparsity:
X[X < sparsity] = 0.0
X = sp.csr_matrix(X)
y = rng.randint(low=0, high=2, size=n_samples)
f = BytesIO()
dump_svmlight_file(X, y, f)
f.seek(0)
size = len(f.getvalue())
# choose some byte marks that are likely to fall in the middle of a row
mark_0 = 0
mark_1 = size // 3
length_0 = mark_1 - mark_0
mark_2 = 4 * size // 5
length_1 = mark_2 - mark_1
# load the original sparse matrix into 3 independent CSR matrices
X_0, y_0 = load_svmlight_file(f, n_features=n_features,
offset=mark_0, length=length_0)
X_1, y_1 = load_svmlight_file(f, n_features=n_features,
offset=mark_1, length=length_1)
X_2, y_2 = load_svmlight_file(f, n_features=n_features,
offset=mark_2)
y_concat = np.concatenate([y_0, y_1, y_2])
X_concat = sp.vstack([X_0, X_1, X_2])
assert_array_equal(y, y_concat)
assert_array_almost_equal(X.toarray(), X_concat.toarray())
# Generate a uniformly random sparse matrix
for sparsity in [0, 0.1, .5, 0.99, 1]:
for n_samples in [13, 101]:
for n_features in [2, 7, 41]:
yield check_load_with_offsets, sparsity, n_samples, n_features
def test_load_offset_exhaustive_splits():
rng = np.random.RandomState(0)
X = np.array([
[0, 0, 0, 0, 0, 0],
[1, 2, 3, 4, 0, 6],
[1, 2, 3, 4, 0, 6],
[0, 0, 0, 0, 0, 0],
[1, 0, 3, 0, 0, 0],
[0, 0, 0, 0, 0, 1],
[1, 0, 0, 0, 0, 0],
])
X = sp.csr_matrix(X)
n_samples, n_features = X.shape
y = rng.randint(low=0, high=2, size=n_samples)
query_id = np.arange(n_samples) // 2
f = BytesIO()
dump_svmlight_file(X, y, f, query_id=query_id)
f.seek(0)
size = len(f.getvalue())
# load the same data in 2 parts with all the possible byte offsets to
# locate the split, so as to test particular boundary cases
for mark in range(size):
if sp_version < (0, 14) and (mark == 0 or mark > size - 100):
# old scipy does not support sparse matrices with 0 rows.
continue
f.seek(0)
X_0, y_0, q_0 = load_svmlight_file(f, n_features=n_features,
query_id=True, offset=0,
length=mark)
X_1, y_1, q_1 = load_svmlight_file(f, n_features=n_features,
query_id=True, offset=mark,
length=-1)
q_concat = np.concatenate([q_0, q_1])
y_concat = np.concatenate([y_0, y_1])
X_concat = sp.vstack([X_0, X_1])
assert_array_equal(y, y_concat)
assert_array_equal(query_id, q_concat)
assert_array_almost_equal(X.toarray(), X_concat.toarray())
def test_load_with_offsets_error():
assert_raises_regex(ValueError, "n_features is required",
load_svmlight_file, datafile, offset=3, length=3)
| bsd-3-clause |
devanshdalal/scikit-learn | examples/gaussian_process/plot_gpr_co2.py | 131 | 5705 | """
========================================================
Gaussian process regression (GPR) on Mauna Loa CO2 data.
========================================================
This example is based on Section 5.4.3 of "Gaussian Processes for Machine
Learning" [RW2006]. It illustrates an example of complex kernel engineering and
hyperparameter optimization using gradient ascent on the
log-marginal-likelihood. The data consists of the monthly average atmospheric
CO2 concentrations (in parts per million by volume (ppmv)) collected at the
Mauna Loa Observatory in Hawaii, between 1958 and 1997. The objective is to
model the CO2 concentration as a function of the time t.
The kernel is composed of several terms that are responsible for explaining
different properties of the signal:
- a long term, smooth rising trend is to be explained by an RBF kernel. The
RBF kernel with a large length-scale enforces this component to be smooth;
it is not enforced that the trend is rising which leaves this choice to the
GP. The specific length-scale and the amplitude are free hyperparameters.
- a seasonal component, which is to be explained by the periodic
ExpSineSquared kernel with a fixed periodicity of 1 year. The length-scale
of this periodic component, controlling its smoothness, is a free parameter.
In order to allow decaying away from exact periodicity, the product with an
RBF kernel is taken. The length-scale of this RBF component controls the
decay time and is a further free parameter.
- smaller, medium term irregularities are to be explained by a
RationalQuadratic kernel component, whose length-scale and alpha parameter,
which determines the diffuseness of the length-scales, are to be determined.
According to [RW2006], these irregularities can better be explained by
a RationalQuadratic than an RBF kernel component, probably because it can
accommodate several length-scales.
- a "noise" term, consisting of an RBF kernel contribution, which shall
explain the correlated noise components such as local weather phenomena,
and a WhiteKernel contribution for the white noise. The relative amplitudes
and the RBF's length scale are further free parameters.
Maximizing the log-marginal-likelihood after subtracting the target's mean
yields the following kernel with an LML of -83.214::
34.4**2 * RBF(length_scale=41.8)
+ 3.27**2 * RBF(length_scale=180) * ExpSineSquared(length_scale=1.44,
periodicity=1)
+ 0.446**2 * RationalQuadratic(alpha=17.7, length_scale=0.957)
+ 0.197**2 * RBF(length_scale=0.138) + WhiteKernel(noise_level=0.0336)
Thus, most of the target signal (34.4ppm) is explained by a long-term rising
trend (length-scale 41.8 years). The periodic component has an amplitude of
3.27ppm, a decay time of 180 years and a length-scale of 1.44. The long decay
time indicates that we have a locally very close to periodic seasonal
component. The correlated noise has an amplitude of 0.197ppm with a length
scale of 0.138 years and a white-noise contribution of 0.197ppm. Thus, the
overall noise level is very small, indicating that the data can be very well
explained by the model. The figure shows also that the model makes very
confident predictions until around 2015.
"""
print(__doc__)
# Authors: Jan Hendrik Metzen <jhm@informatik.uni-bremen.de>
#
# License: BSD 3 clause
import numpy as np
from matplotlib import pyplot as plt
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels \
import RBF, WhiteKernel, RationalQuadratic, ExpSineSquared
from sklearn.datasets import fetch_mldata
data = fetch_mldata('mauna-loa-atmospheric-co2').data
X = data[:, [1]]
y = data[:, 0]
# Kernel with parameters given in GPML book
k1 = 66.0**2 * RBF(length_scale=67.0) # long term smooth rising trend
k2 = 2.4**2 * RBF(length_scale=90.0) \
* ExpSineSquared(length_scale=1.3, periodicity=1.0) # seasonal component
# medium term irregularity
k3 = 0.66**2 \
* RationalQuadratic(length_scale=1.2, alpha=0.78)
k4 = 0.18**2 * RBF(length_scale=0.134) \
+ WhiteKernel(noise_level=0.19**2) # noise terms
kernel_gpml = k1 + k2 + k3 + k4
gp = GaussianProcessRegressor(kernel=kernel_gpml, alpha=0,
optimizer=None, normalize_y=True)
gp.fit(X, y)
print("GPML kernel: %s" % gp.kernel_)
print("Log-marginal-likelihood: %.3f"
% gp.log_marginal_likelihood(gp.kernel_.theta))
# Kernel with optimized parameters
k1 = 50.0**2 * RBF(length_scale=50.0) # long term smooth rising trend
k2 = 2.0**2 * RBF(length_scale=100.0) \
* ExpSineSquared(length_scale=1.0, periodicity=1.0,
periodicity_bounds="fixed") # seasonal component
# medium term irregularities
k3 = 0.5**2 * RationalQuadratic(length_scale=1.0, alpha=1.0)
k4 = 0.1**2 * RBF(length_scale=0.1) \
+ WhiteKernel(noise_level=0.1**2,
noise_level_bounds=(1e-3, np.inf)) # noise terms
kernel = k1 + k2 + k3 + k4
gp = GaussianProcessRegressor(kernel=kernel, alpha=0,
normalize_y=True)
gp.fit(X, y)
print("\nLearned kernel: %s" % gp.kernel_)
print("Log-marginal-likelihood: %.3f"
% gp.log_marginal_likelihood(gp.kernel_.theta))
X_ = np.linspace(X.min(), X.max() + 30, 1000)[:, np.newaxis]
y_pred, y_std = gp.predict(X_, return_std=True)
# Illustration
plt.scatter(X, y, c='k')
plt.plot(X_, y_pred)
plt.fill_between(X_[:, 0], y_pred - y_std, y_pred + y_std,
alpha=0.5, color='k')
plt.xlim(X_.min(), X_.max())
plt.xlabel("Year")
plt.ylabel(r"CO$_2$ in ppm")
plt.title(r"Atmospheric CO$_2$ concentration at Mauna Loa")
plt.tight_layout()
plt.show()
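The log-marginal-likelihood that is maximized above can be evaluated directly from the kernel matrix. The following pure-NumPy sketch is an illustration only (the `rbf_kernel` and `log_marginal_likelihood` helpers and the synthetic sine data are made up here, not part of the original example); it computes the LML for an RBF kernel via a Cholesky factorization:

```python
import numpy as np

def rbf_kernel(x1, x2, amp=1.0, length=1.0):
    # squared-exponential kernel k(x, x') = amp**2 * exp(-(x - x')**2 / (2 * length**2))
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return amp ** 2 * np.exp(-0.5 * d2 / length ** 2)

def log_marginal_likelihood(x, y, length, noise=1e-2):
    # log p(y | x) = -1/2 y^T K^-1 y - 1/2 log|K| - n/2 log(2 pi),
    # with K = k(x, x) + noise * I, computed stably via a Cholesky factor
    K = rbf_kernel(x, x, length=length) + noise * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(x) * np.log(2 * np.pi))

x = np.linspace(0.0, 10.0, 30)
y = np.sin(x)
# a length-scale matched to the data explains it far better than a tiny one
lml_good = log_marginal_likelihood(x, y, length=1.0)
lml_bad = log_marginal_likelihood(x, y, length=0.05)
```

Comparing such values across candidate hyperparameters is what the optimizer inside `GaussianProcessRegressor` does when it fits the kernels above.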
| bsd-3-clause |
tapomayukh/projects_in_python | sandbox_tapo/src/skin_related/Cody_Data/time_varying_data_exploration.py | 1 | 5730 | #!/usr/bin/env python
import math, numpy as np
#from enthought.mayavi import mlab
import matplotlib.pyplot as pp
import matplotlib.cm as cm
import scipy.ndimage as ni
import roslib; roslib.load_manifest('sandbox_tapo_darpa_m3')
import rospy
import tf
#import hrl_lib.mayavi2_util as mu
import hrl_lib.viz as hv
import hrl_lib.util as ut
import hrl_lib.transforms as tr
import hrl_lib.matplotlib_util as mpu
import pickle
from visualization_msgs.msg import Marker
from visualization_msgs.msg import MarkerArray
from hrl_haptic_manipulation_in_clutter_msgs.msg import SkinContact
from hrl_haptic_manipulation_in_clutter_msgs.msg import TaxelArray
from m3skin_ros.msg import TaxelArray as TaxelArray_Meka
from hrl_msgs.msg import FloatArrayBare
from geometry_msgs.msg import Point
from geometry_msgs.msg import Vector3
def callback(data, callback_args):
global Contact_FLAG
rospy.loginfo('Getting data!')
tf_lstnr = callback_args
sc = SkinContact()
sc.header.frame_id = '/torso_lift_link' # has to be this and no other coord frame.
sc.header.stamp = data.header.stamp
t1, q1 = tf_lstnr.lookupTransform(sc.header.frame_id,
data.header.frame_id,
rospy.Time(0))
t1 = np.matrix(t1).reshape(3,1)
r1 = tr.quaternion_to_matrix(q1)
print np.shape(t1)
print np.shape(r1)
force_vectors = np.row_stack([data.values_x, data.values_y, data.values_z])
contact_vectors = np.row_stack([data.centers_x, data.centers_y, data.centers_z])
fmags_instant = ut.norm(force_vectors)
print np.shape(contact_vectors)
threshold = 0.01
fmags_tuned = fmags_instant - threshold
fmags_tuned[np.where(fmags_tuned<0)]=0
# Calculating no. of contact regions with hand-tuned force threshold
contact_regions = fmags_instant > threshold
    lb, ls = ni.label(contact_regions)  # ls is the number of connected regions
    total_contact = ni.sum(lb)  # sums the label image (rough proxy for contact extent)
if (total_contact < 2) and (Contact_FLAG == True):
tracking_point()
global trial_index
trial_index = trial_index + 1
savedata()
plotdata()
pp.show()
global time
time = 0.0
global time_varying_data
time_varying_data = [0,0,0,0,0,0]
global time_varying_data_tracker
time_varying_data_tracker = [0,0,0,0]
Contact_FLAG = False
if total_contact > 1:
# Calculating time:
global time
time = time + 0.01
# Calculating force data for contact with hand-tuned force threshold
#total_forces = ni.sum(fmags_instant,lb)
#mean_forces = ni.mean(fmags_instant,lb)
temp = fmags_tuned*lb
max_force = np.max(temp)
global contact_point_local, contact_point_world
local_x = ni.mean(contact_vectors[0,:],lb) # After thresholding, assuming one connected component (experiment designed that way)
local_y = ni.mean(contact_vectors[1,:],lb) # After thresholding, assuming one connected component (experiment designed that way)
local_z = ni.mean(contact_vectors[2,:],lb) # After thresholding, assuming one connected component (experiment designed that way)
contact_point_local = np.column_stack([local_x,local_y,local_z])
print np.shape(contact_point_local)
contact_point_world = r1*(contact_point_local.T) + t1
time_instant_data = [time,max_force,total_contact,contact_point_world[0],contact_point_world[1],contact_point_world[2]]
global time_varying_data
time_varying_data = np.row_stack([time_varying_data, time_instant_data])
Contact_FLAG = True
def tracking_point():
rospy.loginfo('Tracking Distance!')
global time_varying_data
ta = time_varying_data
k = 0
for i in ta[:,0]:
if i != ta[-1,0]:
instant_dist = math.sqrt((ta[k+1,3]-ta[1,3])**2 + (ta[k+1,4]-ta[1,4])**2 + (ta[k+1,5]-ta[1,5])**2)
time_instant_tracker = [ta[k+1,0], ta[k+1,1], ta[k+1,2], instant_dist]
global time_varying_data_tracker
time_varying_data_tracker = np.row_stack([time_varying_data_tracker, time_instant_tracker])
k=k+1
def savedata():
rospy.loginfo('Saving data!')
global trial_index
ut.save_pickle(time_varying_data_tracker, '/home/tapo/svn/robot1_data/usr/tapo/data/Exploration/Foliage/trial_' + np.str(trial_index) +'.pkl')
def plotdata():
rospy.loginfo('Plotting data!')
global trial_index
ta = ut.load_pickle('/home/tapo/svn/robot1_data/usr/tapo/data/Exploration/Foliage/trial_' + np.str(trial_index) +'.pkl')
mpu.figure(1)
pp.title('Time-Varying Force')
pp.xlabel('Time (s)')
pp.ylabel('Max Force')
pp.plot(ta[:,0], ta[:,1])
pp.grid('on')
mpu.figure(2)
pp.title('Time-Varying Contact')
pp.xlabel('Time (s)')
pp.ylabel('No. of Contact Regions')
pp.plot(ta[:,0], ta[:,2])
pp.grid('on')
mpu.figure(3)
pp.title('Point Tracker')
pp.xlabel('Time (s)')
pp.ylabel('Contact Point Distance')
pp.plot(ta[:,0], ta[:,3])
pp.grid('on')
def getdata():
rospy.init_node('getdata', anonymous=True)
tf_lstnr = tf.TransformListener()
rospy.Subscriber("/skin_patch_forearm_right/taxels/forces", TaxelArray_Meka, callback, callback_args = (tf_lstnr))
rospy.spin()
if __name__ == '__main__':
time = 0
Contact_FLAG = False
trial_index = 0
contact_point_local = [0,0,0]
contact_point_world = [0,0,0]
time_varying_data = [0,0,0,0,0,0]
time_varying_data_tracker = [0,0,0,0]
getdata()
tracking_point()
trial_index = trial_index + 1
savedata()
plotdata()
pp.show()
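The `ni.label` call in the callback above counts connected runs of above-threshold taxels. A minimal pure-NumPy sketch of the same idea for a 1-D taxel array (the `count_contact_regions` helper is hypothetical, for illustration only):

```python
import numpy as np

def count_contact_regions(fmags, threshold=0.01):
    # threshold the force magnitudes, then count maximal runs of True values,
    # mirroring what scipy.ndimage.label does on a 1-D array
    above = fmags > threshold
    # a region starts wherever `above` switches from False to True
    starts = above & ~np.r_[False, above[:-1]]
    return int(starts.sum())

forces = np.array([0.0, 0.5, 0.6, 0.0, 0.0, 0.2])
n_regions = count_contact_regions(forces)  # two separate contact patches
```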
| mit |
sarahgrogan/scikit-learn | sklearn/ensemble/weight_boosting.py | 71 | 40664 | """Weight Boosting
This module contains weight boosting estimators for both classification and
regression.
The module structure is the following:
- The ``BaseWeightBoosting`` base class implements a common ``fit`` method
for all the estimators in the module. Regression and classification
only differ from each other in the loss function that is optimized.
- ``AdaBoostClassifier`` implements adaptive boosting (AdaBoost-SAMME) for
classification problems.
- ``AdaBoostRegressor`` implements adaptive boosting (AdaBoost.R2) for
regression problems.
"""
# Authors: Noel Dawe <noel@dawe.me>
# Gilles Louppe <g.louppe@gmail.com>
# Hamzeh Alsalhi <ha258@cornell.edu>
# Arnaud Joly <arnaud.v.joly@gmail.com>
#
# Licence: BSD 3 clause
from abc import ABCMeta, abstractmethod
import numpy as np
from numpy.core.umath_tests import inner1d
from .base import BaseEnsemble
from ..base import ClassifierMixin, RegressorMixin
from ..externals import six
from ..externals.six.moves import zip
from ..externals.six.moves import xrange as range
from .forest import BaseForest
from ..tree import DecisionTreeClassifier, DecisionTreeRegressor
from ..tree.tree import BaseDecisionTree
from ..tree._tree import DTYPE
from ..utils import check_array, check_X_y, check_random_state
from ..metrics import accuracy_score, r2_score
from sklearn.utils.validation import has_fit_parameter, check_is_fitted
__all__ = [
'AdaBoostClassifier',
'AdaBoostRegressor',
]
class BaseWeightBoosting(six.with_metaclass(ABCMeta, BaseEnsemble)):
"""Base class for AdaBoost estimators.
Warning: This class should not be used directly. Use derived classes
instead.
"""
@abstractmethod
def __init__(self,
base_estimator=None,
n_estimators=50,
estimator_params=tuple(),
learning_rate=1.,
random_state=None):
super(BaseWeightBoosting, self).__init__(
base_estimator=base_estimator,
n_estimators=n_estimators,
estimator_params=estimator_params)
self.learning_rate = learning_rate
self.random_state = random_state
def fit(self, X, y, sample_weight=None):
"""Build a boosted classifier/regressor from the training set (X, y).
Parameters
----------
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. COO, DOK, and LIL are converted to CSR. The dtype is
            forced to DTYPE from tree._tree if the base estimator of this
            boosted ensemble is a tree or forest.
y : array-like of shape = [n_samples]
The target values (class labels in classification, real numbers in
regression).
sample_weight : array-like of shape = [n_samples], optional
Sample weights. If None, the sample weights are initialized to
1 / n_samples.
Returns
-------
self : object
Returns self.
"""
# Check parameters
if self.learning_rate <= 0:
raise ValueError("learning_rate must be greater than zero")
if (self.base_estimator is None or
isinstance(self.base_estimator, (BaseDecisionTree,
BaseForest))):
dtype = DTYPE
accept_sparse = 'csc'
else:
dtype = None
accept_sparse = ['csr', 'csc']
X, y = check_X_y(X, y, accept_sparse=accept_sparse, dtype=dtype)
if sample_weight is None:
# Initialize weights to 1 / n_samples
            sample_weight = np.empty(X.shape[0], dtype=np.float64)
sample_weight[:] = 1. / X.shape[0]
else:
# Normalize existing weights
sample_weight = sample_weight / sample_weight.sum(dtype=np.float64)
# Check that the sample weights sum is positive
if sample_weight.sum() <= 0:
raise ValueError(
"Attempting to fit with a non-positive "
"weighted number of samples.")
# Check parameters
self._validate_estimator()
# Clear any previous fit results
self.estimators_ = []
        self.estimator_weights_ = np.zeros(self.n_estimators, dtype=np.float64)
        self.estimator_errors_ = np.ones(self.n_estimators, dtype=np.float64)
for iboost in range(self.n_estimators):
# Boosting step
sample_weight, estimator_weight, estimator_error = self._boost(
iboost,
X, y,
sample_weight)
# Early termination
if sample_weight is None:
break
self.estimator_weights_[iboost] = estimator_weight
self.estimator_errors_[iboost] = estimator_error
# Stop if error is zero
if estimator_error == 0:
break
sample_weight_sum = np.sum(sample_weight)
# Stop if the sum of sample weights has become non-positive
if sample_weight_sum <= 0:
break
if iboost < self.n_estimators - 1:
# Normalize
sample_weight /= sample_weight_sum
return self
@abstractmethod
def _boost(self, iboost, X, y, sample_weight):
"""Implement a single boost.
        Warning: This method needs to be overridden by subclasses.
Parameters
----------
iboost : int
The index of the current boost iteration.
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. COO, DOK, and LIL are converted to CSR.
y : array-like of shape = [n_samples]
The target values (class labels).
sample_weight : array-like of shape = [n_samples]
The current sample weights.
Returns
-------
sample_weight : array-like of shape = [n_samples] or None
The reweighted sample weights.
If None then boosting has terminated early.
estimator_weight : float
The weight for the current boost.
If None then boosting has terminated early.
error : float
The classification error for the current boost.
If None then boosting has terminated early.
"""
pass
def staged_score(self, X, y, sample_weight=None):
"""Return staged scores for X, y.
This generator method yields the ensemble score after each iteration of
boosting and therefore allows monitoring, such as to determine the
score on a test set after each boost.
Parameters
----------
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. DOK and LIL are converted to CSR.
y : array-like, shape = [n_samples]
Labels for X.
sample_weight : array-like, shape = [n_samples], optional
Sample weights.
Returns
-------
z : float
"""
for y_pred in self.staged_predict(X):
if isinstance(self, ClassifierMixin):
yield accuracy_score(y, y_pred, sample_weight=sample_weight)
else:
yield r2_score(y, y_pred, sample_weight=sample_weight)
@property
def feature_importances_(self):
"""Return the feature importances (the higher, the more important the
feature).
Returns
-------
feature_importances_ : array, shape = [n_features]
"""
if self.estimators_ is None or len(self.estimators_) == 0:
raise ValueError("Estimator not fitted, "
"call `fit` before `feature_importances_`.")
try:
norm = self.estimator_weights_.sum()
return (sum(weight * clf.feature_importances_ for weight, clf
in zip(self.estimator_weights_, self.estimators_))
/ norm)
except AttributeError:
raise AttributeError(
"Unable to compute feature importances "
"since base_estimator does not have a "
"feature_importances_ attribute")
def _validate_X_predict(self, X):
"""Ensure that X is in the proper format"""
if (self.base_estimator is None or
isinstance(self.base_estimator,
(BaseDecisionTree, BaseForest))):
X = check_array(X, accept_sparse='csr', dtype=DTYPE)
else:
X = check_array(X, accept_sparse=['csr', 'csc', 'coo'])
return X
def _samme_proba(estimator, n_classes, X):
"""Calculate algorithm 4, step 2, equation c) of Zhu et al [1].
References
----------
.. [1] J. Zhu, H. Zou, S. Rosset, T. Hastie, "Multi-class AdaBoost", 2009.
"""
proba = estimator.predict_proba(X)
# Displace zero probabilities so the log is defined.
# Also fix negative elements which may occur with
# negative sample weights.
proba[proba < np.finfo(proba.dtype).eps] = np.finfo(proba.dtype).eps
log_proba = np.log(proba)
return (n_classes - 1) * (log_proba - (1. / n_classes)
* log_proba.sum(axis=1)[:, np.newaxis])
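The quantity `_samme_proba` returns is, per sample, `(K - 1)` times the centered log-probabilities, so each row sums to zero and the largest score goes to the most probable class. A small standalone NumPy check (the `samme_r_scores` helper is illustrative, not part of the module):

```python
import numpy as np

def samme_r_scores(proba, n_classes):
    # (K - 1) * (log p_k - (1/K) * sum_k log p_k), as in _samme_proba;
    # clipping keeps the log defined for zero probabilities
    proba = np.clip(proba, np.finfo(float).eps, None)
    log_p = np.log(proba)
    return (n_classes - 1) * (log_p - log_p.mean(axis=1, keepdims=True))

scores = samme_r_scores(np.array([[0.7, 0.2, 0.1]]), n_classes=3)
```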
class AdaBoostClassifier(BaseWeightBoosting, ClassifierMixin):
"""An AdaBoost classifier.
An AdaBoost [1] classifier is a meta-estimator that begins by fitting a
classifier on the original dataset and then fits additional copies of the
classifier on the same dataset but where the weights of incorrectly
classified instances are adjusted such that subsequent classifiers focus
more on difficult cases.
This class implements the algorithm known as AdaBoost-SAMME [2].
Read more in the :ref:`User Guide <adaboost>`.
Parameters
----------
base_estimator : object, optional (default=DecisionTreeClassifier)
The base estimator from which the boosted ensemble is built.
Support for sample weighting is required, as well as proper `classes_`
and `n_classes_` attributes.
n_estimators : integer, optional (default=50)
The maximum number of estimators at which boosting is terminated.
In case of perfect fit, the learning procedure is stopped early.
learning_rate : float, optional (default=1.)
Learning rate shrinks the contribution of each classifier by
``learning_rate``. There is a trade-off between ``learning_rate`` and
``n_estimators``.
algorithm : {'SAMME', 'SAMME.R'}, optional (default='SAMME.R')
If 'SAMME.R' then use the SAMME.R real boosting algorithm.
``base_estimator`` must support calculation of class probabilities.
If 'SAMME' then use the SAMME discrete boosting algorithm.
The SAMME.R algorithm typically converges faster than SAMME,
achieving a lower test error with fewer boosting iterations.
random_state : int, RandomState instance or None, optional (default=None)
If int, random_state is the seed used by the random number generator;
If RandomState instance, random_state is the random number generator;
If None, the random number generator is the RandomState instance used
by `np.random`.
Attributes
----------
estimators_ : list of classifiers
The collection of fitted sub-estimators.
classes_ : array of shape = [n_classes]
The classes labels.
n_classes_ : int
The number of classes.
estimator_weights_ : array of floats
Weights for each estimator in the boosted ensemble.
estimator_errors_ : array of floats
Classification error for each estimator in the boosted
ensemble.
feature_importances_ : array of shape = [n_features]
The feature importances if supported by the ``base_estimator``.
See also
--------
AdaBoostRegressor, GradientBoostingClassifier, DecisionTreeClassifier
References
----------
.. [1] Y. Freund, R. Schapire, "A Decision-Theoretic Generalization of
on-Line Learning and an Application to Boosting", 1995.
.. [2] J. Zhu, H. Zou, S. Rosset, T. Hastie, "Multi-class AdaBoost", 2009.
"""
def __init__(self,
base_estimator=None,
n_estimators=50,
learning_rate=1.,
algorithm='SAMME.R',
random_state=None):
super(AdaBoostClassifier, self).__init__(
base_estimator=base_estimator,
n_estimators=n_estimators,
learning_rate=learning_rate,
random_state=random_state)
self.algorithm = algorithm
def fit(self, X, y, sample_weight=None):
"""Build a boosted classifier from the training set (X, y).
Parameters
----------
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. DOK and LIL are converted to CSR.
y : array-like of shape = [n_samples]
The target values (class labels).
sample_weight : array-like of shape = [n_samples], optional
Sample weights. If None, the sample weights are initialized to
``1 / n_samples``.
Returns
-------
self : object
Returns self.
"""
# Check that algorithm is supported
if self.algorithm not in ('SAMME', 'SAMME.R'):
raise ValueError("algorithm %s is not supported" % self.algorithm)
# Fit
return super(AdaBoostClassifier, self).fit(X, y, sample_weight)
def _validate_estimator(self):
"""Check the estimator and set the base_estimator_ attribute."""
super(AdaBoostClassifier, self)._validate_estimator(
default=DecisionTreeClassifier(max_depth=1))
# SAMME-R requires predict_proba-enabled base estimators
if self.algorithm == 'SAMME.R':
if not hasattr(self.base_estimator_, 'predict_proba'):
raise TypeError(
"AdaBoostClassifier with algorithm='SAMME.R' requires "
"that the weak learner supports the calculation of class "
"probabilities with a predict_proba method.\n"
"Please change the base estimator or set "
"algorithm='SAMME' instead.")
if not has_fit_parameter(self.base_estimator_, "sample_weight"):
raise ValueError("%s doesn't support sample_weight."
% self.base_estimator_.__class__.__name__)
def _boost(self, iboost, X, y, sample_weight):
"""Implement a single boost.
Perform a single boost according to the real multi-class SAMME.R
algorithm or to the discrete SAMME algorithm and return the updated
sample weights.
Parameters
----------
iboost : int
The index of the current boost iteration.
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. DOK and LIL are converted to CSR.
y : array-like of shape = [n_samples]
The target values (class labels).
sample_weight : array-like of shape = [n_samples]
The current sample weights.
Returns
-------
sample_weight : array-like of shape = [n_samples] or None
The reweighted sample weights.
If None then boosting has terminated early.
estimator_weight : float
The weight for the current boost.
If None then boosting has terminated early.
estimator_error : float
The classification error for the current boost.
If None then boosting has terminated early.
"""
if self.algorithm == 'SAMME.R':
return self._boost_real(iboost, X, y, sample_weight)
else: # elif self.algorithm == "SAMME":
return self._boost_discrete(iboost, X, y, sample_weight)
def _boost_real(self, iboost, X, y, sample_weight):
"""Implement a single boost using the SAMME.R real algorithm."""
estimator = self._make_estimator()
try:
estimator.set_params(random_state=self.random_state)
except ValueError:
pass
estimator.fit(X, y, sample_weight=sample_weight)
y_predict_proba = estimator.predict_proba(X)
if iboost == 0:
self.classes_ = getattr(estimator, 'classes_', None)
self.n_classes_ = len(self.classes_)
y_predict = self.classes_.take(np.argmax(y_predict_proba, axis=1),
axis=0)
# Instances incorrectly classified
incorrect = y_predict != y
# Error fraction
estimator_error = np.mean(
np.average(incorrect, weights=sample_weight, axis=0))
# Stop if classification is perfect
if estimator_error <= 0:
return sample_weight, 1., 0.
# Construct y coding as described in Zhu et al [2]:
#
# y_k = 1 if c == k else -1 / (K - 1)
#
# where K == n_classes_ and c, k in [0, K) are indices along the second
# axis of the y coding with c being the index corresponding to the true
# class label.
n_classes = self.n_classes_
classes = self.classes_
y_codes = np.array([-1. / (n_classes - 1), 1.])
y_coding = y_codes.take(classes == y[:, np.newaxis])
# Displace zero probabilities so the log is defined.
# Also fix negative elements which may occur with
# negative sample weights.
proba = y_predict_proba # alias for readability
proba[proba < np.finfo(proba.dtype).eps] = np.finfo(proba.dtype).eps
# Boost weight using multi-class AdaBoost SAMME.R alg
estimator_weight = (-1. * self.learning_rate
* (((n_classes - 1.) / n_classes) *
inner1d(y_coding, np.log(y_predict_proba))))
# Only boost the weights if it will fit again
        if iboost != self.n_estimators - 1:
# Only boost positive weights
sample_weight *= np.exp(estimator_weight *
((sample_weight > 0) |
(estimator_weight < 0)))
return sample_weight, 1., estimator_error
def _boost_discrete(self, iboost, X, y, sample_weight):
"""Implement a single boost using the SAMME discrete algorithm."""
estimator = self._make_estimator()
try:
estimator.set_params(random_state=self.random_state)
except ValueError:
pass
estimator.fit(X, y, sample_weight=sample_weight)
y_predict = estimator.predict(X)
if iboost == 0:
self.classes_ = getattr(estimator, 'classes_', None)
self.n_classes_ = len(self.classes_)
# Instances incorrectly classified
incorrect = y_predict != y
# Error fraction
estimator_error = np.mean(
np.average(incorrect, weights=sample_weight, axis=0))
# Stop if classification is perfect
if estimator_error <= 0:
return sample_weight, 1., 0.
n_classes = self.n_classes_
# Stop if the error is at least as bad as random guessing
if estimator_error >= 1. - (1. / n_classes):
self.estimators_.pop(-1)
if len(self.estimators_) == 0:
raise ValueError('BaseClassifier in AdaBoostClassifier '
'ensemble is worse than random, ensemble '
'can not be fit.')
return None, None, None
# Boost weight using multi-class AdaBoost SAMME alg
estimator_weight = self.learning_rate * (
np.log((1. - estimator_error) / estimator_error) +
np.log(n_classes - 1.))
        # Only boost the weights if it will fit again
        if iboost != self.n_estimators - 1:
# Only boost positive weights
sample_weight *= np.exp(estimator_weight * incorrect *
((sample_weight > 0) |
(estimator_weight < 0)))
return sample_weight, estimator_weight, estimator_error
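The discrete SAMME update performed by `_boost_discrete` (weighted error, then estimator weight, then multiplicative reweighting of the misclassified samples) can be sketched in isolation; the `samme_step` helper below is hypothetical, for illustration only:

```python
import numpy as np

def samme_step(y_true, y_pred, w, n_classes, learning_rate=1.0):
    # weighted error -> estimator weight -> reweighting, as in _boost_discrete
    incorrect = y_pred != y_true
    err = np.average(incorrect, weights=w)
    alpha = learning_rate * (np.log((1.0 - err) / err) + np.log(n_classes - 1.0))
    w = w * np.exp(alpha * incorrect)   # only the mistakes grow
    return w / w.sum(), alpha, err

y = np.array([0, 1, 1, 0])
pred = np.array([0, 1, 0, 0])          # one of four samples misclassified
w1, alpha, err = samme_step(y, pred, np.full(4, 0.25), n_classes=2)
```

For `n_classes == 2` the `log(n_classes - 1)` term vanishes and the weight reduces to the classic two-class AdaBoost estimator weight.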
def predict(self, X):
"""Predict classes for X.
The predicted class of an input sample is computed as the weighted mean
prediction of the classifiers in the ensemble.
Parameters
----------
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. DOK and LIL are converted to CSR.
Returns
-------
y : array of shape = [n_samples]
The predicted classes.
"""
pred = self.decision_function(X)
if self.n_classes_ == 2:
return self.classes_.take(pred > 0, axis=0)
return self.classes_.take(np.argmax(pred, axis=1), axis=0)
def staged_predict(self, X):
"""Return staged predictions for X.
The predicted class of an input sample is computed as the weighted mean
prediction of the classifiers in the ensemble.
This generator method yields the ensemble prediction after each
iteration of boosting and therefore allows monitoring, such as to
determine the prediction on a test set after each boost.
Parameters
----------
X : array-like of shape = [n_samples, n_features]
The input samples.
Returns
-------
y : generator of array, shape = [n_samples]
The predicted classes.
"""
n_classes = self.n_classes_
classes = self.classes_
if n_classes == 2:
for pred in self.staged_decision_function(X):
yield np.array(classes.take(pred > 0, axis=0))
else:
for pred in self.staged_decision_function(X):
yield np.array(classes.take(
np.argmax(pred, axis=1), axis=0))
def decision_function(self, X):
"""Compute the decision function of ``X``.
Parameters
----------
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. DOK and LIL are converted to CSR.
Returns
-------
score : array, shape = [n_samples, k]
            The decision function of the input samples. The order of
            outputs is the same as that of the `classes_` attribute.
            Binary classification is a special case with ``k == 1``,
            otherwise ``k == n_classes``. For binary classification,
            values closer to -1 or 1 mean more like the first or second
            class in ``classes_``, respectively.
"""
check_is_fitted(self, "n_classes_")
X = self._validate_X_predict(X)
n_classes = self.n_classes_
classes = self.classes_[:, np.newaxis]
pred = None
if self.algorithm == 'SAMME.R':
# The weights are all 1. for SAMME.R
pred = sum(_samme_proba(estimator, n_classes, X)
for estimator in self.estimators_)
else: # self.algorithm == "SAMME"
pred = sum((estimator.predict(X) == classes).T * w
for estimator, w in zip(self.estimators_,
self.estimator_weights_))
pred /= self.estimator_weights_.sum()
if n_classes == 2:
pred[:, 0] *= -1
return pred.sum(axis=1)
return pred
def staged_decision_function(self, X):
"""Compute decision function of ``X`` for each boosting iteration.
This method allows monitoring (i.e. determine error on testing set)
after each boosting iteration.
Parameters
----------
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. DOK and LIL are converted to CSR.
Returns
-------
score : generator of array, shape = [n_samples, k]
            The decision function of the input samples. The order of
            outputs is the same as that of the `classes_` attribute.
            Binary classification is a special case with ``k == 1``,
            otherwise ``k == n_classes``. For binary classification,
            values closer to -1 or 1 mean more like the first or second
            class in ``classes_``, respectively.
"""
check_is_fitted(self, "n_classes_")
X = self._validate_X_predict(X)
n_classes = self.n_classes_
classes = self.classes_[:, np.newaxis]
pred = None
norm = 0.
for weight, estimator in zip(self.estimator_weights_,
self.estimators_):
norm += weight
if self.algorithm == 'SAMME.R':
# The weights are all 1. for SAMME.R
current_pred = _samme_proba(estimator, n_classes, X)
else: # elif self.algorithm == "SAMME":
current_pred = estimator.predict(X)
current_pred = (current_pred == classes).T * weight
if pred is None:
pred = current_pred
else:
pred += current_pred
if n_classes == 2:
tmp_pred = np.copy(pred)
tmp_pred[:, 0] *= -1
yield (tmp_pred / norm).sum(axis=1)
else:
yield pred / norm
def predict_proba(self, X):
"""Predict class probabilities for X.
        The predicted class probabilities of an input sample are computed as
        the weighted mean predicted class probabilities of the classifiers
        in the ensemble.
Parameters
----------
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. DOK and LIL are converted to CSR.
Returns
-------
        p : array of shape = [n_samples, n_classes]
            The class probabilities of the input samples. The order of
            outputs is the same as that of the `classes_` attribute.
"""
check_is_fitted(self, "n_classes_")
n_classes = self.n_classes_
X = self._validate_X_predict(X)
if self.algorithm == 'SAMME.R':
# The weights are all 1. for SAMME.R
proba = sum(_samme_proba(estimator, n_classes, X)
for estimator in self.estimators_)
else: # self.algorithm == "SAMME"
proba = sum(estimator.predict_proba(X) * w
for estimator, w in zip(self.estimators_,
self.estimator_weights_))
proba /= self.estimator_weights_.sum()
proba = np.exp((1. / (n_classes - 1)) * proba)
normalizer = proba.sum(axis=1)[:, np.newaxis]
normalizer[normalizer == 0.0] = 1.0
proba /= normalizer
return proba
def staged_predict_proba(self, X):
"""Predict class probabilities for X.
        The predicted class probabilities of an input sample are computed as
        the weighted mean predicted class probabilities of the classifiers
        in the ensemble.
This generator method yields the ensemble predicted class probabilities
after each iteration of boosting and therefore allows monitoring, such
as to determine the predicted class probabilities on a test set after
each boost.
Parameters
----------
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. DOK and LIL are converted to CSR.
Returns
-------
        p : generator of array, shape = [n_samples, n_classes]
            The class probabilities of the input samples. The order of
            outputs is the same as that of the `classes_` attribute.
"""
X = self._validate_X_predict(X)
n_classes = self.n_classes_
proba = None
norm = 0.
for weight, estimator in zip(self.estimator_weights_,
self.estimators_):
norm += weight
if self.algorithm == 'SAMME.R':
# The weights are all 1. for SAMME.R
current_proba = _samme_proba(estimator, n_classes, X)
else: # elif self.algorithm == "SAMME":
current_proba = estimator.predict_proba(X) * weight
if proba is None:
proba = current_proba
else:
proba += current_proba
real_proba = np.exp((1. / (n_classes - 1)) * (proba / norm))
normalizer = real_proba.sum(axis=1)[:, np.newaxis]
normalizer[normalizer == 0.0] = 1.0
real_proba /= normalizer
yield real_proba
def predict_log_proba(self, X):
"""Predict class log-probabilities for X.
        The predicted class log-probabilities of an input sample are computed
        as the weighted mean predicted class log-probabilities of the
        classifiers in the ensemble.
Parameters
----------
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. DOK and LIL are converted to CSR.
Returns
-------
        p : array of shape = [n_samples, n_classes]
            The class log-probabilities of the input samples. The order of
            outputs is the same as that of the `classes_` attribute.
"""
return np.log(self.predict_proba(X))
class AdaBoostRegressor(BaseWeightBoosting, RegressorMixin):
"""An AdaBoost regressor.
An AdaBoost [1] regressor is a meta-estimator that begins by fitting a
regressor on the original dataset and then fits additional copies of the
regressor on the same dataset but where the weights of instances are
adjusted according to the error of the current prediction. As such,
subsequent regressors focus more on difficult cases.
This class implements the algorithm known as AdaBoost.R2 [2].
Read more in the :ref:`User Guide <adaboost>`.
Parameters
----------
base_estimator : object, optional (default=DecisionTreeRegressor)
The base estimator from which the boosted ensemble is built.
Support for sample weighting is required.
n_estimators : integer, optional (default=50)
The maximum number of estimators at which boosting is terminated.
In case of perfect fit, the learning procedure is stopped early.
learning_rate : float, optional (default=1.)
Learning rate shrinks the contribution of each regressor by
``learning_rate``. There is a trade-off between ``learning_rate`` and
``n_estimators``.
loss : {'linear', 'square', 'exponential'}, optional (default='linear')
The loss function to use when updating the weights after each
boosting iteration.
random_state : int, RandomState instance or None, optional (default=None)
If int, random_state is the seed used by the random number generator;
If RandomState instance, random_state is the random number generator;
If None, the random number generator is the RandomState instance used
by `np.random`.
Attributes
----------
estimators_ : list of regressors
The collection of fitted sub-estimators.
estimator_weights_ : array of floats
Weights for each estimator in the boosted ensemble.
estimator_errors_ : array of floats
Regression error for each estimator in the boosted ensemble.
feature_importances_ : array of shape = [n_features]
The feature importances if supported by the ``base_estimator``.
See also
--------
AdaBoostClassifier, GradientBoostingRegressor, DecisionTreeRegressor
References
----------
.. [1] Y. Freund, R. Schapire, "A Decision-Theoretic Generalization of
On-Line Learning and an Application to Boosting", 1995.
.. [2] H. Drucker, "Improving Regressors using Boosting Techniques", 1997.
"""
def __init__(self,
base_estimator=None,
n_estimators=50,
learning_rate=1.,
loss='linear',
random_state=None):
super(AdaBoostRegressor, self).__init__(
base_estimator=base_estimator,
n_estimators=n_estimators,
learning_rate=learning_rate,
random_state=random_state)
self.loss = loss
self.random_state = random_state
def fit(self, X, y, sample_weight=None):
"""Build a boosted regressor from the training set (X, y).
Parameters
----------
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. DOK and LIL are converted to CSR.
y : array-like of shape = [n_samples]
The target values (real numbers).
sample_weight : array-like of shape = [n_samples], optional
Sample weights. If None, the sample weights are initialized to
1 / n_samples.
Returns
-------
self : object
Returns self.
"""
# Check loss
if self.loss not in ('linear', 'square', 'exponential'):
raise ValueError(
"loss must be 'linear', 'square', or 'exponential'")
# Fit
return super(AdaBoostRegressor, self).fit(X, y, sample_weight)
def _validate_estimator(self):
"""Check the estimator and set the base_estimator_ attribute."""
super(AdaBoostRegressor, self)._validate_estimator(
default=DecisionTreeRegressor(max_depth=3))
def _boost(self, iboost, X, y, sample_weight):
"""Implement a single boost for regression
Perform a single boost according to the AdaBoost.R2 algorithm and
return the updated sample weights.
Parameters
----------
iboost : int
The index of the current boost iteration.
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. DOK and LIL are converted to CSR.
y : array-like of shape = [n_samples]
The target values (class labels in classification, real numbers in
regression).
sample_weight : array-like of shape = [n_samples]
The current sample weights.
Returns
-------
sample_weight : array-like of shape = [n_samples] or None
The reweighted sample weights.
If None then boosting has terminated early.
estimator_weight : float
The weight for the current boost.
If None then boosting has terminated early.
estimator_error : float
The regression error for the current boost.
If None then boosting has terminated early.
"""
estimator = self._make_estimator()
try:
estimator.set_params(random_state=self.random_state)
except ValueError:
pass
generator = check_random_state(self.random_state)
# Weighted sampling of the training set with replacement
# For NumPy >= 1.7.0 use np.random.choice
cdf = sample_weight.cumsum()
cdf /= cdf[-1]
uniform_samples = generator.random_sample(X.shape[0])
bootstrap_idx = cdf.searchsorted(uniform_samples, side='right')
# searchsorted returns a scalar
bootstrap_idx = np.array(bootstrap_idx, copy=False)
# Fit on the bootstrapped sample and obtain a prediction
# for all samples in the training set
estimator.fit(X[bootstrap_idx], y[bootstrap_idx])
y_predict = estimator.predict(X)
error_vect = np.abs(y_predict - y)
error_max = error_vect.max()
if error_max != 0.:
error_vect /= error_max
if self.loss == 'square':
error_vect **= 2
elif self.loss == 'exponential':
error_vect = 1. - np.exp(- error_vect)
# Calculate the average loss
estimator_error = (sample_weight * error_vect).sum()
if estimator_error <= 0:
# Stop if fit is perfect
return sample_weight, 1., 0.
elif estimator_error >= 0.5:
# Discard current estimator only if it isn't the only one
if len(self.estimators_) > 1:
self.estimators_.pop(-1)
return None, None, None
beta = estimator_error / (1. - estimator_error)
# Boost weight using AdaBoost.R2 alg
estimator_weight = self.learning_rate * np.log(1. / beta)
if not iboost == self.n_estimators - 1:
sample_weight *= np.power(
beta,
(1. - error_vect) * self.learning_rate)
return sample_weight, estimator_weight, estimator_error
def _get_median_predict(self, X, limit):
# Evaluate predictions of all estimators
predictions = np.array([
est.predict(X) for est in self.estimators_[:limit]]).T
# Sort the predictions
sorted_idx = np.argsort(predictions, axis=1)
# Find index of median prediction for each sample
weight_cdf = self.estimator_weights_[sorted_idx].cumsum(axis=1)
median_or_above = weight_cdf >= 0.5 * weight_cdf[:, -1][:, np.newaxis]
median_idx = median_or_above.argmax(axis=1)
median_estimators = sorted_idx[np.arange(X.shape[0]), median_idx]
# Return median predictions
return predictions[np.arange(X.shape[0]), median_estimators]
def predict(self, X):
"""Predict regression value for X.
The predicted regression value of an input sample is computed
as the weighted median prediction of the regressors in the ensemble.
Parameters
----------
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. DOK and LIL are converted to CSR.
Returns
-------
y : array of shape = [n_samples]
The predicted regression values.
"""
check_is_fitted(self, "estimator_weights_")
X = self._validate_X_predict(X)
return self._get_median_predict(X, len(self.estimators_))
def staged_predict(self, X):
"""Return staged predictions for X.
The predicted regression value of an input sample is computed
as the weighted median prediction of the regressors in the ensemble.
This generator method yields the ensemble prediction after each
iteration of boosting and therefore allows monitoring, such as to
determine the prediction on a test set after each boost.
Parameters
----------
X : {array-like, sparse matrix} of shape = [n_samples, n_features]
The training input samples. Sparse matrix can be CSC, CSR, COO,
DOK, or LIL. DOK and LIL are converted to CSR.
Returns
-------
y : generator of array, shape = [n_samples]
The predicted regression values.
"""
check_is_fitted(self, "estimator_weights_")
X = self._validate_X_predict(X)
for i, _ in enumerate(self.estimators_, 1):
yield self._get_median_predict(X, limit=i)
| bsd-3-clause |
dilawar/moose-full | moose-examples/snippets/ionchannel.py | 2 | 8640 | # ionchannel.py ---
#
# Filename: ionchannel.py
# Description:
# Author: Subhasis Ray
# Maintainer:
# Created: Wed Sep 17 10:33:20 2014 (+0530)
# Version:
# Last-Updated:
# By:
# Update #: 0
# URL:
# Keywords:
# Compatibility:
#
#
# Commentary:
#
#
#
#
# Change log:
#
#
#
#
# This program is free software; you can redistribute it and/or
# modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; either version 3, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
# General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; see the file COPYING. If not, write to
# the Free Software Foundation, Inc., 51 Franklin Street, Fifth
# Floor, Boston, MA 02110-1301, USA.
#
#
# Code:
"""This demo shows how to set the parameters for a Hodgkin-Huxley type ion channel.
Hodgkin-Huxley type ion channels are composed of one or more gates
that allow ions to cross the membrane. The gates transition between
open and closed states and this, taken over a large population of
ion channels over a patch of membrane, has first-order kinetics, where
the rate of change of the fraction of open gates (n) is given by::
dn/dt = alpha(Vm) * (1 - n) - beta(Vm) * n
where alpha and beta are rate parameters for gate opening and
closing respectively that depend on the membrane potential.
The final channel conductance is computed as::
Gbar * m^x * h^y ...
where m and h are the fractions of open gates of different types and x,
y are the numbers of such gates in each channel. We can define the
channel by specifying the alpha and beta parameters as functions of
membrane potential and the exponents for each gate.
The number of gates is commonly one or two.
Gate opening/closing rates have the form::
y(x) = (A + B * x) / (C + exp((x + D) / F))
where x is membrane voltage and y is the rate parameter for gate
closing or opening.
"""
import numpy as np
import matplotlib.pyplot as plt
import moose
EREST_ACT = -70e-3 #: Resting membrane potential
#: The parameters for defining m as a function of Vm
Na_m_params = [1e5 * (25e-3 + EREST_ACT), # 'A_A':
-1e5, # 'A_B':
-1.0, # 'A_C':
-25e-3 - EREST_ACT, # 'A_D':
-10e-3, # 'A_F':
4e3, # 'B_A':
0.0, # 'B_B':
0.0, # 'B_C':
0.0 - EREST_ACT, # 'B_D':
18e-3 # 'B_F':
]
#: Parameters for defining h gate of Na+ channel
Na_h_params = [ 70.0, # 'A_A':
0.0, # 'A_B':
0.0, # 'A_C':
0.0 - EREST_ACT, # 'A_D':
0.02, # 'A_F':
1000.0, # 'B_A':
0.0, # 'B_B':
1.0, # 'B_C':
-30e-3 - EREST_ACT, # 'B_D':
-0.01 # 'B_F':
]
#: K+ channel in Hodgkin-Huxley model has only one gate, n, and these
#: are the parameters for the same
K_n_params = [ 1e4 * (10e-3 + EREST_ACT), # 'A_A':
-1e4, # 'A_B':
-1.0, # 'A_C':
-10e-3 - EREST_ACT, # 'A_D':
-10e-3, # 'A_F':
0.125e3, # 'B_A':
0.0, # 'B_B':
0.0, # 'B_C':
0.0 - EREST_ACT, # 'B_D':
80e-3 # 'B_F':
]
#: We define the rate parameters, which are functions of Vm, as
#: interpolation tables looked up by membrane potential.
#: Minimum x-value for the interpolation table
VMIN = -30e-3 + EREST_ACT
#: Maximum x-value for the interpolation table
VMAX = 120e-3 + EREST_ACT
#: Number of divisions in the interpolation table
VDIVS = 3000
def create_na_proto():
"""Create and return a Na+ channel prototype '/library/na'
The Na+ channel conductance has the equation::
g = Gbar * m^3 * h
For each gate, we use the HHChannel.setupAlpha function to set up
the interpolation table.
"""
lib = moose.Neutral('/library')
na = moose.HHChannel('/library/na')
na.tick = -1
#: The exponent for m gate is 3
na.Xpower = 3
#: channel/gateX is the m gate
#: setting Xpower to a positive number automatically creates this gate.
xGate = moose.element(na.path + '/gateX')
xGate.setupAlpha(Na_m_params +
[VDIVS, VMIN, VMAX])
#: channel/gateY is the h gate
#: Exponent for h gate is 1
na.Ypower = 1
yGate = moose.element(na.path + '/gateY')
yGate.setupAlpha(Na_h_params +
[VDIVS, VMIN, VMAX])
return na
def create_k_proto():
"""Create and return a K+ channel prototype '/library/k'.
The K+ channel conductance has the equation::
g = Gbar * n^4
"""
lib = moose.Neutral('/library')
k = moose.HHChannel('/library/k')
k.tick = -1
k.Xpower = 4.0
xGate = moose.HHGate(k.path + '/gateX')
xGate.setupAlpha(K_n_params +
[VDIVS, VMIN, VMAX])
return k
def create_1comp_neuron(path, number=1):
"""Create single-compartmental neuron with Na+ and K+ channels.
Parameters
----------
path : str
path of the compartment to be created
number : int
number of compartments to be created. If `number` is greater than 1,
we create a vec with that size, each having the same property.
Returns
-------
comp : moose.Compartment
a compartment vec with `number` elements.
"""
comps = moose.vec(path=path, n=number, dtype='Compartment')
diameter = 30e-6
length = 50e-6
sarea = np.pi * diameter * length
xarea = np.pi * diameter * diameter / 4.0
Em = EREST_ACT + 10.613e-3
comps.Em = Em
comps.initVm = EREST_ACT
#: CM = 1 uF/cm^2
comps.Cm = 1e-6 * sarea * 1e4
#: RM = 0.3 mS/cm^2
comps.Rm = 1 / (0.3e-3 * sarea * 1e4)
container = comps[0].parent.path
#: Here we create copies of the prototype channels
nachan = moose.copy(create_na_proto(), container, 'na_{}'.format(comps.name), number)
#: Gbar_Na = 120 mS/cm^2
nachan.Gbar = [120e-3 * sarea * 1e4] * len(nachan)
nachan.Ek = 115e-3 + EREST_ACT
moose.connect(nachan, 'channel', comps, 'channel', 'OneToOne')
kchan = moose.copy(create_k_proto(), container, 'k_{}'.format(comps.name), number)
#: Gbar_K = 36 mS/cm^2
kchan.Gbar = 36e-3 * sarea * 1e4
kchan.Ek = -12e-3 + EREST_ACT
moose.connect(kchan, 'channel', comps, 'channel', 'OneToOne')
return comps
def current_step_test(simtime, simdt, plotdt):
"""Create a single compartment and set it up for applying a step
current injection.
We use a PulseGen object to generate a 40 ms wide 1 nA current
pulse that starts 20 ms after start of simulation.
"""
model = moose.Neutral('/model')
comp = create_1comp_neuron('/model/neuron')
stim = moose.PulseGen('/model/stimulus')
stim.delay[0] = 20e-3
stim.level[0] = 1e-9
stim.width[0] = 40e-3
stim.delay[1] = 1e9
moose.connect(stim, 'output', comp, 'injectMsg')
data = moose.Neutral('/data')
current_tab = moose.Table('/data/current')
moose.connect(current_tab, 'requestOut', stim, 'getOutputValue')
vm_tab = moose.Table('/data/Vm')
moose.connect(vm_tab, 'requestOut', comp, 'getVm')
for i in range(10):
moose.setClock(i, simdt)
moose.setClock(8, plotdt)
moose.reinit()
moose.start(simtime)
ts = np.linspace(0, simtime, len(vm_tab.vector))
return ts, current_tab.vector, vm_tab.vector,
if __name__ == '__main__':
simtime = 0.1
simdt = 0.25e-5
plotdt = 0.25e-3
ts, current, vm = current_step_test(simtime, simdt, plotdt)
plt.plot(ts, vm * 1e3, label='Vm (mV)')
plt.plot(ts, current * 1e9, label='current (nA)')
plt.legend()
plt.show()
#
# ionchannel.py ends here
| gpl-2.0 |
obarquero/intro_machine_learning_udacity | Projects/ud120-projects-master/naive_bayes/nb_author_id.py | 1 | 1315 | #!/usr/bin/python
"""
this is the code to accompany the Lesson 1 (Naive Bayes) mini-project
use a Naive Bayes Classifier to identify emails by their authors
authors and labels:
Sara has label 0
Chris has label 1
"""
import sys
from time import time
sys.path.append("../tools/")
from email_preprocess import preprocess
### features_train and features_test are the features for the training
### and testing datasets, respectively
### labels_train and labels_test are the corresponding item labels
features_train, features_test, labels_train, labels_test = preprocess()
#########################################################
### your code goes here ###
#importing NB
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score #importing accuracy score
#create the object
clf = GaussianNB()
#fit to the train data
t0 = time()
clf.fit(features_train,labels_train)
print "training time:", round(time()-t0,3), "s"
#compute the accuracy on test data set
t0 = time()
y_pred = clf.predict(features_test)
print "predicting time:", round(time()-t0,3), "s"
#compute the accuracy
acc = accuracy_score(labels_test,y_pred)
print "accuracy: ", round(acc,4)
#test_time = time.clock() - start_time
#########################################################
| gpl-2.0 |
TNT-Samuel/Coding-Projects | DNS Server/Source - Copy/Lib/site-packages/dask/bag/core.py | 2 | 72889 | from __future__ import absolute_import, division, print_function
import io
import itertools
import math
import uuid
import warnings
from collections import Iterable, Iterator, defaultdict
from distutils.version import LooseVersion
from functools import wraps, partial
from operator import getitem
from random import Random
from toolz import (merge, take, reduce, valmap, map, partition_all, filter,
remove, compose, curry, first, second, accumulate, peek)
from toolz.compatibility import iteritems, zip
import toolz
_implement_accumulate = LooseVersion(toolz.__version__) > '0.7.4'
try:
import cytoolz
from cytoolz import (frequencies, merge_with, join, reduceby,
count, pluck, groupby, topk)
if LooseVersion(cytoolz.__version__) > '0.7.3':
from cytoolz import accumulate # noqa: F811
_implement_accumulate = True
except ImportError:
from toolz import (frequencies, merge_with, join, reduceby,
count, pluck, groupby, topk)
from .. import config
from ..base import tokenize, dont_optimize, is_dask_collection, DaskMethodsMixin
from ..bytes import open_files
from ..compatibility import apply, urlopen
from ..context import globalmethod
from ..core import quote, istask, get_dependencies, reverse_dict
from ..delayed import Delayed
from ..multiprocessing import get as mpget
from ..optimization import fuse, cull, inline
from ..utils import (system_encoding, takes_multiple_arguments, funcname,
digit, insert, ensure_dict, ensure_bytes, ensure_unicode)
no_default = '__no__default__'
no_result = type('no_result', (object,),
{'__slots__': (),
'__reduce__': lambda self: 'no_result'})
def lazify_task(task, start=True):
"""
Given a task, remove unnecessary calls to ``list`` and ``reify``.
This traverses tasks and small lists. We choose not to traverse down lists
of size >= 50 because it is unlikely that sequences this long contain other
sequences in practice.
Examples
--------
>>> task = (sum, (list, (map, inc, [1, 2, 3]))) # doctest: +SKIP
>>> lazify_task(task) # doctest: +SKIP
(sum, (map, inc, [1, 2, 3]))
"""
if type(task) is list and len(task) < 50:
return [lazify_task(arg, False) for arg in task]
if not istask(task):
return task
head, tail = task[0], task[1:]
if not start and head in (list, reify):
task = task[1]
return lazify_task(*tail, start=False)
else:
return (head,) + tuple([lazify_task(arg, False) for arg in tail])
def lazify(dsk):
"""
Remove unnecessary calls to ``list`` in tasks.
See Also
--------
``dask.bag.core.lazify_task``
"""
return valmap(lazify_task, dsk)
def inline_singleton_lists(dsk, dependencies=None):
""" Inline lists that are only used once.
>>> d = {'b': (list, 'a'),
... 'c': (f, 'b', 1)} # doctest: +SKIP
>>> inline_singleton_lists(d) # doctest: +SKIP
{'c': (f, (list, 'a'), 1)}
Pairs nicely with lazify afterwards.
"""
if dependencies is None:
dependencies = {k: get_dependencies(dsk, task=v)
for k, v in dsk.items()}
dependents = reverse_dict(dependencies)
keys = [k for k, v in dsk.items()
if istask(v) and v and v[0] is list and len(dependents[k]) == 1]
dsk = inline(dsk, keys, inline_constants=False)
for k in keys:
del dsk[k]
return dsk
def optimize(dsk, keys, fuse_keys=None, rename_fused_keys=True, **kwargs):
""" Optimize a dask from a dask Bag. """
dsk2, dependencies = cull(dsk, keys)
dsk3, dependencies = fuse(dsk2, keys + (fuse_keys or []), dependencies,
rename_keys=rename_fused_keys)
dsk4 = inline_singleton_lists(dsk3, dependencies)
dsk5 = lazify(dsk4)
return dsk5
def _to_textfiles_chunk(data, lazy_file, last_endline):
with lazy_file as f:
if isinstance(f, io.TextIOWrapper):
endline = u'\n'
ensure = ensure_unicode
else:
endline = b'\n'
ensure = ensure_bytes
started = False
for d in data:
if started:
f.write(endline)
else:
started = True
f.write(ensure(d))
if last_endline:
f.write(endline)
def to_textfiles(b, path, name_function=None, compression='infer',
encoding=system_encoding, compute=True, storage_options=None,
last_endline=False, **kwargs):
""" Write dask Bag to disk, one filename per partition, one line per element.
**Paths**: This will create one file for each partition in your bag. You
can specify the filenames in a variety of ways.
Use a globstring
>>> b.to_textfiles('/path/to/data/*.json.gz') # doctest: +SKIP
The * will be replaced by the increasing sequence 1, 2, ...
::
/path/to/data/0.json.gz
/path/to/data/1.json.gz
Use a globstring and a ``name_function=`` keyword argument. The
name_function function should expect an integer and produce a string.
Strings produced by name_function must preserve the order of their
respective partition indices.
>>> from datetime import date, timedelta
>>> def name(i):
... return str(date(2015, 1, 1) + i * timedelta(days=1))
>>> name(0)
'2015-01-01'
>>> name(15)
'2015-01-16'
>>> b.to_textfiles('/path/to/data/*.json.gz', name_function=name) # doctest: +SKIP
::
/path/to/data/2015-01-01.json.gz
/path/to/data/2015-01-02.json.gz
...
You can also provide an explicit list of paths.
>>> paths = ['/path/to/data/alice.json.gz', '/path/to/data/bob.json.gz', ...] # doctest: +SKIP
>>> b.to_textfiles(paths) # doctest: +SKIP
**Compression**: Filenames with extensions corresponding to known
compression algorithms (gz, bz2) will be compressed accordingly.
**Bag Contents**: The bag calling ``to_textfiles`` must be a bag of
text strings. For example, a bag of dictionaries could be written to
JSON text files by mapping ``json.dumps`` on to the bag first, and
then calling ``to_textfiles`` :
>>> b_dict.map(json.dumps).to_textfiles("/path/to/data/*.json") # doctest: +SKIP
**Last endline**: By default the last line does not end with a newline
character. Pass ``last_endline=True`` to invert the default.
"""
mode = 'wb' if encoding is None else 'wt'
files = open_files(path, compression=compression, mode=mode,
encoding=encoding, name_function=name_function,
num=b.npartitions, **(storage_options or {}))
name = 'to-textfiles-' + uuid.uuid4().hex
dsk = {(name, i): (_to_textfiles_chunk, (b.name, i), f, last_endline)
for i, f in enumerate(files)}
out = type(b)(merge(dsk, b.dask), name, b.npartitions)
if compute:
out.compute(**kwargs)
return [f.path for f in files]
else:
return out.to_delayed()
def finalize(results):
if not results:
return results
if isinstance(results, Iterator):
results = list(results)
if isinstance(results[0], Iterable) and not isinstance(results[0], str):
results = toolz.concat(results)
if isinstance(results, Iterator):
results = list(results)
return results
def finalize_item(results):
return results[0]
class StringAccessor(object):
""" String processing functions
Examples
--------
>>> import dask.bag as db
>>> b = db.from_sequence(['Alice Smith', 'Bob Jones', 'Charlie Smith'])
>>> list(b.str.lower())
['alice smith', 'bob jones', 'charlie smith']
>>> list(b.str.match('*Smith'))
['Alice Smith', 'Charlie Smith']
>>> list(b.str.split(' '))
[['Alice', 'Smith'], ['Bob', 'Jones'], ['Charlie', 'Smith']]
"""
def __init__(self, bag):
self._bag = bag
def __dir__(self):
return sorted(set(dir(type(self)) + dir(str)))
def _strmap(self, key, *args, **kwargs):
return self._bag.map(lambda s: getattr(s, key)(*args, **kwargs))
def __getattr__(self, key):
try:
return object.__getattribute__(self, key)
except AttributeError:
if key in dir(str):
func = getattr(str, key)
return robust_wraps(func)(partial(self._strmap, key))
else:
raise
def match(self, pattern):
""" Filter strings by those that match a pattern.
Examples
--------
>>> import dask.bag as db
>>> b = db.from_sequence(['Alice Smith', 'Bob Jones', 'Charlie Smith'])
>>> list(b.str.match('*Smith'))
['Alice Smith', 'Charlie Smith']
See Also
--------
fnmatch.fnmatch
"""
from fnmatch import fnmatch
return self._bag.filter(partial(fnmatch, pat=pattern))
def robust_wraps(wrapper):
""" A weak version of wraps that only copies doc. """
def _(wrapped):
wrapped.__doc__ = wrapper.__doc__
return wrapped
return _
class Item(DaskMethodsMixin):
def __init__(self, dsk, key):
self.dask = dsk
self.key = key
self.name = key
def __dask_graph__(self):
return self.dask
def __dask_keys__(self):
return [self.key]
def __dask_tokenize__(self):
return self.key
__dask_optimize__ = globalmethod(optimize, key='bag_optimize',
falsey=dont_optimize)
__dask_scheduler__ = staticmethod(mpget)
def __dask_postcompute__(self):
return finalize_item, ()
def __dask_postpersist__(self):
return Item, (self.key,)
@staticmethod
def from_delayed(value):
""" Create bag item from a dask.delayed value.
See ``dask.bag.from_delayed`` for details
"""
from dask.delayed import Delayed, delayed
if not isinstance(value, Delayed) and hasattr(value, 'key'):
value = delayed(value)
assert isinstance(value, Delayed)
return Item(ensure_dict(value.dask), value.key)
@property
def _args(self):
return (self.dask, self.key)
def __getstate__(self):
return self._args
def __setstate__(self, state):
self.dask, self.key = state
def apply(self, func):
name = 'apply-{0}-{1}'.format(funcname(func), tokenize(self, func))
dsk = {name: (func, self.key)}
return Item(merge(self.dask, dsk), name)
__int__ = __float__ = __complex__ = __bool__ = DaskMethodsMixin.compute
def to_delayed(self, optimize_graph=True):
"""Convert into a ``dask.delayed`` object.
Parameters
----------
optimize_graph : bool, optional
If True [default], the graph is optimized before converting into
``dask.delayed`` objects.
"""
from dask.delayed import Delayed
dsk = self.__dask_graph__()
if optimize_graph:
dsk = self.__dask_optimize__(dsk, self.__dask_keys__())
return Delayed(self.key, dsk)
class Bag(DaskMethodsMixin):
""" Parallel collection of Python objects
Examples
--------
Create Bag from sequence
>>> import dask.bag as db
>>> b = db.from_sequence(range(5))
>>> list(b.filter(lambda x: x % 2 == 0).map(lambda x: x * 10)) # doctest: +SKIP
[0, 20, 40]
Create Bag from filename or globstring of filenames
>>> b = db.read_text('/path/to/mydata.*.json.gz').map(json.loads) # doctest: +SKIP
Create manually (expert use)
>>> dsk = {('x', 0): (range, 5),
... ('x', 1): (range, 5),
... ('x', 2): (range, 5)}
>>> b = Bag(dsk, 'x', npartitions=3)
>>> sorted(b.map(lambda x: x * 10)) # doctest: +SKIP
[0, 0, 0, 10, 10, 10, 20, 20, 20, 30, 30, 30, 40, 40, 40]
>>> int(b.fold(lambda x, y: x + y)) # doctest: +SKIP
30
"""
def __init__(self, dsk, name, npartitions):
self.dask = dsk
self.name = name
self.npartitions = npartitions
def __dask_graph__(self):
return self.dask
def __dask_keys__(self):
return [(self.name, i) for i in range(self.npartitions)]
def __dask_tokenize__(self):
return self.name
__dask_optimize__ = globalmethod(optimize, key='bag_optimize',
falsey=dont_optimize)
__dask_scheduler__ = staticmethod(mpget)
def __dask_postcompute__(self):
return finalize, ()
def __dask_postpersist__(self):
return type(self), (self.name, self.npartitions)
def __str__(self):
name = self.name if len(self.name) < 10 else self.name[:7] + '...'
return 'dask.bag<%s, npartitions=%d>' % (name, self.npartitions)
__repr__ = __str__
str = property(fget=StringAccessor)
def map(self, func, *args, **kwargs):
"""Apply a function elementwise across one or more bags.
Note that all ``Bag`` arguments must be partitioned identically.
Parameters
----------
func : callable
*args, **kwargs : Bag, Item, or object
Extra arguments and keyword arguments to pass to ``func`` *after*
the calling bag instance. Non-Bag args/kwargs are broadcasted
across all calls to ``func``.
Notes
-----
For calls with multiple `Bag` arguments, corresponding partitions
should have the same length; if they do not, the call will error at
compute time.
Examples
--------
>>> import dask.bag as db
>>> b = db.from_sequence(range(5), npartitions=2)
>>> b2 = db.from_sequence(range(5, 10), npartitions=2)
Apply a function to all elements in a bag:
>>> b.map(lambda x: x + 1).compute()
[1, 2, 3, 4, 5]
Apply a function with arguments from multiple bags:
>>> from operator import add
>>> b.map(add, b2).compute()
[5, 7, 9, 11, 13]
Non-bag arguments are broadcast across all calls to the mapped
function:
>>> b.map(add, 1).compute()
[1, 2, 3, 4, 5]
Keyword arguments are also supported, and have the same semantics as
regular arguments:
>>> def myadd(x, y=0):
... return x + y
>>> b.map(myadd, y=b2).compute()
[5, 7, 9, 11, 13]
>>> b.map(myadd, y=1).compute()
[1, 2, 3, 4, 5]
Both arguments and keyword arguments can also be instances of
``dask.bag.Item``. Here we'll add the max value in the bag to each
element:
>>> b.map(myadd, b.max()).compute()
[4, 5, 6, 7, 8]
"""
return bag_map(func, self, *args, **kwargs)
def starmap(self, func, **kwargs):
"""Apply a function using argument tuples from the given bag.
This is similar to ``itertools.starmap``, except it also accepts
keyword arguments. In pseudocode, this could be written as:
>>> def starmap(func, bag, **kwargs):
... return (func(*args, **kwargs) for args in bag)
Parameters
----------
func : callable
**kwargs : Item, Delayed, or object, optional
Extra keyword arguments to pass to ``func``. These can either be
normal objects, ``dask.bag.Item``, or ``dask.delayed.Delayed``.
Examples
--------
>>> import dask.bag as db
>>> data = [(1, 2), (3, 4), (5, 6), (7, 8), (9, 10)]
>>> b = db.from_sequence(data, npartitions=2)
Apply a function to each argument tuple:
>>> from operator import add
>>> b.starmap(add).compute()
[3, 7, 11, 15, 19]
Apply a function to each argument tuple, with additional keyword
arguments:
>>> def myadd(x, y, z=0):
... return x + y + z
>>> b.starmap(myadd, z=10).compute()
[13, 17, 21, 25, 29]
Keyword arguments can also be instances of ``dask.bag.Item`` or
``dask.delayed.Delayed``:
>>> max_second = b.pluck(1).max()
>>> max_second.compute()
10
>>> b.starmap(myadd, z=max_second).compute()
[13, 17, 21, 25, 29]
"""
name = 'starmap-{0}-{1}'.format(funcname(func),
tokenize(self, func, kwargs))
dsk = self.dask.copy()
if kwargs:
kw_dsk, kwargs = unpack_scalar_dask_kwargs(kwargs)
dsk.update(kw_dsk)
dsk.update({(name, i): (reify, (starmap_chunk, func, (self.name, i), kwargs))
for i in range(self.npartitions)})
return type(self)(dsk, name, self.npartitions)
@property
def _args(self):
return (self.dask, self.name, self.npartitions)
def __getstate__(self):
return self._args
def __setstate__(self, state):
self.dask, self.name, self.npartitions = state
def filter(self, predicate):
""" Filter elements in collection by a predicate function.
>>> def iseven(x):
... return x % 2 == 0
>>> import dask.bag as db
>>> b = db.from_sequence(range(5))
>>> list(b.filter(iseven)) # doctest: +SKIP
[0, 2, 4]
"""
name = 'filter-{0}-{1}'.format(funcname(predicate),
tokenize(self, predicate))
dsk = dict(((name, i), (reify, (filter, predicate, (self.name, i))))
for i in range(self.npartitions))
return type(self)(merge(self.dask, dsk), name, self.npartitions)
def random_sample(self, prob, random_state=None):
""" Return elements from bag with probability of ``prob``.
Parameters
----------
prob : float
A float between 0 and 1, representing the probability that each
element will be returned.
random_state : int or random.Random, optional
If an integer, will be used to seed a new ``random.Random`` object.
If provided, results in deterministic sampling.
Examples
--------
>>> import dask.bag as db
>>> b = db.from_sequence(range(5))
>>> list(b.random_sample(0.5, 42))
[1, 3]
>>> list(b.random_sample(0.5, 42))
[1, 3]
"""
if not 0 <= prob <= 1:
raise ValueError('prob must be a number in the interval [0, 1]')
if not isinstance(random_state, Random):
random_state = Random(random_state)
name = 'random-sample-%s' % tokenize(self, prob, random_state.getstate())
state_data = random_state_data_python(self.npartitions, random_state)
dsk = {(name, i): (reify, (random_sample, (self.name, i), state, prob))
for i, state in zip(range(self.npartitions), state_data)}
return type(self)(merge(self.dask, dsk), name, self.npartitions)
def remove(self, predicate):
""" Remove elements in collection that match predicate.
>>> def iseven(x):
... return x % 2 == 0
>>> import dask.bag as db
>>> b = db.from_sequence(range(5))
>>> list(b.remove(iseven)) # doctest: +SKIP
[1, 3]
"""
name = 'remove-{0}-{1}'.format(funcname(predicate),
tokenize(self, predicate))
dsk = dict(((name, i), (reify, (remove, predicate, (self.name, i))))
for i in range(self.npartitions))
return type(self)(merge(self.dask, dsk), name, self.npartitions)
def map_partitions(self, func, *args, **kwargs):
"""Apply a function to every partition across one or more bags.
Note that all ``Bag`` arguments must be partitioned identically.
Parameters
----------
func : callable
The function to be called on every partition.
This function should expect an ``Iterator`` or ``Iterable`` for
every partition and should return an ``Iterator`` or ``Iterable``.
*args, **kwargs : Bag, Item, Delayed, or object
Arguments and keyword arguments to pass to ``func``.
Partitions from this bag will be the first argument, and these will
be passed *after*.
Examples
--------
>>> import dask.bag as db
>>> b = db.from_sequence(range(1, 101), npartitions=10)
>>> def div(nums, den=1):
... return [num / den for num in nums]
Using a python object:
>>> hi = b.max().compute()
>>> hi
100
>>> b.map_partitions(div, den=hi).take(5)
(0.01, 0.02, 0.03, 0.04, 0.05)
Using an ``Item``:
>>> b.map_partitions(div, den=b.max()).take(5)
(0.01, 0.02, 0.03, 0.04, 0.05)
Note that while both versions give the same output, the second forms a
single graph and computes everything at once, which in some cases
may be more efficient.
"""
return map_partitions(func, self, *args, **kwargs)
def pluck(self, key, default=no_default):
""" Select item from all tuples/dicts in collection.
>>> b = from_sequence([{'name': 'Alice', 'credits': [1, 2, 3]},
... {'name': 'Bob', 'credits': [10, 20]}])
>>> list(b.pluck('name')) # doctest: +SKIP
['Alice', 'Bob']
>>> list(b.pluck('credits').pluck(0)) # doctest: +SKIP
[1, 10]
"""
name = 'pluck-' + tokenize(self, key, default)
key = quote(key)
if default == no_default:
dsk = dict(((name, i), (list, (pluck, key, (self.name, i))))
for i in range(self.npartitions))
else:
dsk = dict(((name, i), (list, (pluck, key, (self.name, i), default)))
for i in range(self.npartitions))
return type(self)(merge(self.dask, dsk), name, self.npartitions)
def unzip(self, n):
"""Transform a bag of tuples to ``n`` bags of their elements.
Examples
--------
>>> b = from_sequence([(i, i + 1, i + 2) for i in range(10)])
>>> first, second, third = b.unzip(3)
>>> isinstance(first, Bag)
True
>>> first.compute()
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
Note that this is equivalent to:
>>> first, second, third = (b.pluck(i) for i in range(3))
"""
return tuple(self.pluck(i) for i in range(n))
@wraps(to_textfiles)
def to_textfiles(self, path, name_function=None, compression='infer',
encoding=system_encoding, compute=True,
storage_options=None, last_endline=False, **kwargs):
return to_textfiles(self, path, name_function, compression, encoding,
compute, storage_options=storage_options,
last_endline=last_endline, **kwargs)
def fold(self, binop, combine=None, initial=no_default, split_every=None):
""" Parallelizable reduction
Fold is like the builtin function ``reduce`` except that it works in
parallel. Fold takes two binary operator functions, one to reduce each
partition of our dataset and another to combine results between
partitions
1. ``binop``: Binary operator to reduce within each partition
2. ``combine``: Binary operator to combine results from binop
Sequentially this would look like the following:
>>> intermediates = [reduce(binop, part) for part in partitions] # doctest: +SKIP
>>> final = reduce(combine, intermediates) # doctest: +SKIP
If only one function is given then it is used for both functions
``binop`` and ``combine`` as in the following example to compute the
sum:
>>> def add(x, y):
... return x + y
>>> b = from_sequence(range(5))
>>> b.fold(add).compute() # doctest: +SKIP
10
In full form we provide both binary operators as well as their default
arguments
>>> b.fold(binop=add, combine=add, initial=0).compute() # doctest: +SKIP
10
More complex binary operators are also doable
>>> def add_to_set(acc, x):
... ''' Add new element x to set acc '''
... return acc | set([x])
>>> b.fold(add_to_set, set.union, initial=set()).compute() # doctest: +SKIP
{1, 2, 3, 4, 5}
See Also
--------
Bag.foldby
"""
combine = combine or binop
if initial is not no_default:
return self.reduction(curry(_reduce, binop, initial=initial),
curry(_reduce, combine),
split_every=split_every)
else:
from toolz.curried import reduce
return self.reduction(reduce(binop), reduce(combine),
split_every=split_every)
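The sequential equivalent shown in the docstring can be run directly; this sketch simulates the parallel fold with plain ``functools.reduce``:

```python
from functools import reduce

add = lambda x, y: x + y
partitions = [[0, 1, 2], [3, 4]]          # a bag split into two partitions

# binop reduces within each partition; combine merges the intermediates
intermediates = [reduce(add, part, 0) for part in partitions]
total = reduce(add, intermediates, 0)
assert total == 10                        # same as b.fold(add).compute()
```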
def frequencies(self, split_every=None):
""" Count number of occurrences of each distinct element.
>>> b = from_sequence(['Alice', 'Bob', 'Alice'])
>>> dict(b.frequencies()) # doctest: +SKIP
{'Alice': 2, 'Bob': 1}
"""
return self.reduction(frequencies, merge_frequencies,
out_type=Bag, split_every=split_every,
name='frequencies').map_partitions(dictitems)
def topk(self, k, key=None, split_every=None):
""" K largest elements in collection
Optionally ordered by some key function
>>> b = from_sequence([10, 3, 5, 7, 11, 4])
>>> list(b.topk(2)) # doctest: +SKIP
[11, 10]
>>> list(b.topk(2, lambda x: -x)) # doctest: +SKIP
[3, 4]
"""
if key:
if callable(key) and takes_multiple_arguments(key):
key = partial(apply, key)
func = partial(topk, k, key=key)
else:
func = partial(topk, k)
return self.reduction(func, compose(func, toolz.concat), out_type=Bag,
split_every=split_every, name='topk')
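``topk`` works as a reduction because taking the top ``k`` of each partition and then the top ``k`` of the concatenated partial results gives the global answer. A sketch with ``heapq.nlargest`` standing in for the per-partition function:

```python
import heapq
from itertools import chain

def topk(k, seq, key=None):
    return heapq.nlargest(k, seq, key=key)

partitions = [[10, 3, 5], [7, 11, 4]]
per_part = [topk(2, p) for p in partitions]        # top 2 of each partition
result = topk(2, chain.from_iterable(per_part))    # combine step
assert result == [11, 10]
# a key function inverts the ordering, as in the docstring example
assert topk(2, chain.from_iterable(partitions), key=lambda x: -x) == [3, 4]
```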
def distinct(self):
""" Distinct elements of collection
Unordered without repeats.
>>> b = from_sequence(['Alice', 'Bob', 'Alice'])
>>> sorted(b.distinct())
['Alice', 'Bob']
"""
return self.reduction(set, merge_distinct, out_type=Bag,
name='distinct')
def reduction(self, perpartition, aggregate, split_every=None,
out_type=Item, name=None):
""" Reduce collection with reduction operators.
Parameters
----------
perpartition: function
reduction to apply to each partition
aggregate: function
reduction to apply to the results of all partitions
split_every: int (optional)
Group partitions into groups of this size while performing reduction
Defaults to 8
out_type: {Bag, Item}
The out type of the result, Item if a single element, Bag if a list
of elements. Defaults to Item.
Examples
--------
>>> b = from_sequence(range(10))
>>> b.reduction(sum, sum).compute()
45
"""
if split_every is None:
split_every = 8
if split_every is False:
split_every = self.npartitions
token = tokenize(self, perpartition, aggregate, split_every)
a = '%s-part-%s' % (name or funcname(perpartition), token)
is_last = self.npartitions == 1
dsk = {(a, i): (empty_safe_apply, perpartition, (self.name, i), is_last)
for i in range(self.npartitions)}
k = self.npartitions
b = a
fmt = '%s-aggregate-%s' % (name or funcname(aggregate), token)
depth = 0
while k > split_every:
c = fmt + str(depth)
dsk2 = dict(((c, i), (empty_safe_aggregate, aggregate,
[(b, j) for j in inds], False))
for i, inds in enumerate(partition_all(split_every,
range(k))))
dsk.update(dsk2)
k = len(dsk2)
b = c
depth += 1
dsk[(fmt, 0)] = (empty_safe_aggregate, aggregate,
[(b, j) for j in range(k)], True)
if out_type is Item:
dsk[fmt] = dsk.pop((fmt, 0))
return Item(merge(self.dask, dsk), fmt)
else:
return Bag(merge(self.dask, dsk), fmt, 1)
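The ``split_every`` loop above builds a tree of aggregations so that no single task combines more than ``split_every`` inputs. A pure-Python sketch of the same tree reduction (``partition_all`` is re-implemented here to keep the example stdlib-only):

```python
def partition_all(n, seq):
    seq = list(seq)
    return [seq[i:i + n] for i in range(0, len(seq), n)]

def tree_reduce(perpartition, aggregate, partitions, split_every=8):
    vals = [perpartition(p) for p in partitions]
    # repeatedly aggregate groups of split_every until few enough remain
    while len(vals) > split_every:
        vals = [aggregate(group) for group in partition_all(split_every, vals)]
    return aggregate(vals)

parts = [list(range(i, i + 10)) for i in range(0, 100, 10)]
assert tree_reduce(sum, sum, parts, split_every=4) == sum(range(100))
```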
def sum(self, split_every=None):
""" Sum all elements """
return self.reduction(sum, sum, split_every=split_every)
def max(self, split_every=None):
""" Maximum element """
return self.reduction(max, max, split_every=split_every)
def min(self, split_every=None):
""" Minimum element """
return self.reduction(min, min, split_every=split_every)
def any(self, split_every=None):
""" Are any of the elements truthy? """
return self.reduction(any, any, split_every=split_every)
def all(self, split_every=None):
""" Are all elements truthy? """
return self.reduction(all, all, split_every=split_every)
def count(self, split_every=None):
""" Count the number of elements. """
return self.reduction(count, sum, split_every=split_every)
def mean(self):
""" Arithmetic mean """
def mean_chunk(seq):
total, n = 0.0, 0
for x in seq:
total += x
n += 1
return total, n
def mean_aggregate(x):
totals, counts = list(zip(*x))
return 1.0 * sum(totals) / sum(counts)
return self.reduction(mean_chunk, mean_aggregate, split_every=False)
def var(self, ddof=0):
""" Variance """
def var_chunk(seq):
squares, total, n = 0.0, 0.0, 0
for x in seq:
squares += x**2
total += x
n += 1
return squares, total, n
def var_aggregate(x):
squares, totals, counts = list(zip(*x))
x2, x, n = float(sum(squares)), float(sum(totals)), sum(counts)
result = (x2 / n) - (x / n)**2
return result * n / (n - ddof)
return self.reduction(var_chunk, var_aggregate, split_every=False)
def std(self, ddof=0):
""" Standard deviation """
return self.var(ddof=ddof).apply(math.sqrt)
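``var`` is a good illustration of this reduction pattern: each partition contributes a (sum of squares, sum, count) triple, and the aggregate applies E[x²] − E[x]² with the ``ddof`` correction. The chunk and aggregate functions run standalone:

```python
def var_chunk(seq):
    seq = list(seq)
    return sum(x * x for x in seq), sum(seq), len(seq)

def var_aggregate(chunks, ddof=0):
    squares, totals, counts = zip(*chunks)
    x2, x, n = float(sum(squares)), float(sum(totals)), sum(counts)
    result = (x2 / n) - (x / n) ** 2
    return result * n / (n - ddof)

chunks = [var_chunk([1, 2, 3]), var_chunk([4, 5])]
variance = var_aggregate(chunks)    # population variance of 1..5
assert abs(variance - 2.0) < 1e-12
```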
def join(self, other, on_self, on_other=None):
""" Joins collection with another collection.
Other collection must be one of the following:
1. An iterable. We recommend tuples over lists for internal
performance reasons.
2. A delayed object, pointing to a tuple. This is recommended if the
other collection is sizable and you're using the distributed
scheduler. Dask is able to pass around data wrapped in delayed
objects with greater sophistication.
3. A Bag with a single partition
You might also consider Dask Dataframe, whose join operations are much
more heavily optimized.
Parameters
----------
other: Iterable, Delayed, Bag
Other collection on which to join
on_self: callable
Function to call on elements in this collection to determine a
match
on_other: callable (defaults to on_self)
Function to call on elements in the other collection to determine a
match
Examples
--------
>>> people = from_sequence(['Alice', 'Bob', 'Charlie'])
>>> fruit = ['Apple', 'Apricot', 'Banana']
>>> list(people.join(fruit, lambda x: x[0])) # doctest: +SKIP
[('Apple', 'Alice'), ('Apricot', 'Alice'), ('Banana', 'Bob')]
"""
name = 'join-' + tokenize(self, other, on_self, on_other)
dsk = {}
if isinstance(other, Bag):
if other.npartitions == 1:
dsk.update(other.dask)
other = other.__dask_keys__()[0]
dsk['join-%s-other' % name] = (list, other)
else:
msg = ("Multi-bag joins are not implemented. "
"We recommend Dask dataframe if appropriate")
raise NotImplementedError(msg)
elif isinstance(other, Delayed):
dsk.update(other.dask)
other = other._key
elif isinstance(other, Iterable):
other = other
else:
msg = ("Joined argument must be a single-partition Bag, "
"delayed object, or Iterable, got %s" %
type(other).__name__)
raise TypeError(msg)
if on_other is None:
on_other = on_self
dsk.update({(name, i): (list, (join, on_other, other,
on_self, (self.name, i)))
for i in range(self.npartitions)})
return type(self)(merge(self.dask, dsk), name, self.npartitions)
def product(self, other):
""" Cartesian product between two bags. """
assert isinstance(other, Bag)
name = 'product-' + tokenize(self, other)
n, m = self.npartitions, other.npartitions
dsk = dict(((name, i * m + j),
(list, (itertools.product, (self.name, i),
(other.name, j))))
for i in range(n) for j in range(m))
return type(self)(merge(self.dask, other.dask, dsk), name, n * m)
def foldby(self, key, binop, initial=no_default, combine=None,
combine_initial=no_default, split_every=None):
""" Combined reduction and groupby.
Foldby provides a combined groupby and reduce for efficient parallel
split-apply-combine tasks.
The computation
>>> b.foldby(key, binop, init) # doctest: +SKIP
is equivalent to the following:
>>> def reduction(group): # doctest: +SKIP
... return reduce(binop, group, init) # doctest: +SKIP
>>> b.groupby(key).map(lambda kv: (kv[0], reduction(kv[1]))) # doctest: +SKIP
But uses minimal communication and so is *much* faster.
>>> b = from_sequence(range(10))
>>> iseven = lambda x: x % 2 == 0
>>> add = lambda x, y: x + y
>>> dict(b.foldby(iseven, add)) # doctest: +SKIP
{True: 20, False: 25}
**Key Function**
The key function determines how to group the elements in your bag.
In the common case where your bag holds dictionaries, the key
function often selects one of those values.
>>> def key(x):
... return x['name']
This case is so common that it is special cased, and if you provide a
key that is not a callable function then dask.bag will turn it into one
automatically. The following are equivalent:
>>> b.foldby(lambda x: x['name'], ...) # doctest: +SKIP
>>> b.foldby('name', ...) # doctest: +SKIP
**Binops**
It can be tricky to construct the right binary operators to perform
analytic queries. The ``foldby`` method accepts two binary operators,
``binop`` and ``combine``. Each binary operator's two inputs and output
must have the same type.
Binop takes a running total and a new element and produces a new total:
>>> def binop(total, x):
... return total + x['amount']
Combine takes two totals and combines them:
>>> def combine(total1, total2):
... return total1 + total2
Each of these binary operators may have a default first value for
the total, before any other value is seen. For addition binary operators
like the one above this is often ``0``, or more generally the identity
element for your operation.
**split_every**
Group partitions into groups of this size while performing reduction.
Defaults to 8.
>>> b.foldby('name', binop, 0, combine, 0) # doctest: +SKIP
See Also
--------
toolz.reduceby
pyspark.combineByKey
"""
if split_every is None:
split_every = 8
if split_every is False:
split_every = self.npartitions
token = tokenize(self, key, binop, initial, combine, combine_initial)
a = 'foldby-a-' + token
if combine is None:
combine = binop
if initial is not no_default:
dsk = {(a, i): (reduceby, key, binop, (self.name, i), initial)
for i in range(self.npartitions)}
else:
dsk = {(a, i): (reduceby, key, binop, (self.name, i))
for i in range(self.npartitions)}
def combine2(acc, x):
return combine(acc, x[1])
depth = 0
k = self.npartitions
b = a
while k > split_every:
c = b + str(depth)
if combine_initial is not no_default:
dsk2 = {(c, i): (reduceby, 0, combine2,
(toolz.concat, (map, dictitems,
[(b, j) for j in inds])),
combine_initial)
for i, inds in enumerate(partition_all(split_every,
range(k)))}
else:
dsk2 = {(c, i): (merge_with, (partial, reduce, combine),
[(b, j) for j in inds])
for i, inds in enumerate(partition_all(split_every,
range(k)))}
dsk.update(dsk2)
k = len(dsk2)
b = c
depth += 1
e = 'foldby-b-' + token
if combine_initial is not no_default:
dsk[(e, 0)] = (dictitems, (reduceby, 0, combine2,
(toolz.concat, (map, dictitems,
[(b, j) for j in range(k)])),
combine_initial))
else:
dsk[(e, 0)] = (dictitems, (merge_with, (partial, reduce, combine),
[(b, j) for j in range(k)]))
return type(self)(merge(self.dask, dsk), e, 1)
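The first stage of ``foldby`` is ``toolz.reduceby`` applied per partition, and the combine stage merges the resulting dicts key by key. A stdlib-only sketch of both stages (these helpers approximate the toolz functions, they are not dask's own):

```python
def reduceby(key, binop, seq, initial):
    # group-and-reduce within one partition
    out = {}
    for x in seq:
        k = key(x)
        out[k] = binop(out.get(k, initial), x)
    return out

def merge_dicts(d1, d2, combine):
    # combine per-partition results key by key
    out = dict(d1)
    for k, v in d2.items():
        out[k] = combine(out[k], v) if k in out else v
    return out

iseven = lambda x: x % 2 == 0
add = lambda x, y: x + y
partitions = [range(0, 5), range(5, 10)]
per_part = [reduceby(iseven, add, p, 0) for p in partitions]
result = merge_dicts(per_part[0], per_part[1], add)
assert result == {True: 20, False: 25}    # matches the docstring example
```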
def take(self, k, npartitions=1, compute=True, warn=True):
""" Take the first k elements.
Parameters
----------
k : int
The number of elements to return
npartitions : int, optional
Elements are only taken from the first ``npartitions``, with a
default of 1. If there are fewer than ``k`` elements in the first
``npartitions``, a warning will be raised and any elements found
will be returned. Pass -1 to use all partitions.
compute : bool, optional
Whether to compute the result, default is True.
warn : bool, optional
Whether to warn if the number of elements returned is less than
requested, default is True.
>>> b = from_sequence(range(10))
>>> b.take(3) # doctest: +SKIP
(0, 1, 2)
"""
if npartitions <= -1:
npartitions = self.npartitions
if npartitions > self.npartitions:
raise ValueError("only {} partitions, take "
"received {}".format(self.npartitions, npartitions))
token = tokenize(self, k, npartitions)
name = 'take-' + token
if npartitions > 1:
name_p = 'take-partial-' + token
dsk = {}
for i in range(npartitions):
dsk[(name_p, i)] = (list, (take, k, (self.name, i)))
concat = (toolz.concat, ([(name_p, i) for i in range(npartitions)]))
dsk[(name, 0)] = (safe_take, k, concat, warn)
else:
dsk = {(name, 0): (safe_take, k, (self.name, 0), warn)}
b = Bag(merge(self.dask, dsk), name, 1)
if compute:
return tuple(b.compute())
else:
return b
def flatten(self):
""" Concatenate nested lists into one long list.
>>> b = from_sequence([[1], [2, 3]])
>>> list(b)
[[1], [2, 3]]
>>> list(b.flatten())
[1, 2, 3]
"""
name = 'flatten-' + tokenize(self)
dsk = dict(((name, i), (list, (toolz.concat, (self.name, i))))
for i in range(self.npartitions))
return type(self)(merge(self.dask, dsk), name, self.npartitions)
def __iter__(self):
return iter(self.compute())
def groupby(self, grouper, method=None, npartitions=None, blocksize=2**20,
max_branch=None, shuffle=None):
""" Group collection by key function
This requires a full dataset read, serialization and shuffle.
This is expensive. If possible you should use ``foldby``.
Parameters
----------
grouper: function
Function on which to group elements
shuffle: str
Either 'disk' for an on-disk shuffle or 'tasks' to use the task
scheduling framework. Use 'disk' if you are on a single machine
and 'tasks' if you are on a distributed cluster.
npartitions: int
If using the disk-based shuffle, the number of output partitions
blocksize: int
If using the disk-based shuffle, the size of shuffle blocks (bytes)
max_branch: int
If using the task-based shuffle, the amount of splitting each
partition undergoes. Increase this for fewer copies but more
scheduler overhead.
Examples
--------
>>> b = from_sequence(range(10))
>>> iseven = lambda x: x % 2 == 0
>>> dict(b.groupby(iseven)) # doctest: +SKIP
{True: [0, 2, 4, 6, 8], False: [1, 3, 5, 7, 9]}
See Also
--------
Bag.foldby
"""
if method is not None:
raise Exception("The method= keyword has been moved to shuffle=")
if shuffle is None:
shuffle = config.get('shuffle', None)
if shuffle is None:
if 'distributed' in config.get('scheduler', ''):
shuffle = 'tasks'
else:
shuffle = 'disk'
if shuffle == 'disk':
return groupby_disk(self, grouper, npartitions=npartitions,
blocksize=blocksize)
elif shuffle == 'tasks':
return groupby_tasks(self, grouper, max_branch=max_branch)
else:
msg = "Shuffle must be 'disk' or 'tasks'"
raise NotImplementedError(msg)
def to_dataframe(self, meta=None, columns=None):
""" Create Dask Dataframe from a Dask Bag.
Bag should contain tuples, dict records, or scalars.
Index will not be particularly meaningful. Use ``reindex`` afterwards
if necessary.
Parameters
----------
meta : pd.DataFrame, dict, iterable, optional
An empty ``pd.DataFrame`` that matches the dtypes and column names
of the output. This metadata is necessary for many algorithms in
dask dataframe to work. For ease of use, some alternative inputs
are also available. Instead of a ``DataFrame``, a ``dict`` of
``{name: dtype}`` or iterable of ``(name, dtype)`` can be provided.
If not provided or a list, a single element from the first
partition will be computed, triggering a potentially expensive call
to ``compute``. This may lead to unexpected results, so providing
``meta`` is recommended. For more information, see
``dask.dataframe.utils.make_meta``.
columns : sequence, optional
Column names to use. If the passed data do not have names
associated with them, this argument provides names for the columns.
Otherwise this argument indicates the order of the columns in the
result (any names not found in the data will become all-NA
columns). Note that if ``meta`` is provided, column names will be
taken from there and this parameter is invalid.
Examples
--------
>>> import dask.bag as db
>>> b = db.from_sequence([{'name': 'Alice', 'balance': 100},
... {'name': 'Bob', 'balance': 200},
... {'name': 'Charlie', 'balance': 300}],
... npartitions=2)
>>> df = b.to_dataframe()
>>> df.compute()
balance name
0 100 Alice
1 200 Bob
0 300 Charlie
"""
import pandas as pd
import dask.dataframe as dd
if meta is None:
if isinstance(columns, pd.DataFrame):
warnings.warn("Passing metadata to `columns` is deprecated. "
"Please use the `meta` keyword instead.")
meta = columns
else:
head = self.take(1, warn=False)
if len(head) == 0:
raise ValueError("`dask.bag.Bag.to_dataframe` failed to "
"properly infer metadata, please pass in "
"metadata via the `meta` keyword")
meta = pd.DataFrame(list(head), columns=columns)
elif columns is not None:
raise ValueError("Can't specify both `meta` and `columns`")
else:
meta = dd.utils.make_meta(meta)
# Serializing the columns and dtypes is much smaller than serializing
# the empty frame
cols = list(meta.columns)
dtypes = meta.dtypes.to_dict()
name = 'to_dataframe-' + tokenize(self, cols, dtypes)
dsk = self.__dask_optimize__(self.dask, self.__dask_keys__())
dsk.update({(name, i): (to_dataframe, (self.name, i), cols, dtypes)
for i in range(self.npartitions)})
divisions = [None] * (self.npartitions + 1)
return dd.DataFrame(dsk, name, meta, divisions)
def to_delayed(self, optimize_graph=True):
"""Convert into a list of ``dask.delayed`` objects, one per partition.
Parameters
----------
optimize_graph : bool, optional
If True [default], the graph is optimized before converting into
``dask.delayed`` objects.
See Also
--------
dask.bag.from_delayed
"""
from dask.delayed import Delayed
keys = self.__dask_keys__()
dsk = self.__dask_graph__()
if optimize_graph:
dsk = self.__dask_optimize__(dsk, keys)
return [Delayed(k, dsk) for k in keys]
def repartition(self, npartitions):
""" Coalesce bag into fewer partitions.
Examples
--------
>>> b.repartition(5) # set to have 5 partitions # doctest: +SKIP
"""
new_name = 'repartition-%d-%s' % (npartitions, tokenize(self, npartitions))
if npartitions == self.npartitions:
return self
elif npartitions < self.npartitions:
ratio = self.npartitions / npartitions
new_partitions_boundaries = [int(old_partition_index * ratio)
for old_partition_index in range(npartitions + 1)]
dsk = {}
for new_partition_index in range(npartitions):
value = (list, (toolz.concat,
[(self.name, old_partition_index)
for old_partition_index in
range(new_partitions_boundaries[new_partition_index],
new_partitions_boundaries[new_partition_index + 1])]))
dsk[new_name, new_partition_index] = value
else: # npartitions > self.npartitions
ratio = npartitions / self.npartitions
split_name = 'split-%s' % tokenize(self, npartitions)
dsk = {}
last = 0
j = 0
for i in range(self.npartitions):
new = last + ratio
if i == self.npartitions - 1:
k = npartitions - j
else:
k = int(new - last)
dsk[(split_name, i)] = (split, (self.name, i), k)
for jj in range(k):
dsk[(new_name, j)] = (getitem, (split_name, i), jj)
j += 1
last = new
return Bag(dsk=merge(self.dask, dsk), name=new_name, npartitions=npartitions)
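When coalescing, the boundary list maps each new partition to a contiguous slice of old partitions. A sketch of that arithmetic on plain lists in place of graph tasks:

```python
def coalesce(partitions, new_n):
    old_n = len(partitions)
    ratio = old_n / new_n
    bounds = [int(i * ratio) for i in range(new_n + 1)]
    # concatenate each contiguous slice of old partitions
    return [sum(partitions[bounds[i]:bounds[i + 1]], [])
            for i in range(new_n)]

parts = [[0], [1], [2], [3], [4]]
out = coalesce(parts, 2)
assert out == [[0, 1], [2, 3, 4]]
assert sum(out, []) == [0, 1, 2, 3, 4]    # no elements lost or reordered
```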
def accumulate(self, binop, initial=no_default):
""" Repeatedly apply binary function to a sequence, accumulating results.
This assumes that the bag is ordered. While this is typically the case
not all Dask.bag functions preserve this property.
Examples
--------
>>> from operator import add
>>> b = from_sequence([1, 2, 3, 4, 5], npartitions=2)
>>> b.accumulate(add).compute() # doctest: +SKIP
[1, 3, 6, 10, 15]
Accumulate also takes an optional argument that will be used as the
first value.
>>> b.accumulate(add, initial=-1) # doctest: +SKIP
[-1, 0, 2, 5, 9, 14]
"""
if not _implement_accumulate:
raise NotImplementedError("accumulate requires `toolz` > 0.7.4"
" or `cytoolz` > 0.7.3.")
token = tokenize(self, binop, initial)
binop_name = funcname(binop)
a = '%s-part-%s' % (binop_name, token)
b = '%s-first-%s' % (binop_name, token)
c = '%s-second-%s' % (binop_name, token)
dsk = {(a, 0): (accumulate_part, binop, (self.name, 0), initial, True),
(b, 0): (first, (a, 0)),
(c, 0): (second, (a, 0))}
for i in range(1, self.npartitions):
dsk[(a, i)] = (accumulate_part, binop, (self.name, i), (c, i - 1))
dsk[(b, i)] = (first, (a, i))
dsk[(c, i)] = (second, (a, i))
return Bag(merge(self.dask, dsk), b, self.npartitions)
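The key idea in ``accumulate`` is the carry: each partition prepends the running total from its predecessor, accumulates, and drops the prepended value from its output. A simplified sketch that always takes a carry (``accumulate_part`` below also handles the no-``initial`` case):

```python
from itertools import accumulate as _accumulate
from operator import add

def accumulate_with_carry(binop, seq, carry):
    res = list(_accumulate([carry] + list(seq), binop))
    return res[1:], res[-1]   # (this partition's output, carry for the next)

partitions = [[1, 2, 3], [4, 5]]
out, carry = [], 0
for part in partitions:
    res, carry = accumulate_with_carry(add, part, carry)
    out.extend(res)
assert out == [1, 3, 6, 10, 15]           # matches b.accumulate(add)
```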
def accumulate_part(binop, seq, initial, is_first=False):
if initial == no_default:
res = list(accumulate(binop, seq))
else:
res = list(accumulate(binop, seq, initial=initial))
if is_first:
return (res, res[-1]) if res else ([], initial)
return res[1:], res[-1]
def partition(grouper, sequence, npartitions, p, nelements=2**20):
""" Partition a bag along a grouper, store partitions on disk. """
for block in partition_all(nelements, sequence):
d = groupby(grouper, block)
d2 = defaultdict(list)
for k, v in d.items():
d2[abs(hash(k)) % npartitions].extend(v)
p.append(d2, fsync=True)
return p
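The on-disk shuffle routes each element to a bucket by hashing its group key, which guarantees that all elements sharing a key land in the same bucket. A sketch of just the routing step, without the disk-backed store:

```python
from collections import defaultdict

def hash_partition(grouper, seq, npartitions):
    out = defaultdict(list)
    for x in seq:
        out[abs(hash(grouper(x))) % npartitions].append(x)
    return out

buckets = hash_partition(lambda x: x % 3, range(9), npartitions=2)
# every element is routed somewhere
assert sorted(v for vs in buckets.values() for v in vs) == list(range(9))
# all elements with the same key share a bucket
seen = {}
for b, vs in buckets.items():
    for v in vs:
        assert seen.setdefault(v % 3, b) == b
```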
def collect(grouper, group, p, barrier_token):
""" Collect partitions from disk and yield k,v group pairs. """
d = groupby(grouper, p.get(group, lock=False))
return list(d.items())
def from_sequence(seq, partition_size=None, npartitions=None):
""" Create a dask Bag from Python sequence.
This sequence should be relatively small in memory. Dask Bag works
best when it handles loading your data itself. Commonly we load a
sequence of filenames into a Bag and then use ``.map`` to open them.
Parameters
----------
seq: Iterable
A sequence of elements to put into the dask
partition_size: int (optional)
The length of each partition
npartitions: int (optional)
The number of desired partitions
It is best to provide either ``partition_size`` or ``npartitions``
(though not both).
Examples
--------
>>> b = from_sequence(['Alice', 'Bob', 'Chuck'], partition_size=2)
See Also
--------
read_text: Create bag from text files
"""
seq = list(seq)
if npartitions and not partition_size:
partition_size = int(math.ceil(len(seq) / npartitions))
if npartitions is None and partition_size is None:
if len(seq) < 100:
partition_size = 1
else:
partition_size = int(len(seq) / 100)
parts = list(partition_all(partition_size, seq))
name = 'from_sequence-' + tokenize(seq, partition_size)
d = dict(((name, i), list(part)) for i, part in enumerate(parts))
return Bag(d, name, len(d))
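The sizing logic above converts ``npartitions`` into a ``partition_size`` and then slices the sequence. A sketch of the same arithmetic, with ``partition_all`` re-implemented using stdlib slicing:

```python
import math

def partition_all(n, seq):
    seq = list(seq)
    return [seq[i:i + n] for i in range(0, len(seq), n)]

def split_sequence(seq, npartitions=None, partition_size=None):
    seq = list(seq)
    if npartitions and not partition_size:
        partition_size = int(math.ceil(len(seq) / npartitions))
    if npartitions is None and partition_size is None:
        partition_size = 1 if len(seq) < 100 else len(seq) // 100
    return partition_all(partition_size, seq)

parts = split_sequence(range(10), npartitions=3)
assert len(parts) == 3
assert sum(parts, []) == list(range(10))
```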
def from_url(urls):
"""Create a dask Bag from a url.
Examples
--------
>>> a = from_url('http://raw.githubusercontent.com/dask/dask/master/README.rst') # doctest: +SKIP
>>> a.npartitions # doctest: +SKIP
1
>>> a.take(8) # doctest: +SKIP
(b'Dask\\n',
b'====\\n',
b'\\n',
b'|Build Status| |Coverage| |Doc Status| |Gitter| |Version Status|\\n',
b'\\n',
b'Dask is a flexible parallel computing library for analytics. See\\n',
b'documentation_ for more information.\\n',
b'\\n')
>>> b = from_url(['http://github.com', 'http://google.com']) # doctest: +SKIP
>>> b.npartitions # doctest: +SKIP
2
"""
if isinstance(urls, str):
urls = [urls]
name = 'from_url-' + uuid.uuid4().hex
dsk = {}
for i, u in enumerate(urls):
dsk[(name, i)] = (list, (urlopen, u))
return Bag(dsk, name, len(urls))
def dictitems(d):
""" A pickleable version of dict.items
>>> dictitems({'x': 1})
[('x', 1)]
"""
return list(d.items())
def concat(bags):
""" Concatenate many bags together, unioning all elements.
>>> import dask.bag as db
>>> a = db.from_sequence([1, 2, 3])
>>> b = db.from_sequence([4, 5, 6])
>>> c = db.concat([a, b])
>>> list(c)
[1, 2, 3, 4, 5, 6]
"""
name = 'concat-' + tokenize(*bags)
counter = itertools.count(0)
dsk = {(name, next(counter)): key
for bag in bags for key in bag.__dask_keys__()}
return Bag(merge(dsk, *[b.dask for b in bags]), name, len(dsk))
def reify(seq):
if isinstance(seq, Iterator):
seq = list(seq)
if seq and isinstance(seq[0], Iterator):
seq = list(map(list, seq))
return seq
def from_delayed(values):
""" Create bag from many dask Delayed objects.
These objects will become the partitions of the resulting Bag. They should
evaluate to a ``list`` or some other concrete sequence.
Parameters
----------
values: list of delayed values
An iterable of dask Delayed objects. Each evaluating to a list.
Returns
-------
Bag
Examples
--------
>>> x, y, z = [delayed(load_sequence_from_file)(fn)
... for fn in filenames] # doctest: +SKIP
>>> b = from_delayed([x, y, z]) # doctest: +SKIP
See also
--------
dask.delayed
"""
from dask.delayed import Delayed, delayed
if isinstance(values, Delayed):
values = [values]
values = [delayed(v)
if not isinstance(v, Delayed) and hasattr(v, 'key')
else v
for v in values]
dsk = merge(ensure_dict(v.dask) for v in values)
name = 'bag-from-delayed-' + tokenize(*values)
names = [(name, i) for i in range(len(values))]
values = [(reify, v.key) for v in values]
dsk2 = dict(zip(names, values))
return Bag(merge(dsk, dsk2), name, len(values))
def merge_distinct(seqs):
return set().union(*seqs)
def merge_frequencies(seqs):
if isinstance(seqs, Iterable):
seqs = list(seqs)
if not seqs:
return {}
first, rest = seqs[0], seqs[1:]
if not rest:
return first
out = defaultdict(int)
out.update(first)
for d in rest:
for k, v in iteritems(d):
out[k] += v
return out
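``merge_frequencies`` is essentially a key-wise sum of per-partition count dicts; a simplified standalone version (without the single-partition short-circuit above) using only the stdlib:

```python
from collections import defaultdict

def merge_counts(seqs):
    # sum per-partition frequency dicts into one
    out = defaultdict(int)
    for d in seqs:
        for k, v in d.items():
            out[k] += v
    return dict(out)

parts = [{'Alice': 1, 'Bob': 1}, {'Alice': 1}]
merged = merge_counts(parts)
assert merged == {'Alice': 2, 'Bob': 1}
```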
def bag_range(n, npartitions):
""" Numbers from zero to n
Examples
--------
>>> import dask.bag as db
>>> b = db.range(5, npartitions=2)
>>> list(b)
[0, 1, 2, 3, 4]
"""
size = n // npartitions
name = 'range-%d-npartitions-%d' % (n, npartitions)
ijs = list(enumerate(take(npartitions, range(0, n, size))))
dsk = dict(((name, i), (reify, (range, j, min(j + size, n))))
for i, j in ijs)
if n % npartitions != 0:
i, j = ijs[-1]
dsk[(name, i)] = (reify, (range, j, n))
return Bag(dsk, name, npartitions)
def bag_zip(*bags):
""" Partition-wise bag zip
All passed bags must have the same number of partitions.
NOTE: corresponding partitions should have the same length; if they do not,
the "extra" elements from the longer partition(s) will be dropped. If you
have this case chances are that what you really need is a data alignment
mechanism like pandas's, and not a missing value filler like zip_longest.
Examples
--------
Correct usage:
>>> import dask.bag as db
>>> evens = db.from_sequence(range(0, 10, 2), partition_size=4)
>>> odds = db.from_sequence(range(1, 10, 2), partition_size=4)
>>> pairs = db.zip(evens, odds)
>>> list(pairs)
[(0, 1), (2, 3), (4, 5), (6, 7), (8, 9)]
Incorrect usage:
>>> numbers = db.range(20) # doctest: +SKIP
>>> fizz = numbers.filter(lambda n: n % 3 == 0) # doctest: +SKIP
>>> buzz = numbers.filter(lambda n: n % 5 == 0) # doctest: +SKIP
>>> fizzbuzz = db.zip(fizz, buzz) # doctest: +SKIP
>>> list(fizzbuzz) # doctest: +SKIP
[(0, 0), (3, 5), (6, 10), (9, 15), (12, 20), (15, 25), (18, 30)]
When what you really wanted was more along the lines of the following:
>>> list(fizzbuzz) # doctest: +SKIP
[(0, 0), (3, None), (None, 5), (6, None), (None, 10), (9, None),
(12, None), (15, 15), (18, None), (None, 20), (None, 25), (None, 30)]
"""
npartitions = bags[0].npartitions
assert all(bag.npartitions == npartitions for bag in bags)
# TODO: do more checks
name = 'zip-' + tokenize(*bags)
dsk = dict(
((name, i), (reify, (zip,) + tuple((bag.name, i) for bag in bags)))
for i in range(npartitions))
bags_dsk = merge(*(bag.dask for bag in bags))
return Bag(merge(bags_dsk, dsk), name, npartitions)
def map_chunk(f, args, bag_kwargs, kwargs):
if kwargs:
f = partial(f, **kwargs)
args = [iter(a) for a in args]
iters = list(args)
if bag_kwargs:
keys = list(bag_kwargs)
kw_val_iters = [iter(v) for v in bag_kwargs.values()]
iters.extend(kw_val_iters)
kw_iter = (dict(zip(keys, k)) for k in zip(*kw_val_iters))
if args:
for a, k in zip(zip(*args), kw_iter):
yield f(*a, **k)
else:
for k in kw_iter:
yield f(**k)
else:
for a in zip(*args):
yield f(*a)
# Check that all iterators are fully exhausted
if len(iters) > 1:
for i in iters:
if isinstance(i, itertools.repeat):
continue
try:
next(i)
except StopIteration:
pass
else:
msg = ("map called with multiple bags that aren't identically "
"partitioned. Please ensure that all bag arguments "
"have the same partition lengths")
raise ValueError(msg)
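The exhaustion check at the end of ``map_chunk`` is what turns mismatched partition lengths into a clear error instead of silent truncation, since ``zip`` stops at the shortest input. A minimal sketch of the same idea:

```python
def strict_zip_map(f, *seqs):
    iters = [iter(s) for s in seqs]
    out = [f(*args) for args in zip(*iters)]
    # zip stops at the shortest input; leftover items in a later iterator
    # mean the inputs were not identically partitioned.  Note that zip may
    # consume one extra element from *earlier* iterators, so this check
    # (like the one in map_chunk) depends on argument order.
    for it in iters:
        try:
            next(it)
        except StopIteration:
            pass
        else:
            raise ValueError("inputs aren't identically partitioned")
    return out

assert strict_zip_map(lambda a, b: a + b, [1, 2], [10, 20]) == [11, 22]
try:
    strict_zip_map(lambda a, b: a + b, [1, 2], [10, 20, 30])
    raised = False
except ValueError:
    raised = True
assert raised
```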
def starmap_chunk(f, x, kwargs):
if kwargs:
f = partial(f, **kwargs)
return itertools.starmap(f, x)
def unpack_scalar_dask_kwargs(kwargs):
"""Extracts dask values from kwargs.
Currently only ``dask.bag.Item`` and ``dask.delayed.Delayed`` are
supported. Returns a merged dask graph and a task resulting in a keyword
dict.
"""
dsk = {}
kwargs2 = {}
for k, v in kwargs.items():
if isinstance(v, (Delayed, Item)):
dsk.update(ensure_dict(v.dask))
kwargs2[k] = v.key
elif is_dask_collection(v):
raise NotImplementedError("dask.bag doesn't support kwargs of "
"type %s" % type(v).__name__)
else:
kwargs2[k] = v
if dsk:
kwargs = (dict, (zip, list(kwargs2), list(kwargs2.values())))
return dsk, kwargs
def bag_map(func, *args, **kwargs):
"""Apply a function elementwise across one or more bags.
Note that all ``Bag`` arguments must be partitioned identically.
Parameters
----------
func : callable
*args, **kwargs : Bag, Item, Delayed, or object
Arguments and keyword arguments to pass to ``func``. Non-Bag args/kwargs
are broadcasted across all calls to ``func``.
Notes
-----
For calls with multiple `Bag` arguments, corresponding partitions should
have the same length; if they do not, the call will error at compute time.
Examples
--------
>>> import dask.bag as db
>>> b = db.from_sequence(range(5), npartitions=2)
>>> b2 = db.from_sequence(range(5, 10), npartitions=2)
Apply a function to all elements in a bag:
>>> db.map(lambda x: x + 1, b).compute()
[1, 2, 3, 4, 5]
Apply a function with arguments from multiple bags:
>>> from operator import add
>>> db.map(add, b, b2).compute()
[5, 7, 9, 11, 13]
Non-bag arguments are broadcast across all calls to the mapped function:
>>> db.map(add, b, 1).compute()
[1, 2, 3, 4, 5]
Keyword arguments are also supported, and have the same semantics as
regular arguments:
>>> def myadd(x, y=0):
... return x + y
>>> db.map(myadd, b, y=b2).compute()
[5, 7, 9, 11, 13]
>>> db.map(myadd, b, y=1).compute()
[1, 2, 3, 4, 5]
Both arguments and keyword arguments can also be instances of
``dask.bag.Item`` or ``dask.delayed.Delayed``. Here we'll add the max value
in the bag to each element:
>>> db.map(myadd, b, b.max()).compute()
[4, 5, 6, 7, 8]
"""
name = 'map-%s-%s' % (funcname(func), tokenize(func, args, kwargs))
dsk = {}
bags = []
args2 = []
for a in args:
if isinstance(a, Bag):
bags.append(a)
args2.append(a)
dsk.update(a.dask)
elif isinstance(a, (Item, Delayed)):
args2.append((itertools.repeat, a.key))
dsk.update(ensure_dict(a.dask))
else:
args2.append((itertools.repeat, a))
bag_kwargs = {}
other_kwargs = {}
for k, v in kwargs.items():
if isinstance(v, Bag):
bag_kwargs[k] = v
bags.append(v)
dsk.update(v.dask)
else:
other_kwargs[k] = v
kw_dsk, other_kwargs = unpack_scalar_dask_kwargs(other_kwargs)
dsk.update(kw_dsk)
if not bags:
raise ValueError("At least one argument must be a Bag.")
npartitions = {b.npartitions for b in bags}
if len(npartitions) > 1:
raise ValueError("All bags must have the same number of partitions.")
npartitions = npartitions.pop()
def build_args(n):
return [(a.name, n) if isinstance(a, Bag) else a for a in args2]
def build_bag_kwargs(n):
if not bag_kwargs:
return None
return (dict, (zip, list(bag_kwargs),
[(b.name, n) for b in bag_kwargs.values()]))
dsk.update({(name, n): (reify, (map_chunk, func, build_args(n),
build_bag_kwargs(n), other_kwargs))
for n in range(npartitions)})
# If all bags are the same type, use that type, otherwise fallback to Bag
return_type = set(map(type, bags))
return_type = return_type.pop() if len(return_type) == 1 else Bag
return return_type(dsk, name, npartitions)
def map_partitions(func, *args, **kwargs):
"""Apply a function to every partition across one or more bags.
Note that all ``Bag`` arguments must be partitioned identically.
Parameters
----------
func : callable
*args, **kwargs : Bag, Item, Delayed, or object
Arguments and keyword arguments to pass to ``func``.
Examples
--------
>>> import dask.bag as db
>>> b = db.from_sequence(range(1, 101), npartitions=10)
>>> def div(nums, den=1):
... return [num / den for num in nums]
Using a python object:
>>> hi = b.max().compute()
>>> hi
100
>>> b.map_partitions(div, den=hi).take(5)
(0.01, 0.02, 0.03, 0.04, 0.05)
Using an ``Item``:
>>> b.map_partitions(div, den=b.max()).take(5)
(0.01, 0.02, 0.03, 0.04, 0.05)
Note that while both versions give the same output, the second forms a
single graph, and then computes everything at once, and in some cases
may be more efficient.
"""
name = 'map-partitions-%s-%s' % (funcname(func),
tokenize(func, args, kwargs))
# Extract bag arguments, build initial graph
bags = []
dsk = {}
for vals in [args, kwargs.values()]:
for a in vals:
if isinstance(a, (Bag, Item, Delayed)):
dsk.update(ensure_dict(a.dask))
if isinstance(a, Bag):
bags.append(a)
elif is_dask_collection(a):
raise NotImplementedError("dask.bag doesn't support args of "
"type %s" % type(a).__name__)
if not bags:
raise ValueError("At least one argument must be a Bag.")
npartitions = {b.npartitions for b in bags}
if len(npartitions) > 1:
raise ValueError("All bags must have the same number of partitions.")
npartitions = npartitions.pop()
def build_task(n):
args2 = [(a.name, n) if isinstance(a, Bag) else a.key
if isinstance(a, (Item, Delayed)) else a for a in args]
if any(isinstance(v, (Bag, Item, Delayed)) for v in kwargs.values()):
vals = [(v.name, n) if isinstance(v, Bag) else v.key
if isinstance(v, (Item, Delayed)) else v
for v in kwargs.values()]
kwargs2 = (dict, (zip, list(kwargs), vals))
else:
kwargs2 = kwargs
if kwargs2 or len(args2) > 1:
return (apply, func, args2, kwargs2)
return (func, args2[0])
dsk.update({(name, n): build_task(n) for n in range(npartitions)})
# If all bags are the same type, use that type, otherwise fallback to Bag
return_type = set(map(type, bags))
return_type = return_type.pop() if len(return_type) == 1 else Bag
return return_type(dsk, name, npartitions)
def _reduce(binop, sequence, initial=no_default):
if initial is not no_default:
return reduce(binop, sequence, initial)
else:
return reduce(binop, sequence)
def make_group(k, stage):
def h(x):
return x[0] // k ** stage % k
return h
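`make_group` routes each `(hash, element)` pair by extracting one base-`k` digit of the hash per shuffle stage; a standalone sketch of the digit arithmetic:

```python
def stage_digit(n, stage, k):
    # the stage-th digit of n written in base k, least significant first
    return n // k ** stage % k

# 27 in base 4 is 1*16 + 2*4 + 3, so its digits (low to high) are 3, 2, 1
digits = [stage_digit(27, s, 4) for s in range(3)]
# → [3, 2, 1]
```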
def groupby_tasks(b, grouper, hash=hash, max_branch=32):
max_branch = max_branch or 32
n = b.npartitions
stages = int(math.ceil(math.log(n) / math.log(max_branch)))
if stages > 1:
k = int(math.ceil(n ** (1 / stages)))
else:
k = n
groups = []
splits = []
joins = []
inputs = [tuple(digit(i, j, k) for j in range(stages))
for i in range(k**stages)]
b2 = b.map(lambda x: (hash(grouper(x)), x))
token = tokenize(b, grouper, hash, max_branch)
start = dict((('shuffle-join-' + token, 0, inp),
(b2.name, i) if i < b.npartitions else [])
for i, inp in enumerate(inputs))
for stage in range(1, stages + 1):
group = dict((('shuffle-group-' + token, stage, inp),
(groupby,
(make_group, k, stage - 1),
('shuffle-join-' + token, stage - 1, inp)))
for inp in inputs)
split = dict((('shuffle-split-' + token, stage, i, inp),
(dict.get, ('shuffle-group-' + token, stage, inp), i, {}))
for i in range(k)
for inp in inputs)
join = dict((('shuffle-join-' + token, stage, inp),
(list, (toolz.concat, [('shuffle-split-' + token, stage, inp[stage - 1],
insert(inp, stage - 1, j)) for j in range(k)])))
for inp in inputs)
groups.append(group)
splits.append(split)
joins.append(join)
end = dict((('shuffle-' + token, i),
(list, (dict.items, (groupby, grouper, (pluck, 1, j)))))
for i, j in enumerate(join))
dsk = merge(b2.dask, start, end, *(groups + splits + joins))
return type(b)(dsk, 'shuffle-' + token, len(inputs))
def groupby_disk(b, grouper, npartitions=None, blocksize=2**20):
if npartitions is None:
npartitions = b.npartitions
token = tokenize(b, grouper, npartitions, blocksize)
import partd
p = ('partd-' + token,)
dirname = config.get('temporary_directory', None)
if dirname:
file = (apply, partd.File, (), {'dir': dirname})
else:
file = (partd.File,)
try:
dsk1 = {p: (partd.Python, (partd.Snappy, file))}
except AttributeError:
dsk1 = {p: (partd.Python, file)}
# Partition data on disk
name = 'groupby-part-{0}-{1}'.format(funcname(grouper), token)
dsk2 = dict(((name, i), (partition, grouper, (b.name, i),
npartitions, p, blocksize))
for i in range(b.npartitions))
# Barrier
barrier_token = 'groupby-barrier-' + token
def barrier(args):
return 0
dsk3 = {barrier_token: (barrier, list(dsk2))}
# Collect groups
name = 'groupby-collect-' + token
dsk4 = dict(((name, i),
(collect, grouper, i, p, barrier_token))
for i in range(npartitions))
return type(b)(merge(b.dask, dsk1, dsk2, dsk3, dsk4), name, npartitions)
def empty_safe_apply(func, part, is_last):
if isinstance(part, Iterator):
try:
_, part = peek(part)
except StopIteration:
if not is_last:
return no_result
return func(part)
elif not is_last and len(part) == 0:
return no_result
else:
return func(part)
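`empty_safe_apply` uses `peek` (from toolz) to test an iterator for emptiness without losing its first element; the idea can be sketched with `itertools.chain`:

```python
import itertools

def peek(it):
    # Consume one element, then rebuild an iterator that yields it again
    # followed by the rest; raises StopIteration if `it` is empty.
    first = next(it)
    return first, itertools.chain([first], it)

it = iter([1, 2, 3])
first, it = peek(it)
# first is 1, and `it` still yields all three elements
```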
def empty_safe_aggregate(func, parts, is_last):
parts2 = (p for p in parts if p is not no_result)
return empty_safe_apply(func, parts2, is_last)
def safe_take(n, b, warn=True):
r = list(take(n, b))
if len(r) != n and warn:
warnings.warn("Insufficient elements for `take`. {0} elements "
"requested, only {1} elements available. Try passing "
"larger `npartitions` to `take`.".format(n, len(r)))
return r
def random_sample(x, state_data, prob):
"""Filter elements of `x` by a probability `prob`.
Parameters
----------
x : iterable
state_data : tuple
A tuple that can be passed to ``random.Random``.
prob : float
A float between 0 and 1, representing the probability that each
element will be yielded.
"""
random_state = Random(state_data)
for i in x:
if random_state.random() < prob:
yield i
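Because `random_sample` seeds its own `Random` from `state_data`, the same state always selects the same elements, which keeps sampling deterministic across re-computations. For example (function restated so the example is self-contained):

```python
from random import Random

def random_sample(x, state_data, prob):
    # keep each element of x with probability `prob`
    random_state = Random(state_data)
    for i in x:
        if random_state.random() < prob:
            yield i

a = list(random_sample(range(100), 42, 0.3))
b = list(random_sample(range(100), 42, 0.3))
# identical state_data gives identical samples
```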
def random_state_data_python(n, random_state=None):
"""Return a list of tuples that can initialize.
``random.Random``.
Parameters
----------
n : int
Number of tuples to return.
random_state : int or ``random.Random``, optional
If an int, is used to seed a new ``random.Random``.
"""
if not isinstance(random_state, Random):
random_state = Random(random_state)
maxuint32 = 1 << 32
return [tuple(random_state.randint(0, maxuint32) for i in range(624))
for i in range(n)]
def split(seq, n):
""" Split apart a sequence into n equal pieces.
>>> split(range(10), 3)
[[0, 1, 2], [3, 4, 5], [6, 7, 8, 9]]
"""
if not isinstance(seq, (list, tuple)):
seq = list(seq)
part = len(seq) / n
L = [seq[int(part * i): int(part * (i + 1))] for i in range(n - 1)]
L.append(seq[int(part * (n - 1)):])
return L
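When `len(seq)` is not divisible by `n`, the `int(part * i)` slicing spreads the remainder so the piece sizes differ by at most one. For example (restated so the example runs standalone):

```python
def split(seq, n):
    # split seq into n near-equal contiguous pieces
    if not isinstance(seq, (list, tuple)):
        seq = list(seq)
    part = len(seq) / n
    L = [seq[int(part * i): int(part * (i + 1))] for i in range(n - 1)]
    L.append(seq[int(part * (n - 1)):])
    return L

pieces = split(range(7), 3)
# → [[0, 1], [2, 3], [4, 5, 6]]
```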
def to_dataframe(seq, columns, dtypes):
import pandas as pd
seq = reify(seq)
# pd.DataFrame expects lists, only copy if necessary
if not isinstance(seq, list):
seq = list(seq)
res = pd.DataFrame(seq, columns=list(columns))
return res.astype(dtypes, copy=False)
| gpl-3.0 |
juharris/tensorflow | tensorflow/examples/skflow/out_of_core_data_classification.py | 9 | 2462 | # Copyright 2016 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Example of loading karge data sets into out-of-core dataframe."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from sklearn import cross_validation
from sklearn import datasets
from sklearn import metrics
# pylint: disable=g-bad-import-order
import dask.dataframe as dd
import pandas as pd
from tensorflow.contrib import learn
# pylint: enable=g-bad-import-order
# Sometimes, when your dataset is too large to hold in memory,
# you may want to load it into an out-of-core dataframe as provided by the
# dask library, first drawing sample batches and then loading them into memory for training.
# Load dataset.
iris = datasets.load_iris()
x_train, x_test, y_train, y_test = cross_validation.train_test_split(
iris.data, iris.target, test_size=0.2, random_state=42)
# Note that we use iris here just for demo purposes.
# You can load your own large dataset into an out-of-core dataframe
# using dask's methods, e.g. read_csv() in dask.
# For details, see: http://dask.pydata.org/en/latest/dataframe.html
# We firstly load them into pandas dataframe and then convert into dask
# dataframe.
x_train, y_train, x_test, y_test = [
pd.DataFrame(data) for data in [x_train, y_train, x_test, y_test]]
x_train, y_train, x_test, y_test = [
dd.from_pandas(data, npartitions=2)
for data in [x_train, y_train, x_test, y_test]]
# Initialize a TensorFlow linear classifier
classifier = learn.LinearClassifier(
feature_columns=learn.infer_real_valued_columns_from_input(x_train),
n_classes=3)
# Fit the model using training set.
classifier.fit(x_train, y_train, steps=200)
# Make predictions on each partitions of testing data
predictions = x_test.map_partitions(classifier.predict).compute()
# Calculate accuracy
score = metrics.accuracy_score(y_test.compute(), predictions)
| apache-2.0 |
giulioungaretti/qt_dot_fitter | stat.py | 1 | 1164 | import numpy as np
import os
import matplotlib.pyplot as plt
def sigmaz(results, sigma=2):
return (results[abs(results-results.mean())/results.std() < sigma])
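`sigmaz` keeps only the values within `sigma` standard deviations of the mean, which is how the script rejects outlier radii before averaging. A quick check (restated so the example is self-contained):

```python
import numpy as np

def sigmaz(results, sigma=2):
    # keep values within `sigma` standard deviations of the mean
    return results[abs(results - results.mean()) / results.std() < sigma]

data = np.array([10.0] * 9 + [100.0])
kept = sigmaz(data, 2)
# the outlier 100.0 is rejected; the nine 10.0 values remain
```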
def do(folder):
all = os.listdir(folder)
files = [i for i in all if 'csv' in i]
print files
tmp = []
for file in files:
try:
array = np.loadtxt(str(folder)+'/'+str(file))
tmp.append(array)
except:
continue
res = np.hstack(tmp)
res = sigmaz(res, 2)
print res.mean()
print res.std()
return res.mean(), res.std()/np.sqrt(len(res))
def get_folders():
folders = []
for i, j, k in os.walk('./'):
if './.' not in i and len(i) > 2:
folders.append(i)
return folders
def plot_results(folders):
fig, ax = plt.subplots()
for folder in folders:
ax.errorbar(int(folder[2:]), do(folder)[0],
yerr=do(folder)[1], fmt='o-')
ax.set_title(os.getcwd().split('/')[-1])
ax.set_xlabel('dose')
ax.set_ylabel('radius (nm)')
fig.show()
raw_input('close..')
if __name__ == "__main__":
folders = get_folders()
plot_results(folders)
| mit |
subdir/yndx-astana-demo-bot | yndx_astana_demo_bot/voice_gender.py | 1 | 3166 | #train_models.py
import os
from glob import glob
import cPickle
import numpy as np
from scipy.io.wavfile import read
from sklearn.mixture import GMM
from sklearn import preprocessing
import python_speech_features as mfcc
import warnings
warnings.filterwarnings('ignore', 'Class GMM is deprecated', DeprecationWarning)
warnings.filterwarnings('ignore', 'Function distribute_covar_matrix_to_match_covariance_type is deprecated', DeprecationWarning)
warnings.filterwarnings('ignore', 'Function log_multivariate_normal_density is deprecated', DeprecationWarning)
VOICES_DIR = 'voices'
def trained_models_exist():
return os.path.exists("male.gmm") and os.path.exists("female.gmm")
def is_male(f):
models = []
with open("male.gmm") as model:
models.append( cPickle.load(model) )
with open("female.gmm") as model:
models.append( cPickle.load(model) )
sr, audio = read(f)
features = get_mfcc(sr,audio)
scores = None
log_likelihood = np.zeros(len(models))
for i, gmm in enumerate(models):
scores = np.array(gmm.score(features))
log_likelihood[i] = scores.sum()
return np.argmax(log_likelihood) == 0
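`is_male` scores the MFCC features under both trained GMMs and picks the model with the larger summed log-likelihood (index 0 being the male model). A minimal sketch of that decision rule, using stand-in scoring functions rather than real GMMs:

```python
import numpy as np

def classify(features, models):
    # total log-likelihood of the features under each model;
    # the highest-scoring model wins
    log_likelihood = np.array([m(features).sum() for m in models])
    return int(np.argmax(log_likelihood))

# hypothetical scorers: model 0 prefers values near 0, model 1 near 1
model0 = lambda f: -np.abs(f - 0.0)
model1 = lambda f: -np.abs(f - 1.0)
label = classify(np.array([0.1, 0.2, 0.0]), [model0, model1])
# → 0 (model 0 fits these features better)
```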
def get_mfcc(sr, audio):
features = get_features(sr, audio)
feat = np.asarray(())
for i in range(features.shape[0]):
temp = features[i,:]
if np.isnan(np.min(temp)):
continue
else:
if feat.size == 0:
feat = temp
else:
feat = np.vstack((feat, temp))
features = feat
features = preprocessing.scale(features)
return features
def get_features(sr, audio):
return mfcc.mfcc(audio, sr, 0.025, 0.01, 13, appendEnergy=False, nfft=2048)
def save_wav(ogg, wav):
os.system("ffmpeg -i {} -c:a pcm_f32le {}".format(ogg, wav))
def add_new_female_voice(fname):
if not os.path.exists(VOICES_DIR):
os.makedirs(VOICES_DIR)
save_wav(fname, VOICES_DIR + "/female-{}.wav".format(os.path.basename(fname)))
def add_new_male_voice(fname):
if not os.path.exists(VOICES_DIR):
os.makedirs(VOICES_DIR)
save_wav(fname, VOICES_DIR + "/male-{}.wav".format(os.path.basename(fname)))
def learn(files):
features = np.asarray(());
for f in files:
sr,audio = read(f)
vector = preprocessing.scale(get_features(sr, audio))
if features.size == 0:
features = vector
else:
features = np.vstack((features, vector))
gmm = GMM(n_components=8, n_iter=200, covariance_type='diag', n_init=3)
gmm.fit(features)
return gmm
def male_voices():
return glob(VOICES_DIR + "/male-*.wav")
def female_voices():
return glob(VOICES_DIR + "/female-*.wav")
def refresh_gmm_models():
male_voices_files = male_voices()
if male_voices_files:
with open("male.gmm", 'w') as model:
cPickle.dump(learn(male_voices_files), model)
female_voices_files = female_voices()
if female_voices_files:
with open("female.gmm", 'w') as model:
cPickle.dump(learn(female_voices_files), model)
if __name__ == '__main__':
refresh_gmm_models()
| unlicense |
zorojean/tushare | tushare/datayes/trading.py | 14 | 4741 | #!/usr/bin/env python
# -*- coding:utf-8 -*-
"""
Created on 2015-07-04
@author: JimmyLiu
@QQ:52799046
"""
from tushare.datayes import vars as vs
import pandas as pd
from pandas.compat import StringIO
class Trading():
def __init__(self, client):
self.client = client
def dy_market_tickRT(self, securityID='000001.XSHG,000001.XSHE', field=vs.TICK_RT_DEFAULT_COLS):
"""
Get the latest market snapshot.
Retrieves the latest Level-1 quote data for one or more securities.
Pass one or more security codes, e.g. 000001.XSHG (SSE Composite Index)
or 000001.XSHE (Ping An Bank), plus the desired fields, to get the
latest trading snapshot. Securities can be stocks, indices, some bonds, or funds.
getTickRTSnapshot
"""
code, result = self.client.getData(vs.TICK_RT%(securityID, field))
return _ret_data(code, result)
def dy_market_tickRtIndex(self, securityID='', field=''):
"""
Get the latest market snapshot for an index's constituent stocks.
Retrieves the latest Level-1 quote data for the constituents of an index.
Pass an index code, e.g. 000001.XSHG (SSE Composite Index) or
000300.XSHG (CSI 300), plus the desired fields, to get the latest
trading snapshot of the index constituents.
getTickRTSnapshotIndex
"""
code, result = self.client.getData(vs.TICK_RT_INDEX%(securityID, field))
return _ret_data(code, result)
def dy_market_industry_rt(self, securityID='', field=''):
"""
Get capital flows by industry (CSRC industry classification).
Includes small-order, medium-order, large-order, and extra-large-order
turnover, as well as the total turnover of the current trades.
getIndustryTickRTSnapshot
"""
code, result = self.client.getData(vs.INDUSTRY_TICK_RT%(securityID, field))
return _ret_data(code, result)
def dy_market_future_rt(self, instrumentID='', field=''):
"""
Get the latest market snapshot for one or more futures contracts.
getFutureTickRTSnapshot
"""
code, result = self.client.getData(vs.FUTURE_TICK_RT%(instrumentID, field))
return _ret_data(code, result)
def dy_market_equ_rtrank(self, exchangeCD='', pagesize='',
pagenum='', desc='', field=''):
"""
Get the intraday gain/loss ranking of Shanghai and Shenzhen stocks.
getEquRTRank
"""
code, result = self.client.getData(vs.EQU_RT_RANK%(exchangeCD, pagesize,
pagenum, desc, field))
return _ret_data(code, result)
def dy_market_option_rt(self, optionId='', field=''):
"""
Get the latest market snapshot for options.
getOptionTickRTSnapshot
"""
code, result = self.client.getData(vs.OPTION_RT%(optionId, field))
return _ret_data(code, result)
def dy_market_sectips(self, tipsTypeCD='H', field=''):
"""
Today's list of suspended and resumed stocks on the Shanghai and Shenzhen stock exchanges. Updated daily.
getSecTips
"""
code, result = self.client.getData(vs.SEC_TIPS%(tipsTypeCD, field))
return _ret_data(code, result)
def dy_market_tickrt_intraday(self, securityID='000001.XSHE', startTime='',
endTime='', field=''):
"""
Get intraday Level-1 data for a stock, index, bond, or fund over a
time window within the current trading day.
Corresponds to: getTickRTIntraDay
"""
code, result = self.client.getData(vs.TICK_RT_INTRADAY%(securityID, startTime,
endTime, field))
return _ret_data(code, result)
def dy_market_bar_rt(self, securityID='000001.XSHE', startTime='',
endTime='', unit='1', field=''):
"""
Get the current day's minute bars for a security.
Pass a security code, e.g. 000001.XSHE (Ping An Bank), to get its
minute bars for the current day. Supported securities are stocks,
indices, funds, and some bonds. Valid minute-bar data covers
09:30-11:30 in the morning and 13:01-15:00 in the afternoon.
Corresponds to: getBarRTIntraDay
"""
code, result = self.client.getData(vs.TICK_RT_INTRADAY%(securityID, startTime,
endTime, field))
return _ret_data(code, result)
def _ret_data(code, result):
if code==200:
result = result.decode('utf-8') if vs.PY3 else result
df = pd.read_csv(StringIO(result))
return df
else:
print(result)
return None
| bsd-3-clause |
izu-mi/py-tensor | utils/lstm.py | 1 | 5171 | """ LSTM Module for stock prediction algorithm """
import time
import warnings
from six.moves import xrange
import numpy as np
from numpy import newaxis
import pandas as pd
from keras.layers.core import Dense, Activation, Dropout
from keras.layers.recurrent import LSTM
from keras.models import Sequential
import matplotlib.pyplot as plt
warnings.filterwarnings("ignore")
def plot2(x, y):
plt.plot(x, color='red', label='prediction')
plt.plot(y, color='blue', label='y_test')
plt.legend(loc='upper left')
plt.savefig('stock_predicton.png', bbox_inches='tight')
plt.close()
#get stock data from url
def get_stock_data(stock_name, normalized=0):
#url = "http://www.google.com/finance/historical?q=" + stock_name + "&startdate=Jul+12%2C+2013&enddate=Jul+11%2C+2017&num=30&ei=rCtlWZGSFN3KsQHwrqWQCw&output=csv"
url="http://www.google.com/finance/historical?q=%s&ei=u-lHWfGPNNWIsgHHqIqICw&output=csv" % stock_name
col_names = ['Date', 'Open', 'High', 'Low', 'Close', 'Volume']
stocks = pd.read_csv(url, header=0, names=col_names)
df = pd.DataFrame(stocks)
df.drop(df.columns[[0, 3, 5]], axis=1, inplace=True)
return df
def plot_results_multiple(predicted_data, true_data, prediction_len):
fig = plt.figure(facecolor='white')
ax = fig.add_subplot(111)
ax.plot(true_data, label='True Data')
# Pad the list of predictions to shift it in the graph to it's correct start
for i, data in enumerate(predicted_data):
padding = [None for p in xrange(i * prediction_len)]
plt.plot(padding + data, label='Prediction')
plt.legend()
plt.savefig('stock_predicton.png', bbox_inches='tight')
plt.close(fig)
def load_data(stock, seq_len):
amount_of_features = len(stock.columns)
data = stock.as_matrix() #pd.DataFrame(stock)
sequence_length = seq_len + 1
result = []
for index in range(len(data) - sequence_length):
result.append(data[index: index + sequence_length])
result = np.array(result)
row = round(0.9 * result.shape[0])
train = result[:int(row), :]
x_train = train[:, :-1]
y_train = train[:, -1][:,-1]
x_test = result[int(row):, :-1]
y_test = result[int(row):, -1][:, -1]
x_train = np.reshape(x_train, (x_train.shape[0], x_train.shape[1], amount_of_features))
x_test = np.reshape(x_test, (x_test.shape[0], x_test.shape[1], amount_of_features))
return [x_train, y_train, x_test, y_test]
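`load_data` above builds overlapping windows of `seq_len + 1` rows, the first `seq_len` serving as inputs and the final close as the target. The windowing step in isolation:

```python
import numpy as np

def make_windows(data, seq_len):
    # each window holds seq_len input rows plus one target row
    sequence_length = seq_len + 1
    return np.array([data[i: i + sequence_length]
                     for i in range(len(data) - sequence_length)])

windows = make_windows(np.arange(10), 3)
# windows.shape == (6, 4); the first window is [0, 1, 2, 3]
```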
def normalise_windows(window_data):
normalised_data = []
for window in window_data:
normalised_window = [((float(p) / float(window[0])) - 1)
for p in window]
normalised_data.append(normalised_window)
return normalised_data
def build_model(layers):
model = Sequential()
model.add(LSTM(
input_dim=layers[0],
output_dim=layers[1],
return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(
layers[2],
return_sequences=False))
model.add(Dropout(0.2))
model.add(Dense(
output_dim=layers[2]))
model.add(Activation("linear"))
start = time.time()
model.compile(loss="mse", optimizer="rmsprop",metrics=['accuracy'])
print("Compilation Time : ", time.time() - start)
return model
def build_model2(layers):
d = 0.2
model = Sequential()
model.add(LSTM(128, input_shape=(layers[1], layers[0]), return_sequences=True))
model.add(Dropout(d))
model.add(LSTM(64, input_shape=(layers[1], layers[0]), return_sequences=False))
model.add(Dropout(d))
model.add(Dense(16,init='uniform',activation='relu'))
model.add(Dense(1,init='uniform',activation='relu'))
model.compile(loss='mse',optimizer='adam',metrics=['accuracy'])
return model
def predict_point_by_point(model, data):
# Predict each timestep given the last sequence of true data, in effect only predicting 1 step ahead each time
predicted = model.predict(data)
predicted = np.reshape(predicted, (predicted.size,))
return predicted
def predict_sequence_full(model, data, window_size):
# Shift the window by 1 new prediction each time, re-run predictions on new window
curr_frame = data[0]
predicted = []
for i in range(len(data)):
predicted.append(model.predict(curr_frame[newaxis, :, :])[0, 0])
curr_frame = curr_frame[1:]
curr_frame = np.insert(
curr_frame, [window_size - 1], predicted[-1], axis=0)
return predicted
def predict_sequences_multiple(model, data, window_size, prediction_len):
# Predict sequence of 50 steps before shifting prediction run forward by 50 steps
prediction_seqs = []
for i in range(len(data) // prediction_len):
curr_frame = data[i * prediction_len]
predicted = []
for j in range(prediction_len):
predicted.append(model.predict(curr_frame[newaxis, :, :])[0, 0])
curr_frame = curr_frame[1:]
curr_frame = np.insert(
curr_frame, [window_size - 1], predicted[-1], axis=0)
prediction_seqs.append(predicted)
return prediction_seqs
| mit |
linebp/pandas | pandas/core/internals.py | 1 | 178177 | import copy
import itertools
import re
import operator
from datetime import datetime, timedelta, date
from collections import defaultdict
import numpy as np
from pandas.core.base import PandasObject
from pandas.core.dtypes.dtypes import (
ExtensionDtype, DatetimeTZDtype,
CategoricalDtype)
from pandas.core.dtypes.common import (
_TD_DTYPE, _NS_DTYPE,
_ensure_int64, _ensure_platform_int,
is_integer,
is_dtype_equal,
is_timedelta64_dtype,
is_datetime64_dtype, is_datetimetz, is_sparse,
is_categorical, is_categorical_dtype,
is_integer_dtype,
is_datetime64tz_dtype,
is_object_dtype,
is_datetimelike_v_numeric,
is_float_dtype, is_numeric_dtype,
is_numeric_v_string_like, is_extension_type,
is_list_like,
is_re,
is_re_compilable,
is_scalar,
_get_dtype)
from pandas.core.dtypes.cast import (
maybe_downcast_to_dtype,
maybe_convert_string_to_object,
maybe_upcast,
maybe_convert_scalar, maybe_promote,
infer_dtype_from_scalar,
soft_convert_objects,
maybe_convert_objects,
astype_nansafe,
find_common_type)
from pandas.core.dtypes.missing import (
isnull, array_equivalent,
_is_na_compat,
is_null_datelike_scalar)
import pandas.core.dtypes.concat as _concat
from pandas.core.dtypes.generic import ABCSeries
from pandas.core.common import is_null_slice
import pandas.core.algorithms as algos
from pandas.core.index import Index, MultiIndex, _ensure_index
from pandas.core.indexing import maybe_convert_indices, length_of_indexer
from pandas.core.categorical import Categorical, maybe_to_categorical
from pandas.core.indexes.datetimes import DatetimeIndex
from pandas.io.formats.printing import pprint_thing
import pandas.core.missing as missing
from pandas.core.sparse.array import _maybe_to_sparse, SparseArray
from pandas._libs import lib, tslib
from pandas._libs.tslib import Timedelta
from pandas._libs.lib import BlockPlacement
import pandas.core.computation.expressions as expressions
from pandas.util._decorators import cache_readonly
from pandas.util._validators import validate_bool_kwarg
from pandas import compat, _np_version_under1p9
from pandas.compat import range, map, zip, u
class Block(PandasObject):
"""
Canonical n-dimensional unit of homogeneous dtype contained in a pandas
data structure
Index-ignorant; let the container take care of that
"""
__slots__ = ['_mgr_locs', 'values', 'ndim']
is_numeric = False
is_float = False
is_integer = False
is_complex = False
is_datetime = False
is_datetimetz = False
is_timedelta = False
is_bool = False
is_object = False
is_categorical = False
is_sparse = False
_box_to_block_values = True
_can_hold_na = False
_downcast_dtype = None
_can_consolidate = True
_verify_integrity = True
_validate_ndim = True
_ftype = 'dense'
_holder = None
def __init__(self, values, placement, ndim=None, fastpath=False):
if ndim is None:
ndim = values.ndim
elif values.ndim != ndim:
raise ValueError('Wrong number of dimensions')
self.ndim = ndim
self.mgr_locs = placement
self.values = values
if ndim and len(self.mgr_locs) != len(self.values):
raise ValueError('Wrong number of items passed %d, placement '
'implies %d' % (len(self.values),
len(self.mgr_locs)))
@property
def _consolidate_key(self):
return (self._can_consolidate, self.dtype.name)
@property
def _is_single_block(self):
return self.ndim == 1
@property
def is_view(self):
""" return a boolean if I am possibly a view """
return self.values.base is not None
@property
def is_datelike(self):
""" return True if I am a non-datelike """
return self.is_datetime or self.is_timedelta
def is_categorical_astype(self, dtype):
"""
validate that we have a astypeable to categorical,
returns a boolean if we are a categorical
"""
if is_categorical_dtype(dtype):
if dtype == CategoricalDtype():
return True
# this is a pd.Categorical, but is not
# a valid type for astypeing
raise TypeError("invalid type {0} for astype".format(dtype))
return False
def external_values(self, dtype=None):
""" return an outside world format, currently just the ndarray """
return self.values
def internal_values(self, dtype=None):
""" return an internal format, currently just the ndarray
this should be the pure internal API format
"""
return self.values
def get_values(self, dtype=None):
"""
return an internal format, currently just the ndarray
this is often overridden to handle to_dense-like operations
"""
if is_object_dtype(dtype):
return self.values.astype(object)
return self.values
def to_dense(self):
return self.values.view()
def to_object_block(self, mgr):
""" return myself as an object block """
values = self.get_values(dtype=object)
return self.make_block(values, klass=ObjectBlock)
@property
def _na_value(self):
return np.nan
@property
def fill_value(self):
return np.nan
@property
def mgr_locs(self):
return self._mgr_locs
@property
def array_dtype(self):
""" the dtype to return if I want to construct this block as an
array
"""
return self.dtype
def make_block(self, values, placement=None, ndim=None, **kwargs):
"""
Create a new block, with type inference; propagate any values that
are not specified
"""
if placement is None:
placement = self.mgr_locs
if ndim is None:
ndim = self.ndim
return make_block(values, placement=placement, ndim=ndim, **kwargs)
def make_block_scalar(self, values, **kwargs):
"""
Create a ScalarBlock
"""
return ScalarBlock(values)
def make_block_same_class(self, values, placement=None, fastpath=True,
**kwargs):
""" Wrap given values in a block of same type as self. """
if placement is None:
placement = self.mgr_locs
return make_block(values, placement=placement, klass=self.__class__,
fastpath=fastpath, **kwargs)
@mgr_locs.setter
def mgr_locs(self, new_mgr_locs):
if not isinstance(new_mgr_locs, BlockPlacement):
new_mgr_locs = BlockPlacement(new_mgr_locs)
self._mgr_locs = new_mgr_locs
def __unicode__(self):
# don't want to print out all of the items here
name = pprint_thing(self.__class__.__name__)
if self._is_single_block:
result = '%s: %s dtype: %s' % (name, len(self), self.dtype)
else:
shape = ' x '.join([pprint_thing(s) for s in self.shape])
result = '%s: %s, %s, dtype: %s' % (name, pprint_thing(
self.mgr_locs.indexer), shape, self.dtype)
return result
def __len__(self):
return len(self.values)
def __getstate__(self):
return self.mgr_locs.indexer, self.values
def __setstate__(self, state):
self.mgr_locs = BlockPlacement(state[0])
self.values = state[1]
self.ndim = self.values.ndim
def _slice(self, slicer):
""" return a slice of my values """
return self.values[slicer]
def reshape_nd(self, labels, shape, ref_items, mgr=None):
"""
Parameters
----------
labels : list of new axis labels
shape : new shape
ref_items : new ref_items
return a new block that is transformed to a nd block
"""
return _block2d_to_blocknd(values=self.get_values().T,
placement=self.mgr_locs, shape=shape,
labels=labels, ref_items=ref_items)
def getitem_block(self, slicer, new_mgr_locs=None):
"""
Perform __getitem__-like, return result as block.
As of now, only supports slices that preserve dimensionality.
"""
if new_mgr_locs is None:
if isinstance(slicer, tuple):
axis0_slicer = slicer[0]
else:
axis0_slicer = slicer
new_mgr_locs = self.mgr_locs[axis0_slicer]
new_values = self._slice(slicer)
if self._validate_ndim and new_values.ndim != self.ndim:
raise ValueError("Only same dim slicing is allowed")
return self.make_block_same_class(new_values, new_mgr_locs)
@property
def shape(self):
return self.values.shape
@property
def itemsize(self):
return self.values.itemsize
@property
def dtype(self):
return self.values.dtype
@property
def ftype(self):
return "%s:%s" % (self.dtype, self._ftype)
def merge(self, other):
return _merge_blocks([self, other])
def reindex_axis(self, indexer, method=None, axis=1, fill_value=None,
limit=None, mask_info=None):
"""
Reindex using pre-computed indexer information
"""
if axis < 1:
raise AssertionError('axis must be at least 1, got %d' % axis)
if fill_value is None:
fill_value = self.fill_value
new_values = algos.take_nd(self.values, indexer, axis,
fill_value=fill_value, mask_info=mask_info)
return self.make_block(new_values, fastpath=True)
def get(self, item):
loc = self.items.get_loc(item)
return self.values[loc]
def iget(self, i):
return self.values[i]
def set(self, locs, values, check=False):
"""
Modify Block in-place with new item value
Returns
-------
None
"""
self.values[locs] = values
def delete(self, loc):
"""
Delete given loc(-s) from block in-place.
"""
self.values = np.delete(self.values, loc, 0)
self.mgr_locs = self.mgr_locs.delete(loc)
def apply(self, func, mgr=None, **kwargs):
""" apply the function to my values; return a block if we are not
one
"""
with np.errstate(all='ignore'):
result = func(self.values, **kwargs)
if not isinstance(result, Block):
result = self.make_block(values=_block_shape(result,
ndim=self.ndim))
return result
def fillna(self, value, limit=None, inplace=False, downcast=None,
mgr=None):
""" fillna on the block with the value. If we fail, then convert to
ObjectBlock and try again
"""
inplace = validate_bool_kwarg(inplace, 'inplace')
if not self._can_hold_na:
if inplace:
return self
else:
return self.copy()
original_value = value
mask = isnull(self.values)
if limit is not None:
if not is_integer(limit):
raise ValueError('Limit must be an integer')
if limit < 1:
raise ValueError('Limit must be greater than 0')
if self.ndim > 2:
raise NotImplementedError("number of dimensions for 'fillna' "
"is currently limited to 2")
mask[mask.cumsum(self.ndim - 1) > limit] = False
# fillna, but if we cannot coerce, then try again as an ObjectBlock
try:
values, _, value, _ = self._try_coerce_args(self.values, value)
blocks = self.putmask(mask, value, inplace=inplace)
blocks = [b.make_block(values=self._try_coerce_result(b.values))
for b in blocks]
return self._maybe_downcast(blocks, downcast)
except (TypeError, ValueError):
# we can't process the value, but nothing to do
if not mask.any():
return self if inplace else self.copy()
# we cannot coerce the underlying object, so
# make an ObjectBlock
return self.to_object_block(mgr=mgr).fillna(original_value,
limit=limit,
inplace=inplace,
downcast=False)
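The `limit` handling in `fillna` caps how many missing slots get filled by masking out every missing position past the first `limit` along the fill axis. A minimal 1-d numpy sketch of that masking step (illustrative only, not pandas API):

```python
import numpy as np

# Start with a run of three NaNs but only allow two fills.
values = np.array([1.0, np.nan, np.nan, np.nan, 5.0])
mask = np.isnan(values)
limit = 2
# cumsum counts missing slots seen so far; anything past `limit` is unmasked
mask[mask.cumsum() > limit] = False
values[mask] = 0.0
```

After this, only the first two NaNs are replaced; the third is left missing.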
def _maybe_downcast(self, blocks, downcast=None):
# no need to downcast our float
# unless indicated
if downcast is None and self.is_float:
return blocks
elif downcast is None and (self.is_timedelta or self.is_datetime):
return blocks
return _extend_blocks([b.downcast(downcast) for b in blocks])
def downcast(self, dtypes=None, mgr=None):
""" try to downcast each item to the dict of dtypes if present """
# turn it off completely
if dtypes is False:
return self
values = self.values
# single block handling
if self._is_single_block:
# try to cast all non-floats here
if dtypes is None:
dtypes = 'infer'
nv = maybe_downcast_to_dtype(values, dtypes)
return self.make_block(nv, fastpath=True)
# ndim > 1
if dtypes is None:
return self
if not (dtypes == 'infer' or isinstance(dtypes, dict)):
raise ValueError("downcast must have a dictionary or 'infer' as "
"its argument")
# item-by-item
# this is expensive as it splits the blocks items-by-item
blocks = []
for i, rl in enumerate(self.mgr_locs):
if dtypes == 'infer':
dtype = 'infer'
else:
raise AssertionError("dtypes as dict is not supported yet")
# TODO: This either should be completed or removed
dtype = dtypes.get(item, self._downcast_dtype) # noqa
if dtype is None:
nv = _block_shape(values[i], ndim=self.ndim)
else:
nv = maybe_downcast_to_dtype(values[i], dtype)
nv = _block_shape(nv, ndim=self.ndim)
blocks.append(self.make_block(nv, fastpath=True, placement=[rl]))
return blocks
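The 'infer' path of `downcast` delegates to `maybe_downcast_to_dtype`, which only narrows a dtype when the cast is lossless. A hedged stand-in for that check (`downcast_if_integral` is a hypothetical helper, not the pandas function):

```python
import numpy as np

def downcast_if_integral(values):
    """Sketch of 'infer' downcasting: cast float -> int64 only when every
    value survives the round trip losslessly (no NaN/inf, no fraction)."""
    if values.dtype.kind == 'f' and np.isfinite(values).all():
        as_int = values.astype(np.int64)
        if (as_int == values).all():
            return as_int
    return values
```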
def astype(self, dtype, copy=False, errors='raise', values=None, **kwargs):
return self._astype(dtype, copy=copy, errors=errors, values=values,
**kwargs)
def _astype(self, dtype, copy=False, errors='raise', values=None,
klass=None, mgr=None, raise_on_error=False, **kwargs):
"""
Coerce to the new dtype (if copy=True, return a new copy).
Raise on error if errors == 'raise'.
"""
errors_legal_values = ('raise', 'ignore')
if errors not in errors_legal_values:
invalid_arg = ("Expected value of kwarg 'errors' to be one of {}. "
"Supplied value is '{}'".format(
list(errors_legal_values), errors))
raise ValueError(invalid_arg)
# may need to convert to categorical
# this is only called for non-categoricals
if self.is_categorical_astype(dtype):
return self.make_block(Categorical(self.values, **kwargs))
# astype processing
dtype = np.dtype(dtype)
if self.dtype == dtype:
if copy:
return self.copy()
return self
if klass is None:
if dtype == np.object_:
klass = ObjectBlock
try:
# force the copy here
if values is None:
if issubclass(dtype.type,
(compat.text_type, compat.string_types)):
# use native type formatting for datetime/tz/timedelta
if self.is_datelike:
values = self.to_native_types()
# astype formatting
else:
values = self.values
else:
values = self.get_values(dtype=dtype)
# _astype_nansafe works fine with 1-d only
values = astype_nansafe(values.ravel(), dtype, copy=True)
values = values.reshape(self.shape)
newb = make_block(values, placement=self.mgr_locs, dtype=dtype,
klass=klass)
except:
if errors == 'raise':
raise
newb = self.copy() if copy else self
if newb.is_numeric and self.is_numeric:
if newb.shape != self.shape:
raise TypeError("cannot set astype for copy = [%s] for dtype "
"(%s [%s]) with smaller itemsize that current "
"(%s [%s])" % (copy, self.dtype.name,
self.itemsize, newb.dtype.name,
newb.itemsize))
return newb
def convert(self, copy=True, **kwargs):
""" attempt to coerce any object types to better types return a copy
of the block (if copy = True) by definition we are not an ObjectBlock
here!
"""
return self.copy() if copy else self
def _can_hold_element(self, value):
raise NotImplementedError()
def _try_cast(self, value):
raise NotImplementedError()
def _try_cast_result(self, result, dtype=None):
""" try to cast the result to our original type, we may have
roundtripped thru object in the mean-time
"""
if dtype is None:
dtype = self.dtype
if self.is_integer or self.is_bool or self.is_datetime:
pass
elif self.is_float and result.dtype == self.dtype:
# protect against a bool/object showing up here
if isinstance(dtype, compat.string_types) and dtype == 'infer':
return result
if not isinstance(dtype, type):
dtype = dtype.type
if issubclass(dtype, (np.bool_, np.object_)):
if issubclass(dtype, np.bool_):
if isnull(result).all():
return result.astype(np.bool_)
else:
result = result.astype(np.object_)
result[result == 1] = True
result[result == 0] = False
return result
else:
return result.astype(np.object_)
return result
# may need to change the dtype here
return maybe_downcast_to_dtype(result, dtype)
def _try_operate(self, values):
""" return a version to operate on as the input """
return values
def _try_coerce_args(self, values, other):
""" provide coercion to our input arguments """
return values, False, other, False
def _try_coerce_result(self, result):
""" reverse of try_coerce_args """
return result
def _try_coerce_and_cast_result(self, result, dtype=None):
result = self._try_coerce_result(result)
result = self._try_cast_result(result, dtype=dtype)
return result
def _try_fill(self, value):
return value
def to_native_types(self, slicer=None, na_rep='nan', quoting=None,
**kwargs):
""" convert to our native types format, slicing if desired """
values = self.values
if slicer is not None:
values = values[:, slicer]
mask = isnull(values)
if not self.is_object and not quoting:
values = values.astype(str)
else:
values = np.array(values, dtype='object')
values[mask] = na_rep
return values
# block actions ####
def copy(self, deep=True, mgr=None):
""" copy constructor """
values = self.values
if deep:
values = values.copy()
return self.make_block_same_class(values)
def replace(self, to_replace, value, inplace=False, filter=None,
regex=False, convert=True, mgr=None):
""" replace the to_replace value with value, possible to create new
blocks here this is just a call to putmask. regex is not used here.
It is used in ObjectBlocks. It is here for API
compatibility.
"""
inplace = validate_bool_kwarg(inplace, 'inplace')
original_to_replace = to_replace
mask = isnull(self.values)
# try to replace, if we raise an error, convert to ObjectBlock and
# retry
try:
values, _, to_replace, _ = self._try_coerce_args(self.values,
to_replace)
mask = missing.mask_missing(values, to_replace)
if filter is not None:
filtered_out = ~self.mgr_locs.isin(filter)
mask[filtered_out.nonzero()[0]] = False
blocks = self.putmask(mask, value, inplace=inplace)
if convert:
blocks = [b.convert(by_item=True, numeric=False,
copy=not inplace) for b in blocks]
return blocks
except (TypeError, ValueError):
# we can't process the value, but nothing to do
if not mask.any():
return self if inplace else self.copy()
return self.to_object_block(mgr=mgr).replace(
to_replace=original_to_replace, value=value, inplace=inplace,
filter=filter, regex=regex, convert=convert)
def _replace_single(self, *args, **kwargs):
""" no-op on a non-ObjectBlock """
return self if kwargs['inplace'] else self.copy()
def setitem(self, indexer, value, mgr=None):
""" set the value inplace; return a new block (of a possibly different
dtype)
indexer is a direct slice/positional indexer; value must be a
compatible shape
"""
# coerce None values, if appropriate
if value is None:
if self.is_numeric:
value = np.nan
# coerce args
values, _, value, _ = self._try_coerce_args(self.values, value)
arr_value = np.array(value)
# cast the values to a type that can hold nan (if necessary)
if not self._can_hold_element(value):
dtype, _ = maybe_promote(arr_value.dtype)
values = values.astype(dtype)
transf = (lambda x: x.T) if self.ndim == 2 else (lambda x: x)
values = transf(values)
l = len(values)
# length checking
# boolean with truth values == len of the value is ok too
if isinstance(indexer, (np.ndarray, list)):
if is_list_like(value) and len(indexer) != len(value):
if not (isinstance(indexer, np.ndarray) and
indexer.dtype == np.bool_ and
len(indexer[indexer]) == len(value)):
raise ValueError("cannot set using a list-like indexer "
"with a different length than the value")
# slice
elif isinstance(indexer, slice):
if is_list_like(value) and l:
if len(value) != length_of_indexer(indexer, values):
raise ValueError("cannot set using a slice indexer with a "
"different length than the value")
try:
def _is_scalar_indexer(indexer):
# return True if we are all scalar indexers
if arr_value.ndim == 1:
if not isinstance(indexer, tuple):
indexer = tuple([indexer])
return all([is_scalar(idx) for idx in indexer])
return False
def _is_empty_indexer(indexer):
# return a boolean if we have an empty indexer
if arr_value.ndim == 1:
if not isinstance(indexer, tuple):
indexer = tuple([indexer])
return any(isinstance(idx, np.ndarray) and len(idx) == 0
for idx in indexer)
return False
# empty indexers
# 8669 (empty)
if _is_empty_indexer(indexer):
pass
# setting a single element for each dim and with a rhs that could
# be say a list
# GH 6043
elif _is_scalar_indexer(indexer):
values[indexer] = value
# if we are an exact match (ex-broadcasting),
# then use the resultant dtype
elif (len(arr_value.shape) and
arr_value.shape[0] == values.shape[0] and
np.prod(arr_value.shape) == np.prod(values.shape)):
values[indexer] = value
values = values.astype(arr_value.dtype)
# set
else:
values[indexer] = value
# coerce and try to infer the dtypes of the result
if hasattr(value, 'dtype') and is_dtype_equal(values.dtype,
value.dtype):
dtype = value.dtype
elif is_scalar(value):
dtype, _ = infer_dtype_from_scalar(value)
else:
dtype = 'infer'
values = self._try_coerce_and_cast_result(values, dtype)
block = self.make_block(transf(values), fastpath=True)
# may have to soft convert_objects here
if block.is_object and not self.is_object:
block = block.convert(numeric=False)
return block
except ValueError:
raise
except TypeError:
# cast to the passed dtype if possible
# otherwise raise the original error
try:
# e.g. we are uint32 and our value is uint64
# this is for compat with older numpies
block = self.make_block(transf(values.astype(value.dtype)))
return block.setitem(indexer=indexer, value=value, mgr=mgr)
except:
pass
raise
except Exception:
pass
return [self]
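When `setitem` detects that the block cannot hold the incoming value, it promotes the dtype before assigning. A minimal sketch of that step, using `np.promote_types` as a stand-in for pandas' `maybe_promote` (which additionally promotes so the dtype can hold NaN):

```python
import numpy as np

# An integer block cannot hold a float value, so promote first.
values = np.array([1, 2, 3])
value = 1.5
promoted = np.promote_types(values.dtype, np.array(value).dtype)
if promoted != values.dtype:
    values = values.astype(promoted)
values[0] = value
```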
def putmask(self, mask, new, align=True, inplace=False, axis=0,
transpose=False, mgr=None):
""" putmask the data to the block; it is possible that we may create a
new dtype of block.
Return the resulting block(s).
Parameters
----------
mask : the condition to respect
new : a ndarray/object
align : boolean, perform alignment on other/cond, default is True
inplace : perform inplace modification, default is False
axis : int
transpose : boolean
Set to True if self is stored with axes reversed
Returns
-------
a list of new blocks, the result of the putmask
"""
new_values = self.values if inplace else self.values.copy()
if hasattr(new, 'reindex_axis'):
new = new.values
if hasattr(mask, 'reindex_axis'):
mask = mask.values
# if we are passed a scalar None, convert it here
if not is_list_like(new) and isnull(new) and not self.is_object:
new = self.fill_value
if self._can_hold_element(new):
if transpose:
new_values = new_values.T
new = self._try_cast(new)
# If the default repeat behavior in np.putmask would go in the
# wrong direction, then explicitly repeat and reshape new instead
if getattr(new, 'ndim', 0) >= 1:
if self.ndim - 1 == new.ndim and axis == 1:
new = np.repeat(
new, new_values.shape[-1]).reshape(self.shape)
new = new.astype(new_values.dtype)
np.putmask(new_values, mask, new)
# maybe upcast me
elif mask.any():
if transpose:
mask = mask.T
if isinstance(new, np.ndarray):
new = new.T
axis = new_values.ndim - axis - 1
# Pseudo-broadcast
if getattr(new, 'ndim', 0) >= 1:
if self.ndim - 1 == new.ndim:
new_shape = list(new.shape)
new_shape.insert(axis, 1)
new = new.reshape(tuple(new_shape))
# need to go column by column
new_blocks = []
if self.ndim > 1:
for i, ref_loc in enumerate(self.mgr_locs):
m = mask[i]
v = new_values[i]
# need a new block
if m.any():
if isinstance(new, np.ndarray):
n = np.squeeze(new[i % new.shape[0]])
else:
n = np.array(new)
# type of the new block
dtype, _ = maybe_promote(n.dtype)
# we need to explicitly astype here to make a copy
n = n.astype(dtype)
nv = _putmask_smart(v, m, n)
else:
nv = v if inplace else v.copy()
# Put back the dimension that was taken from it and make
# a block out of the result.
block = self.make_block(values=nv[np.newaxis],
placement=[ref_loc], fastpath=True)
new_blocks.append(block)
else:
nv = _putmask_smart(new_values, mask, new)
new_blocks.append(self.make_block(values=nv, fastpath=True))
return new_blocks
if inplace:
return [self]
if transpose:
new_values = new_values.T
return [self.make_block(new_values, fastpath=True)]
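The "maybe upcast me" branch above handles the case where the block's dtype cannot represent the masked-in value. A 1-d numpy sketch of the idea (illustrative, not the pandas implementation):

```python
import numpy as np

# An int block can't hold NaN, so copy-upcast to float64 before putmask.
values = np.array([1, 2, 3, 4])
mask = np.array([False, True, False, True])
out = values.astype(np.float64)
np.putmask(out, mask, np.nan)
```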
def interpolate(self, method='pad', axis=0, index=None, values=None,
inplace=False, limit=None, limit_direction='forward',
fill_value=None, coerce=False, downcast=None, mgr=None,
**kwargs):
inplace = validate_bool_kwarg(inplace, 'inplace')
def check_int_bool(self, inplace):
# Only FloatBlocks will contain NaNs.
# timedelta subclasses IntBlock
if (self.is_bool or self.is_integer) and not self.is_timedelta:
if inplace:
return self
else:
return self.copy()
# a fill na type method
try:
m = missing.clean_fill_method(method)
except:
m = None
if m is not None:
r = check_int_bool(self, inplace)
if r is not None:
return r
return self._interpolate_with_fill(method=m, axis=axis,
inplace=inplace, limit=limit,
fill_value=fill_value,
coerce=coerce,
downcast=downcast, mgr=mgr)
# try an interp method
try:
m = missing.clean_interp_method(method, **kwargs)
except:
m = None
if m is not None:
r = check_int_bool(self, inplace)
if r is not None:
return r
return self._interpolate(method=m, index=index, values=values,
axis=axis, limit=limit,
limit_direction=limit_direction,
fill_value=fill_value, inplace=inplace,
downcast=downcast, mgr=mgr, **kwargs)
raise ValueError("invalid method '{0}' to interpolate.".format(method))
def _interpolate_with_fill(self, method='pad', axis=0, inplace=False,
limit=None, fill_value=None, coerce=False,
downcast=None, mgr=None):
""" fillna but using the interpolate machinery """
inplace = validate_bool_kwarg(inplace, 'inplace')
# if we are coercing, then don't force the conversion
# if the block can't hold the type
if coerce:
if not self._can_hold_na:
if inplace:
return [self]
else:
return [self.copy()]
values = self.values if inplace else self.values.copy()
values, _, fill_value, _ = self._try_coerce_args(values, fill_value)
values = self._try_operate(values)
values = missing.interpolate_2d(values, method=method, axis=axis,
limit=limit, fill_value=fill_value,
dtype=self.dtype)
values = self._try_coerce_result(values)
blocks = [self.make_block(values, klass=self.__class__, fastpath=True)]
return self._maybe_downcast(blocks, downcast)
def _interpolate(self, method=None, index=None, values=None,
fill_value=None, axis=0, limit=None,
limit_direction='forward', inplace=False, downcast=None,
mgr=None, **kwargs):
""" interpolate using scipy wrappers """
inplace = validate_bool_kwarg(inplace, 'inplace')
data = self.values if inplace else self.values.copy()
# only deal with floats
if not self.is_float:
if not self.is_integer:
return self
data = data.astype(np.float64)
if fill_value is None:
fill_value = self.fill_value
if method in ('krogh', 'piecewise_polynomial', 'pchip'):
if not index.is_monotonic:
raise ValueError("{0} interpolation requires that the "
"index be monotonic.".format(method))
# process 1-d slices in the axis direction
def func(x):
# process a 1-d slice, returning it
# should the axis argument be handled below in apply_along_axis?
# i.e. not an arg to missing.interpolate_1d
return missing.interpolate_1d(index, x, method=method, limit=limit,
limit_direction=limit_direction,
fill_value=fill_value,
bounds_error=False, **kwargs)
# interp each column independently
interp_values = np.apply_along_axis(func, axis, data)
blocks = [self.make_block(interp_values, klass=self.__class__,
fastpath=True)]
return self._maybe_downcast(blocks, downcast)
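`_interpolate` works by mapping a 1-d interpolation function over each slice with `np.apply_along_axis`. A minimal sketch of that pattern, with `np.interp` standing in for `missing.interpolate_1d` (so no `limit` or `limit_direction` handling):

```python
import numpy as np

def interp_column(x):
    """Linearly interpolate NaNs in a single 1-d slice."""
    idx = np.arange(len(x))
    ok = ~np.isnan(x)
    return np.interp(idx, idx[ok], x[ok])

data = np.array([[1.0, np.nan, 3.0],
                 [4.0, np.nan, 8.0]])
# interp each row independently, mirroring the apply_along_axis call above
filled = np.apply_along_axis(interp_column, 1, data)
```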
def take_nd(self, indexer, axis, new_mgr_locs=None, fill_tuple=None):
"""
Take values according to indexer and return them as a block.
"""
# algos.take_nd dispatches for DatetimeTZBlock, CategoricalBlock
# so need to preserve types
# sparse is treated like an ndarray, but needs .get_values() shaping
values = self.values
if self.is_sparse:
values = self.get_values()
if fill_tuple is None:
fill_value = self.fill_value
new_values = algos.take_nd(values, indexer, axis=axis,
allow_fill=False)
else:
fill_value = fill_tuple[0]
new_values = algos.take_nd(values, indexer, axis=axis,
allow_fill=True, fill_value=fill_value)
if new_mgr_locs is None:
if axis == 0:
slc = lib.indexer_as_slice(indexer)
if slc is not None:
new_mgr_locs = self.mgr_locs[slc]
else:
new_mgr_locs = self.mgr_locs[indexer]
else:
new_mgr_locs = self.mgr_locs
if not is_dtype_equal(new_values.dtype, self.dtype):
return self.make_block(new_values, new_mgr_locs)
else:
return self.make_block_same_class(new_values, new_mgr_locs)
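`algos.take_nd` with `allow_fill=True` treats `-1` in the indexer as a missing position that receives `fill_value`. A hedged 1-d stand-in (`take_with_fill` is a hypothetical helper, not the pandas function):

```python
import numpy as np

def take_with_fill(values, indexer, fill_value=np.nan):
    """Sketch of the allow_fill path: -1 marks a missing position."""
    indexer = np.asarray(indexer)
    out = values.astype(np.float64)[indexer]  # upcast so NaN fits
    out[indexer == -1] = fill_value
    return out
```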
def diff(self, n, axis=1, mgr=None):
""" return block for the diff of the values """
new_values = algos.diff(self.values, n, axis=axis)
return [self.make_block(values=new_values, fastpath=True)]
def shift(self, periods, axis=0, mgr=None):
""" shift the block by periods, possibly upcast """
# convert integer to float if necessary. need to do a lot more than
# that, handle boolean etc also
new_values, fill_value = maybe_upcast(self.values)
# make sure array sent to np.roll is c_contiguous
f_ordered = new_values.flags.f_contiguous
if f_ordered:
new_values = new_values.T
axis = new_values.ndim - axis - 1
if np.prod(new_values.shape):
new_values = np.roll(new_values, _ensure_platform_int(periods),
axis=axis)
axis_indexer = [slice(None)] * self.ndim
if periods > 0:
axis_indexer[axis] = slice(None, periods)
else:
axis_indexer[axis] = slice(periods, None)
new_values[tuple(axis_indexer)] = fill_value
# restore original order
if f_ordered:
new_values = new_values.T
return [self.make_block(new_values, fastpath=True)]
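`shift` is essentially upcast-then-roll-then-blank: roll the values by `periods` and overwrite the wrapped-around slots with the fill value. A 1-d sketch (`shift_1d` is an illustrative helper, not pandas API):

```python
import numpy as np

def shift_1d(values, periods):
    """Roll by `periods`, then blank the slots that wrapped around."""
    out = values.astype(np.float64)  # upcast so NaN can be the fill value
    out = np.roll(out, periods)
    if periods > 0:
        out[:periods] = np.nan
    elif periods < 0:
        out[periods:] = np.nan
    return out
```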
def eval(self, func, other, raise_on_error=True, try_cast=False, mgr=None):
"""
evaluate the block; return result block from the result
Parameters
----------
func : how to combine self, other
other : a ndarray/object
raise_on_error : if True (the default), raise when I can't perform the
function; otherwise just return the data that we had coming in
try_cast : try casting the results to the input type
Returns
-------
a new block, the result of the func
"""
values = self.values
if hasattr(other, 'reindex_axis'):
other = other.values
# make sure that we can broadcast
is_transposed = False
if hasattr(other, 'ndim') and hasattr(values, 'ndim'):
if values.ndim != other.ndim:
is_transposed = True
else:
if values.shape == other.shape[::-1]:
is_transposed = True
elif values.shape[0] == other.shape[-1]:
is_transposed = True
else:
# this is a broadcast error here
raise ValueError("cannot broadcast shape [%s] with block "
"values [%s]" % (values.T.shape,
other.shape))
transf = (lambda x: x.T) if is_transposed else (lambda x: x)
# coerce/transpose the args if needed
values, values_mask, other, other_mask = self._try_coerce_args(
transf(values), other)
# get the result, may need to transpose the other
def get_result(other):
# avoid numpy warning of comparisons against None
if other is None:
result = not func.__name__ == 'eq'
# avoid numpy warning of elementwise comparisons to object
elif is_numeric_v_string_like(values, other):
result = False
else:
result = func(values, other)
# mask if needed
if isinstance(values_mask, np.ndarray) and values_mask.any():
result = result.astype('float64', copy=False)
result[values_mask] = np.nan
if other_mask is True:
result = result.astype('float64', copy=False)
result[:] = np.nan
elif isinstance(other_mask, np.ndarray) and other_mask.any():
result = result.astype('float64', copy=False)
result[other_mask.ravel()] = np.nan
return self._try_coerce_result(result)
# error handler if we have an issue operating with the function
def handle_error():
if raise_on_error:
# The 'detail' variable is defined in outer scope.
raise TypeError('Could not operate %s with block values %s' %
(repr(other), str(detail))) # noqa
else:
# return the values
result = np.empty(values.shape, dtype='O')
result.fill(np.nan)
return result
# get the result
try:
with np.errstate(all='ignore'):
result = get_result(other)
# if we have an invalid shape/broadcast error
# GH4576, so raise instead of allowing to pass through
except ValueError as detail:
raise
except Exception as detail:
result = handle_error()
# technically a broadcast error in numpy can 'work' by returning a
# boolean False
if not isinstance(result, np.ndarray):
# differentiate between an invalid ndarray-ndarray comparison
# and an invalid type comparison
if isinstance(values, np.ndarray) and is_list_like(other):
raise ValueError('Invalid broadcasting comparison [%s] '
'with block values' % repr(other))
raise TypeError('Could not compare [%s] with block values' %
repr(other))
# transpose if needed
result = transf(result)
# try to cast if requested
if try_cast:
result = self._try_cast_result(result)
return [self.make_block(result, fastpath=True, )]
def where(self, other, cond, align=True, raise_on_error=True,
try_cast=False, axis=0, transpose=False, mgr=None):
"""
evaluate the block; return result block(s) from the result
Parameters
----------
other : a ndarray/object
cond : the condition to respect
align : boolean, perform alignment on other/cond
raise_on_error : if True (the default), raise when I can't perform the
function; otherwise just return the data that we had coming in
axis : int
transpose : boolean
Set to True if self is stored with axes reversed
Returns
-------
a new block(s), the result of the func
"""
values = self.values
if transpose:
values = values.T
if hasattr(other, 'reindex_axis'):
other = other.values
if hasattr(cond, 'reindex_axis'):
cond = cond.values
# If the default broadcasting would go in the wrong direction, then
# explicitly reshape other instead
if getattr(other, 'ndim', 0) >= 1:
if values.ndim - 1 == other.ndim and axis == 1:
other = other.reshape(tuple(other.shape + (1, )))
if not hasattr(cond, 'shape'):
raise ValueError("where must have a condition that is ndarray "
"like")
other = maybe_convert_string_to_object(other)
other = maybe_convert_scalar(other)
# our where function
def func(cond, values, other):
if cond.ravel().all():
return values
values, values_mask, other, other_mask = self._try_coerce_args(
values, other)
try:
return self._try_coerce_result(expressions.where(
cond, values, other, raise_on_error=True))
except Exception as detail:
if raise_on_error:
raise TypeError('Could not operate [%s] with block values '
'[%s]' % (repr(other), str(detail)))
else:
# return the values
result = np.empty(values.shape, dtype='float64')
result.fill(np.nan)
return result
# see if we can operate on the entire block, or need item-by-item
# or if we are a single block (ndim == 1)
result = func(cond, values, other)
if self._can_hold_na or self.ndim == 1:
if transpose:
result = result.T
# try to cast if requested
if try_cast:
result = self._try_cast_result(result)
return self.make_block(result)
# might need to separate out blocks
axis = cond.ndim - 1
cond = cond.swapaxes(axis, 0)
mask = np.array([cond[i].all() for i in range(cond.shape[0])],
dtype=bool)
result_blocks = []
for m in [mask, ~mask]:
if m.any():
r = self._try_cast_result(result.take(m.nonzero()[0],
axis=axis))
result_blocks.append(
self.make_block(r.T, placement=self.mgr_locs[m]))
return result_blocks
def equals(self, other):
if self.dtype != other.dtype or self.shape != other.shape:
return False
return array_equivalent(self.values, other.values)
def quantile(self, qs, interpolation='linear', axis=0, mgr=None):
"""
compute the quantiles of the block
Parameters
----------
qs: a scalar or list of the quantiles to be computed
interpolation: type of interpolation, default 'linear'
axis: axis to compute, default 0
Returns
-------
tuple of (axis, block)
"""
if _np_version_under1p9:
if interpolation != 'linear':
raise ValueError("Interpolation methods other than linear "
"are not supported in numpy < 1.9.")
kw = {}
if not _np_version_under1p9:
kw.update({'interpolation': interpolation})
values = self.get_values()
values, _, _, _ = self._try_coerce_args(values, values)
def _nanpercentile1D(values, mask, q, **kw):
values = values[~mask]
if len(values) == 0:
if is_scalar(q):
return self._na_value
else:
return np.array([self._na_value] * len(q),
dtype=values.dtype)
return np.percentile(values, q, **kw)
def _nanpercentile(values, q, axis, **kw):
mask = isnull(self.values)
if not is_scalar(mask) and mask.any():
if self.ndim == 1:
return _nanpercentile1D(values, mask, q, **kw)
else:
# for nonconsolidatable blocks mask is 1D, but values 2D
if mask.ndim < values.ndim:
mask = mask.reshape(values.shape)
if axis == 0:
values = values.T
mask = mask.T
result = [_nanpercentile1D(val, m, q, **kw) for (val, m)
in zip(list(values), list(mask))]
result = np.array(result, dtype=values.dtype, copy=False).T
return result
else:
return np.percentile(values, q, axis=axis, **kw)
from pandas import Float64Index
is_empty = values.shape[axis] == 0
if is_list_like(qs):
ax = Float64Index(qs)
if is_empty:
if self.ndim == 1:
result = self._na_value
else:
# create the array of na_values
# 2d len(values) * len(qs)
result = np.repeat(np.array([self._na_value] * len(qs)),
len(values)).reshape(len(values),
len(qs))
else:
try:
result = _nanpercentile(values, np.array(qs) * 100,
axis=axis, **kw)
except ValueError:
# older numpies don't handle an array for q
result = [_nanpercentile(values, q * 100,
axis=axis, **kw) for q in qs]
result = np.array(result, copy=False)
if self.ndim > 1:
result = result.T
else:
if self.ndim == 1:
ax = Float64Index([qs])
else:
ax = mgr.axes[0]
if is_empty:
if self.ndim == 1:
result = self._na_value
else:
result = np.array([self._na_value] * len(self))
else:
result = _nanpercentile(values, qs * 100, axis=axis, **kw)
ndim = getattr(result, 'ndim', None) or 0
result = self._try_coerce_result(result)
if is_scalar(result):
return ax, self.make_block_scalar(result)
return ax, make_block(result,
placement=np.arange(len(result)),
ndim=ndim)
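The `_nanpercentile1D` helper inside `quantile` drops missing values before delegating to `np.percentile`, returning the NA value when nothing remains. A self-contained sketch of that logic for a scalar `q` (illustrative names, not pandas API):

```python
import numpy as np

def nanpercentile_1d(values, q):
    """Drop NaNs, then delegate to np.percentile; NaN if nothing is left."""
    vals = values[~np.isnan(values)]
    if vals.size == 0:
        return np.nan
    return np.percentile(vals, q)
```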
class ScalarBlock(Block):
"""
a scalar compat Block
"""
__slots__ = ['_mgr_locs', 'values', 'ndim']
def __init__(self, values):
self.ndim = 0
self.mgr_locs = [0]
self.values = values
@property
def dtype(self):
return type(self.values)
@property
def shape(self):
return tuple([0])
def __len__(self):
return 0
class NonConsolidatableMixIn(object):
""" hold methods for the nonconsolidatable blocks """
_can_consolidate = False
_verify_integrity = False
_validate_ndim = False
_holder = None
def __init__(self, values, placement, ndim=None, fastpath=False, **kwargs):
# Placement must be converted to BlockPlacement via property setter
# before ndim logic, because placement may be a slice which doesn't
# have a length.
self.mgr_locs = placement
# kludgetastic
if ndim is None:
if len(self.mgr_locs) != 1:
ndim = 1
else:
ndim = 2
self.ndim = ndim
if not isinstance(values, self._holder):
raise TypeError("values must be {0}".format(self._holder.__name__))
self.values = values
@property
def shape(self):
if self.ndim == 1:
return (len(self.values),)
return (len(self.mgr_locs), len(self.values))
def get_values(self, dtype=None):
""" need to to_dense myself (and always return a ndim sized object) """
values = self.values.to_dense()
if values.ndim == self.ndim - 1:
values = values.reshape((1,) + values.shape)
return values
def iget(self, col):
if self.ndim == 2 and isinstance(col, tuple):
col, loc = col
if not is_null_slice(col) and col != 0:
raise IndexError("{0} only contains one item".format(self))
return self.values[loc]
else:
if col != 0:
raise IndexError("{0} only contains one item".format(self))
return self.values
def should_store(self, value):
return isinstance(value, self._holder)
def set(self, locs, values, check=False):
assert locs.tolist() == [0]
self.values = values
def get(self, item):
if self.ndim == 1:
loc = self.items.get_loc(item)
return self.values[loc]
else:
return self.values
def putmask(self, mask, new, align=True, inplace=False, axis=0,
transpose=False, mgr=None):
"""
putmask the data to the block; we must be a single block and not
generate other blocks.
Return the resulting block.
Parameters
----------
mask : the condition to respect
new : a ndarray/object
align : boolean, perform alignment on other/cond, default is True
inplace : perform inplace modification, default is False
Returns
-------
a new block(s), the result of the putmask
"""
inplace = validate_bool_kwarg(inplace, 'inplace')
# use block's copy logic.
# .values may be an Index which does shallow copy by default
new_values = self.values if inplace else self.copy().values
new_values, _, new, _ = self._try_coerce_args(new_values, new)
if isinstance(new, np.ndarray) and len(new) == len(mask):
new = new[mask]
mask = _safe_reshape(mask, new_values.shape)
new_values[mask] = new
new_values = self._try_coerce_result(new_values)
return [self.make_block(values=new_values)]
def _slice(self, slicer):
""" return a slice of my values (but densify first) """
return self.get_values()[slicer]
def _try_cast_result(self, result, dtype=None):
return result
class NumericBlock(Block):
__slots__ = ()
is_numeric = True
_can_hold_na = True
class FloatOrComplexBlock(NumericBlock):
__slots__ = ()
def equals(self, other):
if self.dtype != other.dtype or self.shape != other.shape:
return False
left, right = self.values, other.values
return ((left == right) | (np.isnan(left) & np.isnan(right))).all()
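The NaN-aware equality above exists because `NaN == NaN` is False under numpy's comparison rules, so a plain `(left == right).all()` would report unequal for otherwise identical float blocks. A standalone sketch of the same check (`nan_array_equivalent` is an illustrative helper, not pandas API):

```python
import numpy as np

def nan_array_equivalent(a, b):
    """Treat positions where both sides are NaN as matching."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    if a.shape != b.shape:
        return False
    return bool(((a == b) | (np.isnan(a) & np.isnan(b))).all())
```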
class FloatBlock(FloatOrComplexBlock):
__slots__ = ()
is_float = True
_downcast_dtype = 'int64'
def _can_hold_element(self, element):
if is_list_like(element):
element = np.array(element)
tipo = element.dtype.type
return (issubclass(tipo, (np.floating, np.integer)) and
not issubclass(tipo, (np.datetime64, np.timedelta64)))
return (isinstance(element, (float, int, np.float_, np.int_)) and
not isinstance(element, (bool, np.bool_, datetime, timedelta,
np.datetime64, np.timedelta64)))
def _try_cast(self, element):
try:
return float(element)
except: # pragma: no cover
return element
def to_native_types(self, slicer=None, na_rep='', float_format=None,
decimal='.', quoting=None, **kwargs):
""" convert to our native types format, slicing if desired """
values = self.values
if slicer is not None:
values = values[:, slicer]
# see gh-13418: no special formatting is desired at the
# output (important for appropriate 'quoting' behaviour),
# so do not pass it through the FloatArrayFormatter
if float_format is None and decimal == '.':
mask = isnull(values)
if not quoting:
values = values.astype(str)
else:
values = np.array(values, dtype='object')
values[mask] = na_rep
return values
from pandas.io.formats.format import FloatArrayFormatter
formatter = FloatArrayFormatter(values, na_rep=na_rep,
float_format=float_format,
decimal=decimal, quoting=quoting,
fixed_width=False)
return formatter.get_result_as_array()
def should_store(self, value):
# when inserting a column should not coerce integers to floats
# unnecessarily
return (issubclass(value.dtype.type, np.floating) and
value.dtype == self.dtype)
class ComplexBlock(FloatOrComplexBlock):
__slots__ = ()
is_complex = True
def _can_hold_element(self, element):
if is_list_like(element):
element = np.array(element)
return issubclass(element.dtype.type,
(np.floating, np.integer, np.complexfloating))
        return (isinstance(element,
                           (float, int, complex, np.float_, np.int_)) and
                not isinstance(element, (bool, np.bool_)))
def _try_cast(self, element):
try:
return complex(element)
        except (TypeError, ValueError):  # pragma: no cover
return element
def should_store(self, value):
return issubclass(value.dtype.type, np.complexfloating)
class IntBlock(NumericBlock):
__slots__ = ()
is_integer = True
_can_hold_na = False
def _can_hold_element(self, element):
if is_list_like(element):
element = np.array(element)
tipo = element.dtype.type
return (issubclass(tipo, np.integer) and
not issubclass(tipo, (np.datetime64, np.timedelta64)))
return is_integer(element)
def _try_cast(self, element):
try:
return int(element)
        except (TypeError, ValueError, OverflowError):  # pragma: no cover
return element
def should_store(self, value):
return is_integer_dtype(value) and value.dtype == self.dtype
class DatetimeLikeBlockMixin(object):
@property
def _na_value(self):
return tslib.NaT
@property
def fill_value(self):
return tslib.iNaT
def _try_operate(self, values):
""" return a version to operate on """
return values.view('i8')
def get_values(self, dtype=None):
"""
return object dtype as boxed values, such as Timestamps/Timedelta
"""
if is_object_dtype(dtype):
return lib.map_infer(self.values.ravel(),
self._box_func).reshape(self.values.shape)
return self.values
class TimeDeltaBlock(DatetimeLikeBlockMixin, IntBlock):
__slots__ = ()
is_timedelta = True
_can_hold_na = True
is_numeric = False
@property
def _box_func(self):
return lambda x: tslib.Timedelta(x, unit='ns')
def fillna(self, value, **kwargs):
# allow filling with integers to be
# interpreted as seconds
if not isinstance(value, np.timedelta64) and is_integer(value):
value = Timedelta(value, unit='s')
return super(TimeDeltaBlock, self).fillna(value, **kwargs)
def _try_coerce_args(self, values, other):
"""
Coerce values and other to int64, with null values converted to
iNaT. values is always ndarray-like, other may not be
Parameters
----------
values : ndarray-like
other : ndarray-like or scalar
Returns
-------
base-type values, values mask, base-type other, other mask
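
        Examples
        --------
        Illustrative sketch of the coercion: a ``Timedelta`` scalar is
        coerced to its integer nanosecond value
        (``Timedelta('1s').value == 1000000000``), while null scalars
        become ``tslib.iNaT`` with ``other_mask=True``.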
"""
values_mask = isnull(values)
values = values.view('i8')
other_mask = False
if isinstance(other, bool):
raise TypeError
elif is_null_datelike_scalar(other):
other = tslib.iNaT
other_mask = True
elif isinstance(other, Timedelta):
other_mask = isnull(other)
other = other.value
elif isinstance(other, np.timedelta64):
other_mask = isnull(other)
other = Timedelta(other).value
elif isinstance(other, timedelta):
other = Timedelta(other).value
elif isinstance(other, np.ndarray):
other_mask = isnull(other)
other = other.astype('i8', copy=False).view('i8')
else:
# scalar
other = Timedelta(other)
other_mask = isnull(other)
other = other.value
return values, values_mask, other, other_mask
def _try_coerce_result(self, result):
""" reverse of try_coerce_args / try_operate """
if isinstance(result, np.ndarray):
mask = isnull(result)
if result.dtype.kind in ['i', 'f', 'O']:
result = result.astype('m8[ns]')
result[mask] = tslib.iNaT
        elif isinstance(result, (np.integer, float)):
result = self._box_func(result)
return result
def should_store(self, value):
return issubclass(value.dtype.type, np.timedelta64)
def to_native_types(self, slicer=None, na_rep=None, quoting=None,
**kwargs):
""" convert to our native types format, slicing if desired """
values = self.values
if slicer is not None:
values = values[:, slicer]
mask = isnull(values)
rvalues = np.empty(values.shape, dtype=object)
if na_rep is None:
na_rep = 'NaT'
rvalues[mask] = na_rep
imask = (~mask).ravel()
        # FIXME: should use formats.format.Timedelta64Formatter here
        # to figure out what format to pass to Timedelta
        # (e.g. to not show the decimals)
rvalues.flat[imask] = np.array([Timedelta(val)._repr_base(format='all')
for val in values.ravel()[imask]],
dtype=object)
return rvalues
class BoolBlock(NumericBlock):
__slots__ = ()
is_bool = True
_can_hold_na = False
def _can_hold_element(self, element):
if is_list_like(element):
element = np.array(element)
return issubclass(element.dtype.type, np.integer)
return isinstance(element, (int, bool))
def _try_cast(self, element):
try:
return bool(element)
        except (TypeError, ValueError):  # pragma: no cover
return element
def should_store(self, value):
return issubclass(value.dtype.type, np.bool_)
def replace(self, to_replace, value, inplace=False, filter=None,
regex=False, convert=True, mgr=None):
inplace = validate_bool_kwarg(inplace, 'inplace')
to_replace_values = np.atleast_1d(to_replace)
if not np.can_cast(to_replace_values, bool):
return self
return super(BoolBlock, self).replace(to_replace, value,
inplace=inplace, filter=filter,
regex=regex, convert=convert,
mgr=mgr)
class ObjectBlock(Block):
__slots__ = ()
is_object = True
_can_hold_na = True
def __init__(self, values, ndim=2, fastpath=False, placement=None,
**kwargs):
if issubclass(values.dtype.type, compat.string_types):
values = np.array(values, dtype=object)
super(ObjectBlock, self).__init__(values, ndim=ndim, fastpath=fastpath,
placement=placement, **kwargs)
@property
def is_bool(self):
""" we can be a bool if we have only bool values but are of type
object
"""
return lib.is_bool_array(self.values.ravel())
# TODO: Refactor when convert_objects is removed since there will be 1 path
def convert(self, *args, **kwargs):
        """ attempt to coerce any object types to better types; return a copy
        of the block (if copy=True). By definition we ARE an ObjectBlock!
        Can return multiple blocks!
        """
if args:
raise NotImplementedError
        by_item = kwargs.get('by_item', True)
new_inputs = ['coerce', 'datetime', 'numeric', 'timedelta']
new_style = False
for kw in new_inputs:
new_style |= kw in kwargs
if new_style:
fn = soft_convert_objects
fn_inputs = new_inputs
else:
fn = maybe_convert_objects
fn_inputs = ['convert_dates', 'convert_numeric',
'convert_timedeltas']
fn_inputs += ['copy']
fn_kwargs = {}
for key in fn_inputs:
if key in kwargs:
fn_kwargs[key] = kwargs[key]
# attempt to create new type blocks
blocks = []
if by_item and not self._is_single_block:
for i, rl in enumerate(self.mgr_locs):
values = self.iget(i)
shape = values.shape
values = fn(values.ravel(), **fn_kwargs)
try:
values = values.reshape(shape)
values = _block_shape(values, ndim=self.ndim)
except (AttributeError, NotImplementedError):
pass
newb = make_block(values, ndim=self.ndim, placement=[rl])
blocks.append(newb)
else:
values = fn(self.values.ravel(), **fn_kwargs)
try:
values = values.reshape(self.values.shape)
except NotImplementedError:
pass
blocks.append(make_block(values, ndim=self.ndim,
placement=self.mgr_locs))
return blocks
def set(self, locs, values, check=False):
"""
Modify Block in-place with new item value
Returns
-------
None
"""
# GH6026
if check:
try:
if (self.values[locs] == values).all():
return
            except Exception:
                # the element-wise comparison itself may fail for
                # incompatible object values
                pass
try:
self.values[locs] = values
        except ValueError:
# broadcasting error
# see GH6171
new_shape = list(values.shape)
new_shape[0] = len(self.items)
self.values = np.empty(tuple(new_shape), dtype=self.dtype)
self.values.fill(np.nan)
self.values[locs] = values
def _maybe_downcast(self, blocks, downcast=None):
if downcast is not None:
return blocks
# split and convert the blocks
return _extend_blocks([b.convert(datetime=True, numeric=False)
for b in blocks])
def _can_hold_element(self, element):
return True
def _try_cast(self, element):
return element
def should_store(self, value):
return not (issubclass(value.dtype.type,
(np.integer, np.floating, np.complexfloating,
np.datetime64, np.bool_)) or
is_extension_type(value))
def replace(self, to_replace, value, inplace=False, filter=None,
regex=False, convert=True, mgr=None):
to_rep_is_list = is_list_like(to_replace)
value_is_list = is_list_like(value)
both_lists = to_rep_is_list and value_is_list
either_list = to_rep_is_list or value_is_list
result_blocks = []
blocks = [self]
if not either_list and is_re(to_replace):
return self._replace_single(to_replace, value, inplace=inplace,
filter=filter, regex=True,
convert=convert, mgr=mgr)
elif not (either_list or regex):
return super(ObjectBlock, self).replace(to_replace, value,
inplace=inplace,
filter=filter, regex=regex,
convert=convert, mgr=mgr)
elif both_lists:
for to_rep, v in zip(to_replace, value):
result_blocks = []
for b in blocks:
result = b._replace_single(to_rep, v, inplace=inplace,
filter=filter, regex=regex,
convert=convert, mgr=mgr)
result_blocks = _extend_blocks(result, result_blocks)
blocks = result_blocks
return result_blocks
elif to_rep_is_list and regex:
for to_rep in to_replace:
result_blocks = []
for b in blocks:
result = b._replace_single(to_rep, value, inplace=inplace,
filter=filter, regex=regex,
convert=convert, mgr=mgr)
result_blocks = _extend_blocks(result, result_blocks)
blocks = result_blocks
return result_blocks
return self._replace_single(to_replace, value, inplace=inplace,
filter=filter, convert=convert,
regex=regex, mgr=mgr)
def _replace_single(self, to_replace, value, inplace=False, filter=None,
regex=False, convert=True, mgr=None):
inplace = validate_bool_kwarg(inplace, 'inplace')
# to_replace is regex compilable
to_rep_re = regex and is_re_compilable(to_replace)
# regex is regex compilable
regex_re = is_re_compilable(regex)
# only one will survive
if to_rep_re and regex_re:
raise AssertionError('only one of to_replace and regex can be '
'regex compilable')
# if regex was passed as something that can be a regex (rather than a
# boolean)
if regex_re:
to_replace = regex
regex = regex_re or to_rep_re
# try to get the pattern attribute (compiled re) or it's a string
try:
pattern = to_replace.pattern
except AttributeError:
pattern = to_replace
# if the pattern is not empty and to_replace is either a string or a
# regex
if regex and pattern:
rx = re.compile(to_replace)
else:
# if the thing to replace is not a string or compiled regex call
# the superclass method -> to_replace is some kind of object
return super(ObjectBlock, self).replace(to_replace, value,
inplace=inplace,
filter=filter, regex=regex,
mgr=mgr)
new_values = self.values if inplace else self.values.copy()
# deal with replacing values with objects (strings) that match but
# whose replacement is not a string (numeric, nan, object)
if isnull(value) or not isinstance(value, compat.string_types):
def re_replacer(s):
try:
return value if rx.search(s) is not None else s
except TypeError:
return s
else:
# value is guaranteed to be a string here, s can be either a string
# or null if it's null it gets returned
def re_replacer(s):
try:
return rx.sub(value, s)
except TypeError:
return s
f = np.vectorize(re_replacer, otypes=[self.dtype])
if filter is None:
filt = slice(None)
else:
filt = self.mgr_locs.isin(filter).nonzero()[0]
new_values[filt] = f(new_values[filt])
# convert
block = self.make_block(new_values)
if convert:
block = block.convert(by_item=True, numeric=False)
return block
class CategoricalBlock(NonConsolidatableMixIn, ObjectBlock):
__slots__ = ()
is_categorical = True
_verify_integrity = True
_can_hold_na = True
_holder = Categorical
def __init__(self, values, placement, fastpath=False, **kwargs):
# coerce to categorical if we can
super(CategoricalBlock, self).__init__(maybe_to_categorical(values),
fastpath=True,
placement=placement, **kwargs)
@property
def is_view(self):
""" I am never a view """
return False
def to_dense(self):
return self.values.to_dense().view()
def convert(self, copy=True, **kwargs):
return self.copy() if copy else self
@property
def array_dtype(self):
""" the dtype to return if I want to construct this block as an
array
"""
return np.object_
def _slice(self, slicer):
""" return a slice of my values """
# slice the category
# return same dims as we currently have
return self.values._slice(slicer)
def _try_coerce_result(self, result):
""" reverse of try_coerce_args """
# GH12564: CategoricalBlock is 1-dim only
# while returned results could be any dim
if ((not is_categorical_dtype(result)) and
isinstance(result, np.ndarray)):
result = _block_shape(result, ndim=self.ndim)
return result
def fillna(self, value, limit=None, inplace=False, downcast=None,
mgr=None):
# we may need to upcast our fill to match our dtype
if limit is not None:
raise NotImplementedError("specifying a limit for 'fillna' has "
"not been implemented yet")
values = self.values if inplace else self.values.copy()
values = self._try_coerce_result(values.fillna(value=value,
limit=limit))
return [self.make_block(values=values)]
def interpolate(self, method='pad', axis=0, inplace=False, limit=None,
fill_value=None, **kwargs):
values = self.values if inplace else self.values.copy()
return self.make_block_same_class(
values=values.fillna(fill_value=fill_value, method=method,
limit=limit),
placement=self.mgr_locs)
def shift(self, periods, axis=0, mgr=None):
return self.make_block_same_class(values=self.values.shift(periods),
placement=self.mgr_locs)
def take_nd(self, indexer, axis=0, new_mgr_locs=None, fill_tuple=None):
"""
        Take values according to indexer and return them as a block.
"""
if fill_tuple is None:
fill_value = None
else:
fill_value = fill_tuple[0]
# axis doesn't matter; we are really a single-dim object
# but are passed the axis depending on the calling routing
# if its REALLY axis 0, then this will be a reindex and not a take
new_values = self.values.take_nd(indexer, fill_value=fill_value)
# if we are a 1-dim object, then always place at 0
if self.ndim == 1:
new_mgr_locs = [0]
else:
if new_mgr_locs is None:
new_mgr_locs = self.mgr_locs
return self.make_block_same_class(new_values, new_mgr_locs)
def _astype(self, dtype, copy=False, errors='raise', values=None,
klass=None, mgr=None):
"""
        Coerce to the new type (if copy=True, return a new copy);
        raise on error if errors == 'raise'
"""
if self.is_categorical_astype(dtype):
values = self.values
else:
values = np.asarray(self.values).astype(dtype, copy=False)
if copy:
values = values.copy()
return self.make_block(values)
def to_native_types(self, slicer=None, na_rep='', quoting=None, **kwargs):
""" convert to our native types format, slicing if desired """
values = self.values
if slicer is not None:
# Categorical is always one dimension
values = values[slicer]
mask = isnull(values)
values = np.array(values, dtype='object')
values[mask] = na_rep
# we are expected to return a 2-d ndarray
return values.reshape(1, len(values))
class DatetimeBlock(DatetimeLikeBlockMixin, Block):
__slots__ = ()
is_datetime = True
_can_hold_na = True
def __init__(self, values, placement, fastpath=False, **kwargs):
if values.dtype != _NS_DTYPE:
values = tslib.cast_to_nanoseconds(values)
super(DatetimeBlock, self).__init__(values, fastpath=True,
placement=placement, **kwargs)
def _astype(self, dtype, mgr=None, **kwargs):
"""
these automatically copy, so copy=True has no effect
        raise on error if errors == 'raise'
"""
# if we are passed a datetime64[ns, tz]
if is_datetime64tz_dtype(dtype):
dtype = DatetimeTZDtype(dtype)
values = self.values
if getattr(values, 'tz', None) is None:
values = DatetimeIndex(values).tz_localize('UTC')
values = values.tz_convert(dtype.tz)
return self.make_block(values)
# delegate
return super(DatetimeBlock, self)._astype(dtype=dtype, **kwargs)
def _can_hold_element(self, element):
if is_list_like(element):
element = np.array(element)
return element.dtype == _NS_DTYPE or element.dtype == np.int64
return (is_integer(element) or isinstance(element, datetime) or
isnull(element))
def _try_cast(self, element):
try:
return int(element)
        except (TypeError, ValueError, OverflowError):
return element
def _try_coerce_args(self, values, other):
"""
Coerce values and other to dtype 'i8'. NaN and NaT convert to
the smallest i8, and will correctly round-trip to NaT if converted
back in _try_coerce_result. values is always ndarray-like, other
may not be
Parameters
----------
values : ndarray-like
other : ndarray-like or scalar
Returns
-------
base-type values, values mask, base-type other, other mask
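
        Examples
        --------
        Illustrative sketch of the coercion: datetime-like values
        round-trip through their ``i8`` nanosecond representation
        (``Timestamp('1970-01-01 00:00:01').value == 1000000000``);
        null scalars are mapped to ``tslib.iNaT``.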
"""
values_mask = isnull(values)
values = values.view('i8')
other_mask = False
if isinstance(other, bool):
raise TypeError
elif is_null_datelike_scalar(other):
other = tslib.iNaT
other_mask = True
elif isinstance(other, (datetime, np.datetime64, date)):
other = self._box_func(other)
            if getattr(other, 'tz', None) is not None:
raise TypeError("cannot coerce a Timestamp with a tz on a "
"naive Block")
other_mask = isnull(other)
other = other.asm8.view('i8')
elif hasattr(other, 'dtype') and is_integer_dtype(other):
other = other.view('i8')
else:
try:
other = np.asarray(other)
other_mask = isnull(other)
other = other.astype('i8', copy=False).view('i8')
except ValueError:
# coercion issues
# let higher levels handle
raise TypeError
return values, values_mask, other, other_mask
def _try_coerce_result(self, result):
""" reverse of try_coerce_args """
if isinstance(result, np.ndarray):
if result.dtype.kind in ['i', 'f', 'O']:
try:
result = result.astype('M8[ns]')
except ValueError:
pass
        elif isinstance(result, (np.integer, float, np.datetime64)):
result = self._box_func(result)
return result
@property
def _box_func(self):
return tslib.Timestamp
def to_native_types(self, slicer=None, na_rep=None, date_format=None,
quoting=None, **kwargs):
""" convert to our native types format, slicing if desired """
values = self.values
if slicer is not None:
values = values[..., slicer]
from pandas.io.formats.format import _get_format_datetime64_from_values
format = _get_format_datetime64_from_values(values, date_format)
result = tslib.format_array_from_datetime(
values.view('i8').ravel(), tz=getattr(self.values, 'tz', None),
format=format, na_rep=na_rep).reshape(values.shape)
return np.atleast_2d(result)
def should_store(self, value):
return (issubclass(value.dtype.type, np.datetime64) and
not is_datetimetz(value))
def set(self, locs, values, check=False):
"""
Modify Block in-place with new item value
Returns
-------
None
"""
if values.dtype != _NS_DTYPE:
# Workaround for numpy 1.6 bug
values = tslib.cast_to_nanoseconds(values)
self.values[locs] = values
class DatetimeTZBlock(NonConsolidatableMixIn, DatetimeBlock):
""" implement a datetime64 block with a tz attribute """
__slots__ = ()
_holder = DatetimeIndex
is_datetimetz = True
def __init__(self, values, placement, ndim=2, **kwargs):
if not isinstance(values, self._holder):
values = self._holder(values)
dtype = kwargs.pop('dtype', None)
if dtype is not None:
if isinstance(dtype, compat.string_types):
dtype = DatetimeTZDtype.construct_from_string(dtype)
values = values._shallow_copy(tz=dtype.tz)
if values.tz is None:
raise ValueError("cannot create a DatetimeTZBlock without a tz")
super(DatetimeTZBlock, self).__init__(values, placement=placement,
ndim=ndim, **kwargs)
def copy(self, deep=True, mgr=None):
""" copy constructor """
values = self.values
if deep:
values = values.copy(deep=True)
return self.make_block_same_class(values)
def external_values(self):
""" we internally represent the data as a DatetimeIndex, but for
external compat with ndarray, export as a ndarray of Timestamps
"""
return self.values.astype('datetime64[ns]').values
def get_values(self, dtype=None):
# return object dtype as Timestamps with the zones
if is_object_dtype(dtype):
f = lambda x: lib.Timestamp(x, tz=self.values.tz)
return lib.map_infer(
self.values.ravel(), f).reshape(self.values.shape)
return self.values
def to_object_block(self, mgr):
"""
return myself as an object block
        Since we keep the DTI as a 1-d object, the result differs
        depending on the BlockManager's ndim
"""
values = self.get_values(dtype=object)
kwargs = {}
if mgr.ndim > 1:
values = _block_shape(values, ndim=mgr.ndim)
kwargs['ndim'] = mgr.ndim
kwargs['placement'] = [0]
return self.make_block(values, klass=ObjectBlock, **kwargs)
def _slice(self, slicer):
""" return a slice of my values """
if isinstance(slicer, tuple):
col, loc = slicer
if not is_null_slice(col) and col != 0:
raise IndexError("{0} only contains one item".format(self))
return self.values[loc]
return self.values[slicer]
def _try_coerce_args(self, values, other):
"""
localize and return i8 for the values
Parameters
----------
values : ndarray-like
other : ndarray-like or scalar
Returns
-------
base-type values, values mask, base-type other, other mask
"""
values_mask = _block_shape(isnull(values), ndim=self.ndim)
# asi8 is a view, needs copy
values = _block_shape(values.asi8, ndim=self.ndim)
other_mask = False
if isinstance(other, ABCSeries):
other = self._holder(other)
other_mask = isnull(other)
if isinstance(other, bool):
raise TypeError
elif (is_null_datelike_scalar(other) or
(is_scalar(other) and isnull(other))):
other = tslib.iNaT
other_mask = True
elif isinstance(other, self._holder):
if other.tz != self.values.tz:
raise ValueError("incompatible or non tz-aware value")
other = other.asi8
other_mask = isnull(other)
elif isinstance(other, (np.datetime64, datetime, date)):
other = lib.Timestamp(other)
tz = getattr(other, 'tz', None)
# test we can have an equal time zone
if tz is None or str(tz) != str(self.values.tz):
raise ValueError("incompatible or non tz-aware value")
other_mask = isnull(other)
other = other.value
return values, values_mask, other, other_mask
def _try_coerce_result(self, result):
""" reverse of try_coerce_args """
if isinstance(result, np.ndarray):
if result.dtype.kind in ['i', 'f', 'O']:
result = result.astype('M8[ns]')
        elif isinstance(result, (np.integer, float, np.datetime64)):
result = lib.Timestamp(result, tz=self.values.tz)
if isinstance(result, np.ndarray):
# allow passing of > 1dim if its trivial
if result.ndim > 1:
result = result.reshape(np.prod(result.shape))
result = self.values._shallow_copy(result)
return result
@property
def _box_func(self):
return lambda x: tslib.Timestamp(x, tz=self.dtype.tz)
def shift(self, periods, axis=0, mgr=None):
""" shift the block by periods """
# think about moving this to the DatetimeIndex. This is a non-freq
        # (number of periods) shift
N = len(self)
indexer = np.zeros(N, dtype=int)
if periods > 0:
indexer[periods:] = np.arange(N - periods)
else:
indexer[:periods] = np.arange(-periods, N)
new_values = self.values.asi8.take(indexer)
if periods > 0:
new_values[:periods] = tslib.iNaT
else:
new_values[periods:] = tslib.iNaT
new_values = self.values._shallow_copy(new_values)
return [self.make_block_same_class(new_values,
placement=self.mgr_locs)]
class SparseBlock(NonConsolidatableMixIn, Block):
""" implement as a list of sparse arrays of the same dtype """
__slots__ = ()
is_sparse = True
is_numeric = True
_box_to_block_values = False
_can_hold_na = True
_ftype = 'sparse'
_holder = SparseArray
@property
def shape(self):
return (len(self.mgr_locs), self.sp_index.length)
@property
def itemsize(self):
return self.dtype.itemsize
@property
def fill_value(self):
# return np.nan
return self.values.fill_value
@fill_value.setter
def fill_value(self, v):
self.values.fill_value = v
def to_dense(self):
return self.values.to_dense().view()
@property
def sp_values(self):
return self.values.sp_values
@sp_values.setter
def sp_values(self, v):
# reset the sparse values
self.values = SparseArray(v, sparse_index=self.sp_index,
kind=self.kind, dtype=v.dtype,
fill_value=self.values.fill_value,
copy=False)
@property
def sp_index(self):
return self.values.sp_index
@property
def kind(self):
return self.values.kind
def _astype(self, dtype, copy=False, raise_on_error=True, values=None,
klass=None, mgr=None, **kwargs):
if values is None:
values = self.values
values = values.astype(dtype, copy=copy)
return self.make_block_same_class(values=values,
placement=self.mgr_locs)
def __len__(self):
try:
return self.sp_index.length
        except AttributeError:
return 0
def copy(self, deep=True, mgr=None):
return self.make_block_same_class(values=self.values,
sparse_index=self.sp_index,
kind=self.kind, copy=deep,
placement=self.mgr_locs)
def make_block_same_class(self, values, placement, sparse_index=None,
kind=None, dtype=None, fill_value=None,
copy=False, fastpath=True, **kwargs):
""" return a new block """
if dtype is None:
dtype = values.dtype
if fill_value is None and not isinstance(values, SparseArray):
fill_value = self.values.fill_value
# if not isinstance(values, SparseArray) and values.ndim != self.ndim:
# raise ValueError("ndim mismatch")
if values.ndim == 2:
nitems = values.shape[0]
if nitems == 0:
# kludgy, but SparseBlocks cannot handle slices, where the
# output is 0-item, so let's convert it to a dense block: it
# won't take space since there's 0 items, plus it will preserve
# the dtype.
return self.make_block(np.empty(values.shape, dtype=dtype),
placement,
fastpath=True)
elif nitems > 1:
raise ValueError("Only 1-item 2d sparse blocks are supported")
else:
values = values.reshape(values.shape[1])
new_values = SparseArray(values, sparse_index=sparse_index,
kind=kind or self.kind, dtype=dtype,
fill_value=fill_value, copy=copy)
return self.make_block(new_values, fastpath=fastpath,
placement=placement)
def interpolate(self, method='pad', axis=0, inplace=False, limit=None,
fill_value=None, **kwargs):
values = missing.interpolate_2d(self.values.to_dense(), method, axis,
limit, fill_value)
return self.make_block_same_class(values=values,
placement=self.mgr_locs)
def fillna(self, value, limit=None, inplace=False, downcast=None,
mgr=None):
# we may need to upcast our fill to match our dtype
if limit is not None:
raise NotImplementedError("specifying a limit for 'fillna' has "
"not been implemented yet")
values = self.values if inplace else self.values.copy()
values = values.fillna(value, downcast=downcast)
return [self.make_block_same_class(values=values,
placement=self.mgr_locs)]
def shift(self, periods, axis=0, mgr=None):
""" shift the block by periods """
N = len(self.values.T)
indexer = np.zeros(N, dtype=int)
if periods > 0:
indexer[periods:] = np.arange(N - periods)
else:
indexer[:periods] = np.arange(-periods, N)
new_values = self.values.to_dense().take(indexer)
# convert integer to float if necessary. need to do a lot more than
# that, handle boolean etc also
new_values, fill_value = maybe_upcast(new_values)
if periods > 0:
new_values[:periods] = fill_value
else:
new_values[periods:] = fill_value
return [self.make_block_same_class(new_values,
placement=self.mgr_locs)]
def reindex_axis(self, indexer, method=None, axis=1, fill_value=None,
limit=None, mask_info=None):
"""
Reindex using pre-computed indexer information
"""
if axis < 1:
raise AssertionError('axis must be at least 1, got %d' % axis)
# taking on the 0th axis always here
if fill_value is None:
fill_value = self.fill_value
return self.make_block_same_class(self.values.take(indexer),
fill_value=fill_value,
placement=self.mgr_locs)
def sparse_reindex(self, new_index):
""" sparse reindex and return a new block
current reindex only works for float64 dtype! """
values = self.values
values = values.sp_index.to_int_index().reindex(
values.sp_values.astype('float64'), values.fill_value, new_index)
return self.make_block_same_class(values, sparse_index=new_index,
placement=self.mgr_locs)
def make_block(values, placement, klass=None, ndim=None, dtype=None,
fastpath=False):
if klass is None:
dtype = dtype or values.dtype
vtype = dtype.type
if isinstance(values, SparseArray):
klass = SparseBlock
elif issubclass(vtype, np.floating):
klass = FloatBlock
        elif (issubclass(vtype, np.integer) and
                issubclass(vtype, np.timedelta64)):
            # np.timedelta64 subclasses np.signedinteger, so this check
            # must come before the plain-integer check below
            klass = TimeDeltaBlock
elif (issubclass(vtype, np.integer) and
not issubclass(vtype, np.datetime64)):
klass = IntBlock
elif dtype == np.bool_:
klass = BoolBlock
elif issubclass(vtype, np.datetime64):
if hasattr(values, 'tz'):
klass = DatetimeTZBlock
else:
klass = DatetimeBlock
elif is_datetimetz(values):
klass = DatetimeTZBlock
elif issubclass(vtype, np.complexfloating):
klass = ComplexBlock
elif is_categorical(values):
klass = CategoricalBlock
else:
klass = ObjectBlock
elif klass is DatetimeTZBlock and not is_datetimetz(values):
return klass(values, ndim=ndim, fastpath=fastpath,
placement=placement, dtype=dtype)
return klass(values, ndim=ndim, fastpath=fastpath, placement=placement)
# TODO: flexible with index=None and/or items=None
class BlockManager(PandasObject):
"""
Core internal data structure to implement DataFrame, Series, Panel, etc.
Manage a bunch of labeled 2D mixed-type ndarrays. Essentially it's a
lightweight blocked set of labeled data to be manipulated by the DataFrame
public API class
Attributes
----------
shape
ndim
axes
values
items
Methods
-------
set_axis(axis, new_labels)
copy(deep=True)
get_dtype_counts
get_ftype_counts
get_dtypes
get_ftypes
apply(func, axes, block_filter_fn)
get_bool_data
get_numeric_data
get_slice(slice_like, axis)
get(label)
iget(loc)
get_scalar(label_tup)
take(indexer, axis)
reindex_axis(new_labels, axis)
reindex_indexer(new_labels, indexer, axis)
delete(label)
insert(loc, label, value)
set(label, value)
Parameters
----------
Notes
-----
This is *not* a public API class
"""
__slots__ = ['axes', 'blocks', '_ndim', '_shape', '_known_consolidated',
'_is_consolidated', '_blknos', '_blklocs']
def __init__(self, blocks, axes, do_integrity_check=True, fastpath=True):
self.axes = [_ensure_index(ax) for ax in axes]
self.blocks = tuple(blocks)
for block in blocks:
if block.is_sparse:
if len(block.mgr_locs) != 1:
raise AssertionError("Sparse block refers to multiple "
"items")
else:
if self.ndim != block.ndim:
raise AssertionError('Number of Block dimensions (%d) '
'must equal number of axes (%d)' %
(block.ndim, self.ndim))
if do_integrity_check:
self._verify_integrity()
self._consolidate_check()
self._rebuild_blknos_and_blklocs()
def make_empty(self, axes=None):
""" return an empty BlockManager with the items axis of len 0 """
if axes is None:
axes = [_ensure_index([])] + [_ensure_index(a)
for a in self.axes[1:]]
# preserve dtype if possible
if self.ndim == 1:
blocks = np.array([], dtype=self.array_dtype)
else:
blocks = []
return self.__class__(blocks, axes)
def __nonzero__(self):
return True
# Python3 compat
__bool__ = __nonzero__
@property
def shape(self):
return tuple(len(ax) for ax in self.axes)
@property
def ndim(self):
return len(self.axes)
def set_axis(self, axis, new_labels):
new_labels = _ensure_index(new_labels)
old_len = len(self.axes[axis])
new_len = len(new_labels)
if new_len != old_len:
raise ValueError('Length mismatch: Expected axis has %d elements, '
'new values have %d elements' %
(old_len, new_len))
self.axes[axis] = new_labels
def rename_axis(self, mapper, axis, copy=True, level=None):
"""
Rename one of axes.
Parameters
----------
mapper : unary callable
axis : int
copy : boolean, default True
level : int, default None
"""
obj = self.copy(deep=copy)
obj.set_axis(axis, _transform_index(self.axes[axis], mapper, level))
return obj
def add_prefix(self, prefix):
f = (str(prefix) + '%s').__mod__
return self.rename_axis(f, axis=0)
def add_suffix(self, suffix):
f = ('%s' + str(suffix)).__mod__
return self.rename_axis(f, axis=0)
@property
def _is_single_block(self):
if self.ndim == 1:
return True
if len(self.blocks) != 1:
return False
blk = self.blocks[0]
return (blk.mgr_locs.is_slice_like and
blk.mgr_locs.as_slice == slice(0, len(self), 1))
def _rebuild_blknos_and_blklocs(self):
"""
Update mgr._blknos / mgr._blklocs.
"""
new_blknos = np.empty(self.shape[0], dtype=np.int64)
new_blklocs = np.empty(self.shape[0], dtype=np.int64)
new_blknos.fill(-1)
new_blklocs.fill(-1)
for blkno, blk in enumerate(self.blocks):
rl = blk.mgr_locs
new_blknos[rl.indexer] = blkno
new_blklocs[rl.indexer] = np.arange(len(rl))
if (new_blknos == -1).any():
raise AssertionError("Gaps in blk ref_locs")
self._blknos = new_blknos
self._blklocs = new_blklocs
# make items read only for now
def _get_items(self):
return self.axes[0]
items = property(fget=_get_items)
def _get_counts(self, f):
""" return a dict of the counts of the function in BlockManager """
self._consolidate_inplace()
counts = dict()
for b in self.blocks:
v = f(b)
counts[v] = counts.get(v, 0) + b.shape[0]
return counts
def get_dtype_counts(self):
return self._get_counts(lambda b: b.dtype.name)
def get_ftype_counts(self):
return self._get_counts(lambda b: b.ftype)
def get_dtypes(self):
dtypes = np.array([blk.dtype for blk in self.blocks])
return algos.take_1d(dtypes, self._blknos, allow_fill=False)
def get_ftypes(self):
ftypes = np.array([blk.ftype for blk in self.blocks])
return algos.take_1d(ftypes, self._blknos, allow_fill=False)
def __getstate__(self):
block_values = [b.values for b in self.blocks]
block_items = [self.items[b.mgr_locs.indexer] for b in self.blocks]
axes_array = [ax for ax in self.axes]
extra_state = {
'0.14.1': {
'axes': axes_array,
'blocks': [dict(values=b.values, mgr_locs=b.mgr_locs.indexer)
for b in self.blocks]
}
}
# First three elements of the state are to maintain forward
# compatibility with 0.13.1.
return axes_array, block_values, block_items, extra_state
def __setstate__(self, state):
def unpickle_block(values, mgr_locs):
# numpy < 1.7 pickle compat
if values.dtype == 'M8[us]':
values = values.astype('M8[ns]')
return make_block(values, placement=mgr_locs)
if (isinstance(state, tuple) and len(state) >= 4 and
'0.14.1' in state[3]):
state = state[3]['0.14.1']
self.axes = [_ensure_index(ax) for ax in state['axes']]
self.blocks = tuple(unpickle_block(b['values'], b['mgr_locs'])
for b in state['blocks'])
else:
# discard anything after 3rd, support beta pickling format for a
# little while longer
ax_arrays, bvalues, bitems = state[:3]
self.axes = [_ensure_index(ax) for ax in ax_arrays]
if len(bitems) == 1 and self.axes[0].equals(bitems[0]):
# This is a workaround for pre-0.14.1 pickles that didn't
# support unpickling multi-block frames/panels with non-unique
# columns/items, because given a manager with items ["a", "b",
# "a"] there's no way of knowing which block's "a" is where.
#
# Single-block case can be supported under the assumption that
# block items corresponded to manager items 1-to-1.
all_mgr_locs = [slice(0, len(bitems[0]))]
else:
all_mgr_locs = [self.axes[0].get_indexer(blk_items)
for blk_items in bitems]
self.blocks = tuple(
unpickle_block(values, mgr_locs)
for values, mgr_locs in zip(bvalues, all_mgr_locs))
self._post_setstate()
def _post_setstate(self):
self._is_consolidated = False
self._known_consolidated = False
self._rebuild_blknos_and_blklocs()
def __len__(self):
return len(self.items)
def __unicode__(self):
output = pprint_thing(self.__class__.__name__)
for i, ax in enumerate(self.axes):
if i == 0:
output += u('\nItems: %s') % ax
else:
output += u('\nAxis %d: %s') % (i, ax)
for block in self.blocks:
output += u('\n%s') % pprint_thing(block)
return output
def _verify_integrity(self):
mgr_shape = self.shape
tot_items = sum(len(x.mgr_locs) for x in self.blocks)
for block in self.blocks:
if block._verify_integrity and block.shape[1:] != mgr_shape[1:]:
construction_error(tot_items, block.shape[1:], self.axes)
if len(self.items) != tot_items:
raise AssertionError('Number of manager items must equal union of '
'block items\n# manager items: {0}, # '
'tot_items: {1}'.format(
len(self.items), tot_items))
def apply(self, f, axes=None, filter=None, do_integrity_check=False,
consolidate=True, **kwargs):
"""
iterate over the blocks, collect and create a new block manager
Parameters
----------
f : the callable or function name to operate on at the block level
axes : optional (if not supplied, use self.axes)
filter : list, if supplied, only call the block if the filter is in
the block
do_integrity_check : boolean, default False. Do the block manager
integrity check
consolidate: boolean, default True. Join together blocks having same
dtype
Returns
-------
Block Manager (new object)
"""
result_blocks = []
# filter kwarg is used in replace-* family of methods
if filter is not None:
filter_locs = set(self.items.get_indexer_for(filter))
if len(filter_locs) == len(self.items):
# All items are included, as if there were no filtering
filter = None
else:
kwargs['filter'] = filter_locs
if consolidate:
self._consolidate_inplace()
if f == 'where':
align_copy = True
if kwargs.get('align', True):
align_keys = ['other', 'cond']
else:
align_keys = ['cond']
elif f == 'putmask':
align_copy = False
if kwargs.get('align', True):
align_keys = ['new', 'mask']
else:
align_keys = ['mask']
elif f == 'eval':
align_copy = False
align_keys = ['other']
elif f == 'fillna':
# fillna internally does putmask, maybe it's better to do this
# at mgr, not block level?
align_copy = False
align_keys = ['value']
else:
align_keys = []
aligned_args = dict((k, kwargs[k])
for k in align_keys
if hasattr(kwargs[k], 'reindex_axis'))
for b in self.blocks:
if filter is not None:
if not b.mgr_locs.isin(filter_locs).any():
result_blocks.append(b)
continue
if aligned_args:
b_items = self.items[b.mgr_locs.indexer]
for k, obj in aligned_args.items():
axis = getattr(obj, '_info_axis_number', 0)
kwargs[k] = obj.reindex_axis(b_items, axis=axis,
copy=align_copy)
kwargs['mgr'] = self
applied = getattr(b, f)(**kwargs)
result_blocks = _extend_blocks(applied, result_blocks)
if len(result_blocks) == 0:
return self.make_empty(axes or self.axes)
bm = self.__class__(result_blocks, axes or self.axes,
do_integrity_check=do_integrity_check)
bm._consolidate_inplace()
return bm
def reduction(self, f, axis=0, consolidate=True, transposed=False,
**kwargs):
"""
iterate over the blocks, collect and create a new block manager.
This routine is intended for reduction type operations and
will do inference on the generated blocks.
Parameters
----------
f: the callable or function name to operate on at the block level
axis: reduction axis, default 0
consolidate: boolean, default True. Join together blocks having same
dtype
transposed: boolean, default False
we are holding transposed data
Returns
-------
Block Manager (new object)
"""
if consolidate:
self._consolidate_inplace()
axes, blocks = [], []
for b in self.blocks:
kwargs['mgr'] = self
axe, block = getattr(b, f)(axis=axis, **kwargs)
axes.append(axe)
blocks.append(block)
# note that some DatetimeTZ, Categorical are always ndim==1
ndim = set([b.ndim for b in blocks])
if 2 in ndim:
new_axes = list(self.axes)
# multiple blocks that are reduced
if len(blocks) > 1:
new_axes[1] = axes[0]
# reset the placement to the original
for b, sb in zip(blocks, self.blocks):
b.mgr_locs = sb.mgr_locs
else:
new_axes[axis] = Index(np.concatenate(
[ax.values for ax in axes]))
if transposed:
new_axes = new_axes[::-1]
blocks = [b.make_block(b.values.T,
placement=np.arange(b.shape[1])
) for b in blocks]
return self.__class__(blocks, new_axes)
# 0 ndim
if 0 in ndim and 1 not in ndim:
values = np.array([b.values for b in blocks])
if len(values) == 1:
return values.item()
blocks = [make_block(values, ndim=1)]
axes = Index([ax[0] for ax in axes])
# single block
values = _concat._concat_compat([b.values for b in blocks])
# compute the orderings of our original data
if len(self.blocks) > 1:
indexer = np.empty(len(self.axes[0]), dtype=np.intp)
i = 0
for b in self.blocks:
for j in b.mgr_locs:
indexer[j] = i
i = i + 1
values = values.take(indexer)
return SingleBlockManager(
[make_block(values,
ndim=1,
placement=np.arange(len(values)))],
axes[0])
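The ordering step at the end of `reduction` can be sketched in isolation: per-block results come out grouped block by block, so an inverse indexer maps each original item position back to its reduced value. The block layout here is a made-up example:

```python
import numpy as np

# two hypothetical blocks whose items interleave in the original axis
block_mgr_locs = [[0, 2], [1, 3]]
# reduced values arrive block-grouped: block 0's results, then block 1's
reduced = np.array([10, 20, 30, 40])

indexer = np.empty(4, dtype=np.intp)
i = 0
for locs in block_mgr_locs:
    for j in locs:
        indexer[j] = i  # original position j came out at block-grouped slot i
        i += 1

# take() restores the original item order
print(reduced.take(indexer).tolist())  # [10, 30, 20, 40]
```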
def isnull(self, **kwargs):
return self.apply('apply', **kwargs)
def where(self, **kwargs):
return self.apply('where', **kwargs)
def eval(self, **kwargs):
return self.apply('eval', **kwargs)
def quantile(self, **kwargs):
return self.reduction('quantile', **kwargs)
def setitem(self, **kwargs):
return self.apply('setitem', **kwargs)
def putmask(self, **kwargs):
return self.apply('putmask', **kwargs)
def diff(self, **kwargs):
return self.apply('diff', **kwargs)
def interpolate(self, **kwargs):
return self.apply('interpolate', **kwargs)
def shift(self, **kwargs):
return self.apply('shift', **kwargs)
def fillna(self, **kwargs):
return self.apply('fillna', **kwargs)
def downcast(self, **kwargs):
return self.apply('downcast', **kwargs)
def astype(self, dtype, **kwargs):
return self.apply('astype', dtype=dtype, **kwargs)
def convert(self, **kwargs):
return self.apply('convert', **kwargs)
def replace(self, **kwargs):
return self.apply('replace', **kwargs)
def replace_list(self, src_list, dest_list, inplace=False, regex=False,
mgr=None):
""" do a list replace """
inplace = validate_bool_kwarg(inplace, 'inplace')
if mgr is None:
mgr = self
# figure out our mask a-priori to avoid repeated replacements
values = self.as_matrix()
def comp(s):
if isnull(s):
return isnull(values)
return _maybe_compare(values, getattr(s, 'asm8', s), operator.eq)
def _cast_scalar(block, scalar):
dtype, val = infer_dtype_from_scalar(scalar, pandas_dtype=True)
if not is_dtype_equal(block.dtype, dtype):
dtype = find_common_type([block.dtype, dtype])
block = block.astype(dtype)
# use original value
val = scalar
return block, val
masks = [comp(s) for i, s in enumerate(src_list)]
result_blocks = []
src_len = len(src_list) - 1
for blk in self.blocks:
# it's possible to get multiple result blocks here
# replace ALWAYS will return a list
rb = [blk if inplace else blk.copy()]
for i, (s, d) in enumerate(zip(src_list, dest_list)):
new_rb = []
for b in rb:
if b.dtype == np.object_:
convert = i == src_len
result = b.replace(s, d, inplace=inplace, regex=regex,
mgr=mgr, convert=convert)
new_rb = _extend_blocks(result, new_rb)
else:
# get our mask for this element, sized to this
# particular block
m = masks[i][b.mgr_locs.indexer]
if m.any():
b, val = _cast_scalar(b, d)
new_rb.extend(b.putmask(m, val, inplace=True))
else:
new_rb.append(b)
rb = new_rb
result_blocks.extend(rb)
bm = self.__class__(result_blocks, self.axes)
bm._consolidate_inplace()
return bm
def reshape_nd(self, axes, **kwargs):
""" a 2d-nd reshape operation on a BlockManager """
return self.apply('reshape_nd', axes=axes, **kwargs)
def is_consolidated(self):
"""
Return True if the blocks are consolidated, i.e. no two blocks
share the same ftype
"""
if not self._known_consolidated:
self._consolidate_check()
return self._is_consolidated
def _consolidate_check(self):
ftypes = [blk.ftype for blk in self.blocks]
self._is_consolidated = len(ftypes) == len(set(ftypes))
self._known_consolidated = True
@property
def is_mixed_type(self):
# Warning, consolidation needs to get checked upstairs
self._consolidate_inplace()
return len(self.blocks) > 1
@property
def is_numeric_mixed_type(self):
# Warning, consolidation needs to get checked upstairs
self._consolidate_inplace()
return all([block.is_numeric for block in self.blocks])
@property
def is_datelike_mixed_type(self):
# Warning, consolidation needs to get checked upstairs
self._consolidate_inplace()
return any([block.is_datelike for block in self.blocks])
@property
def is_view(self):
""" return a boolean if we are a single block and are a view """
if len(self.blocks) == 1:
return self.blocks[0].is_view
# It is technically possible to figure out which blocks are views
# e.g. [ b.values.base is not None for b in self.blocks ]
# but then we have the case of possibly some blocks being a view
# and some blocks not. setting in theory is possible on the non-view
# blocks w/o causing a SettingWithCopy raise/warn. But this is a bit
# complicated
return False
def get_bool_data(self, copy=False):
"""
Parameters
----------
copy : boolean, default False
Whether to copy the blocks
"""
self._consolidate_inplace()
return self.combine([b for b in self.blocks if b.is_bool], copy)
def get_numeric_data(self, copy=False):
"""
Parameters
----------
copy : boolean, default False
Whether to copy the blocks
"""
self._consolidate_inplace()
return self.combine([b for b in self.blocks if b.is_numeric], copy)
def combine(self, blocks, copy=True):
""" return a new manager with the blocks """
if len(blocks) == 0:
return self.make_empty()
# FIXME: optimization potential
indexer = np.sort(np.concatenate([b.mgr_locs.as_array
for b in blocks]))
inv_indexer = lib.get_reverse_indexer(indexer, self.shape[0])
new_blocks = []
for b in blocks:
b = b.copy(deep=copy)
b.mgr_locs = algos.take_1d(inv_indexer, b.mgr_locs.as_array,
axis=0, allow_fill=False)
new_blocks.append(b)
axes = list(self.axes)
axes[0] = self.items.take(indexer)
return self.__class__(new_blocks, axes, do_integrity_check=False)
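The loc renumbering in `combine` can be sketched with numpy alone (hypothetical locations): after dropping some blocks, the surviving items are compacted into a smaller axis, and each kept block's manager locations must be remapped into it via an inverse indexer:

```python
import numpy as np

n_items = 6
# manager locations of the two surviving blocks (made-up layout)
kept_locs = [np.array([1, 4]), np.array([2])]

# positions of all surviving items, in axis order
indexer = np.sort(np.concatenate(kept_locs))      # [1, 2, 4]
# inverse map: old position -> position in the compacted axis
inv_indexer = np.full(n_items, -1, dtype=np.intp)
inv_indexer[indexer] = np.arange(len(indexer))

print(inv_indexer[kept_locs[0]].tolist())  # [0, 2]
print(inv_indexer[kept_locs[1]].tolist())  # [1]
```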
def get_slice(self, slobj, axis=0):
if axis >= self.ndim:
raise IndexError("Requested axis not found in manager")
if axis == 0:
new_blocks = self._slice_take_blocks_ax0(slobj)
else:
slicer = [slice(None)] * (axis + 1)
slicer[axis] = slobj
slicer = tuple(slicer)
new_blocks = [blk.getitem_block(slicer) for blk in self.blocks]
new_axes = list(self.axes)
new_axes[axis] = new_axes[axis][slobj]
bm = self.__class__(new_blocks, new_axes, do_integrity_check=False,
fastpath=True)
bm._consolidate_inplace()
return bm
def __contains__(self, item):
return item in self.items
@property
def nblocks(self):
return len(self.blocks)
def copy(self, deep=True, mgr=None):
"""
Make deep or shallow copy of BlockManager
Parameters
----------
deep : boolean or string, default True
If False, return shallow copy (do not copy data)
If 'all', copy data and a deep copy of the index
Returns
-------
copy : BlockManager
"""
# this preserves the notion of view copying of axes
if deep:
if deep == 'all':
copy = lambda ax: ax.copy(deep=True)
else:
copy = lambda ax: ax.view()
new_axes = [copy(ax) for ax in self.axes]
else:
new_axes = list(self.axes)
return self.apply('copy', axes=new_axes, deep=deep,
do_integrity_check=False)
def as_matrix(self, items=None):
if len(self.blocks) == 0:
return np.empty(self.shape, dtype=float)
if items is not None:
mgr = self.reindex_axis(items, axis=0)
else:
mgr = self
if self._is_single_block or not self.is_mixed_type:
return mgr.blocks[0].get_values()
else:
return mgr._interleave()
def _interleave(self):
"""
Return ndarray from blocks with specified item order
Items must be contained in the blocks
"""
dtype = _interleaved_dtype(self.blocks)
result = np.empty(self.shape, dtype=dtype)
if result.shape[0] == 0:
# Workaround for numpy 1.7 bug:
#
# >>> a = np.empty((0,10))
# >>> a[slice(0,0)]
# array([], shape=(0, 10), dtype=float64)
# >>> a[[]]
# Traceback (most recent call last):
# File "<stdin>", line 1, in <module>
# IndexError: index 0 is out of bounds for axis 0 with size 0
return result
itemmask = np.zeros(self.shape[0])
for blk in self.blocks:
rl = blk.mgr_locs
result[rl.indexer] = blk.get_values(dtype)
itemmask[rl.indexer] = 1
if not itemmask.all():
raise AssertionError('Some items were not contained in blocks')
return result
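The core of `_interleave` is a scatter of each block's 2-D values back into item order, upcast to a common dtype. A minimal sketch with assumed shapes and placements:

```python
import numpy as np

shape = (3, 4)  # 3 items x 4 values (hypothetical)
int_block = np.arange(8).reshape(2, 4)            # holds items 0 and 2
float_block = np.linspace(0, 1, 4).reshape(1, 4)  # holds item 1
placements = [np.array([0, 2]), np.array([1])]

result = np.empty(shape, dtype=np.float64)  # common upcast dtype
itemmask = np.zeros(shape[0], dtype=bool)
for values, locs in zip((int_block, float_block), placements):
    result[locs] = values   # scatter block rows to their item positions
    itemmask[locs] = True

# every item must be filled by exactly one block
assert itemmask.all()
print(result[0].tolist())  # [0.0, 1.0, 2.0, 3.0]
```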
def xs(self, key, axis=1, copy=True, takeable=False):
if axis < 1:
raise AssertionError('Can only take xs across axis >= 1, got %d' %
axis)
# take by position
if takeable:
loc = key
else:
loc = self.axes[axis].get_loc(key)
slicer = [slice(None, None) for _ in range(self.ndim)]
slicer[axis] = loc
slicer = tuple(slicer)
new_axes = list(self.axes)
# could be an array indexer!
if isinstance(loc, (slice, np.ndarray)):
new_axes[axis] = new_axes[axis][loc]
else:
new_axes.pop(axis)
new_blocks = []
if len(self.blocks) > 1:
# we must copy here as we are mixed type
for blk in self.blocks:
newb = make_block(values=blk.values[slicer],
klass=blk.__class__, fastpath=True,
placement=blk.mgr_locs)
new_blocks.append(newb)
elif len(self.blocks) == 1:
block = self.blocks[0]
vals = block.values[slicer]
if copy:
vals = vals.copy()
new_blocks = [make_block(values=vals,
placement=block.mgr_locs,
klass=block.__class__,
fastpath=True, )]
return self.__class__(new_blocks, new_axes)
def fast_xs(self, loc):
"""
get a cross-section for a given location in the items;
handles duplicates.
The result *could* be a view in the case of a single block.
"""
if len(self.blocks) == 1:
return self.blocks[0].iget((slice(None), loc))
items = self.items
# non-unique (GH4726)
if not items.is_unique:
result = self._interleave()
if self.ndim == 2:
result = result.T
return result[loc]
# unique
dtype = _interleaved_dtype(self.blocks)
n = len(items)
result = np.empty(n, dtype=dtype)
for blk in self.blocks:
# Such assignment may incorrectly coerce NaT to None
# result[blk.mgr_locs] = blk._slice((slice(None), loc))
for i, rl in enumerate(blk.mgr_locs):
result[rl] = blk._try_coerce_result(blk.iget((i, loc)))
return result
def consolidate(self):
"""
Join together blocks having same dtype
Returns
-------
y : BlockManager
"""
if self.is_consolidated():
return self
bm = self.__class__(self.blocks, self.axes)
bm._is_consolidated = False
bm._consolidate_inplace()
return bm
def _consolidate_inplace(self):
if not self.is_consolidated():
self.blocks = tuple(_consolidate(self.blocks))
self._is_consolidated = True
self._known_consolidated = True
self._rebuild_blknos_and_blklocs()
def get(self, item, fastpath=True):
"""
Return values for selected item (ndarray or BlockManager).
"""
if self.items.is_unique:
if not isnull(item):
loc = self.items.get_loc(item)
else:
indexer = np.arange(len(self.items))[isnull(self.items)]
# allow a single nan location indexer
if not is_scalar(indexer):
if len(indexer) == 1:
loc = indexer.item()
else:
raise ValueError("cannot label index with a null key")
return self.iget(loc, fastpath=fastpath)
else:
if isnull(item):
raise TypeError("cannot label index with a null key")
indexer = self.items.get_indexer_for([item])
return self.reindex_indexer(new_axis=self.items[indexer],
indexer=indexer, axis=0,
allow_dups=True)
def iget(self, i, fastpath=True):
"""
Return the data as a SingleBlockManager if fastpath=True and possible;
otherwise return an ndarray
"""
block = self.blocks[self._blknos[i]]
values = block.iget(self._blklocs[i])
if not fastpath or not block._box_to_block_values or values.ndim != 1:
return values
# fastpath shortcut for select a single-dim from a 2-dim BM
return SingleBlockManager(
[block.make_block_same_class(values,
placement=slice(0, len(values)),
ndim=1, fastpath=True)],
self.axes[1])
def get_scalar(self, tup):
"""
Retrieve single item
"""
full_loc = list(ax.get_loc(x) for ax, x in zip(self.axes, tup))
blk = self.blocks[self._blknos[full_loc[0]]]
values = blk.values
# FIXME: this may return non-upcasted types?
if values.ndim == 1:
return values[full_loc[1]]
full_loc[0] = self._blklocs[full_loc[0]]
return values[tuple(full_loc)]
def delete(self, item):
"""
Delete selected item (items if non-unique) in-place.
"""
indexer = self.items.get_loc(item)
is_deleted = np.zeros(self.shape[0], dtype=np.bool_)
is_deleted[indexer] = True
ref_loc_offset = -is_deleted.cumsum()
is_blk_deleted = [False] * len(self.blocks)
if isinstance(indexer, int):
affected_start = indexer
else:
affected_start = is_deleted.nonzero()[0][0]
for blkno, _ in _fast_count_smallints(self._blknos[affected_start:]):
blk = self.blocks[blkno]
bml = blk.mgr_locs
blk_del = is_deleted[bml.indexer].nonzero()[0]
if len(blk_del) == len(bml):
is_blk_deleted[blkno] = True
continue
elif len(blk_del) != 0:
blk.delete(blk_del)
bml = blk.mgr_locs
blk.mgr_locs = bml.add(ref_loc_offset[bml.indexer])
# FIXME: use Index.delete as soon as it uses fastpath=True
self.axes[0] = self.items[~is_deleted]
self.blocks = tuple(b for blkno, b in enumerate(self.blocks)
if not is_blk_deleted[blkno])
self._shape = None
self._rebuild_blknos_and_blklocs()
def set(self, item, value, check=False):
"""
Set new item in-place. Does not consolidate. Adds a new Block if the
item is not contained in the current set of items.
If check is True, validate that we are not setting the same data
in-place.
"""
# FIXME: refactor, clearly separate broadcasting & zip-like assignment
# can prob also fix the various if tests for sparse/categorical
value_is_extension_type = is_extension_type(value)
# categorical/sparse/datetimetz
if value_is_extension_type:
def value_getitem(placement):
return value
else:
if value.ndim == self.ndim - 1:
value = _safe_reshape(value, (1,) + value.shape)
def value_getitem(placement):
return value
else:
def value_getitem(placement):
return value[placement.indexer]
if value.shape[1:] != self.shape[1:]:
raise AssertionError('Shape of new values must be compatible '
'with manager shape')
try:
loc = self.items.get_loc(item)
except KeyError:
# This item wasn't present, just insert at end
self.insert(len(self.items), item, value)
return
if isinstance(loc, int):
loc = [loc]
blknos = self._blknos[loc]
blklocs = self._blklocs[loc].copy()
unfit_mgr_locs = []
unfit_val_locs = []
removed_blknos = []
for blkno, val_locs in _get_blkno_placements(blknos, len(self.blocks),
group=True):
blk = self.blocks[blkno]
blk_locs = blklocs[val_locs.indexer]
if blk.should_store(value):
blk.set(blk_locs, value_getitem(val_locs), check=check)
else:
unfit_mgr_locs.append(blk.mgr_locs.as_array[blk_locs])
unfit_val_locs.append(val_locs)
# If all block items are unfit, schedule the block for removal.
if len(val_locs) == len(blk.mgr_locs):
removed_blknos.append(blkno)
else:
self._blklocs[blk.mgr_locs.indexer] = -1
blk.delete(blk_locs)
self._blklocs[blk.mgr_locs.indexer] = np.arange(len(blk))
if len(removed_blknos):
# Remove blocks & update blknos accordingly
is_deleted = np.zeros(self.nblocks, dtype=np.bool_)
is_deleted[removed_blknos] = True
new_blknos = np.empty(self.nblocks, dtype=np.int64)
new_blknos.fill(-1)
new_blknos[~is_deleted] = np.arange(self.nblocks -
len(removed_blknos))
self._blknos = algos.take_1d(new_blknos, self._blknos, axis=0,
allow_fill=False)
self.blocks = tuple(blk for i, blk in enumerate(self.blocks)
if i not in set(removed_blknos))
if unfit_val_locs:
unfit_mgr_locs = np.concatenate(unfit_mgr_locs)
unfit_count = len(unfit_mgr_locs)
new_blocks = []
if value_is_extension_type:
# This code (ab-)uses the fact that sparse blocks contain only
# one item.
new_blocks.extend(
make_block(values=value.copy(), ndim=self.ndim,
placement=slice(mgr_loc, mgr_loc + 1))
for mgr_loc in unfit_mgr_locs)
self._blknos[unfit_mgr_locs] = (np.arange(unfit_count) +
len(self.blocks))
self._blklocs[unfit_mgr_locs] = 0
else:
# unfit_val_locs contains BlockPlacement objects
unfit_val_items = unfit_val_locs[0].append(unfit_val_locs[1:])
new_blocks.append(
make_block(values=value_getitem(unfit_val_items),
ndim=self.ndim, placement=unfit_mgr_locs))
self._blknos[unfit_mgr_locs] = len(self.blocks)
self._blklocs[unfit_mgr_locs] = np.arange(unfit_count)
self.blocks += tuple(new_blocks)
# Newly created block's dtype may already be present.
self._known_consolidated = False
def insert(self, loc, item, value, allow_duplicates=False):
"""
Insert item at selected position.
Parameters
----------
loc : int
item : hashable
value : array_like
allow_duplicates: bool
If False, trying to insert a non-unique item will raise
"""
if not allow_duplicates and item in self.items:
# Should this be a different kind of error??
raise ValueError('cannot insert {}, already exists'.format(item))
if not isinstance(loc, int):
raise TypeError("loc must be int")
# insert to the axis; this could possibly raise a TypeError
new_axis = self.items.insert(loc, item)
block = make_block(values=value, ndim=self.ndim,
placement=slice(loc, loc + 1))
for blkno, count in _fast_count_smallints(self._blknos[loc:]):
blk = self.blocks[blkno]
if count == len(blk.mgr_locs):
blk.mgr_locs = blk.mgr_locs.add(1)
else:
new_mgr_locs = blk.mgr_locs.as_array.copy()
new_mgr_locs[new_mgr_locs >= loc] += 1
blk.mgr_locs = new_mgr_locs
if loc == self._blklocs.shape[0]:
# np.append is a lot faster (at least in numpy 1.7.1), let's use it
# if we can.
self._blklocs = np.append(self._blklocs, 0)
self._blknos = np.append(self._blknos, len(self.blocks))
else:
self._blklocs = np.insert(self._blklocs, loc, 0)
self._blknos = np.insert(self._blknos, loc, len(self.blocks))
self.axes[0] = new_axis
self.blocks += (block,)
self._shape = None
self._known_consolidated = False
if len(self.blocks) > 100:
self._consolidate_inplace()
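The `_blknos`/`_blklocs` update in `insert` can be sketched on its own with numpy (made-up starting layout): the new item is spliced in at `loc`, pointing at the freshly appended block, whose number equals the old block count:

```python
import numpy as np

blknos = np.array([0, 0, 1])   # items 0,1 in block 0; item 2 in block 1
blklocs = np.array([0, 1, 0])
loc, n_blocks = 1, 2           # insert at position 1; 2 blocks exist so far

blknos = np.insert(blknos, loc, n_blocks)  # new block gets number 2
blklocs = np.insert(blklocs, loc, 0)       # it holds a single item, offset 0

print(blknos.tolist())   # [0, 2, 0, 1]
print(blklocs.tolist())  # [0, 0, 1, 0]
```

Note the real method also shifts every affected block's `mgr_locs` by one, which this sketch leaves out.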
def reindex_axis(self, new_index, axis, method=None, limit=None,
fill_value=None, copy=True):
"""
Conform block manager to new index.
"""
new_index = _ensure_index(new_index)
new_index, indexer = self.axes[axis].reindex(new_index, method=method,
limit=limit)
return self.reindex_indexer(new_index, indexer, axis=axis,
fill_value=fill_value, copy=copy)
def reindex_indexer(self, new_axis, indexer, axis, fill_value=None,
allow_dups=False, copy=True):
"""
Parameters
----------
new_axis : Index
indexer : ndarray of int64 or None
    pandas-style indexer; entries of -1 mark values to fill
axis : int
fill_value : object
allow_dups : bool
"""
if indexer is None:
if new_axis is self.axes[axis] and not copy:
return self
result = self.copy(deep=copy)
result.axes = list(self.axes)
result.axes[axis] = new_axis
return result
self._consolidate_inplace()
# some axes don't allow reindexing with dups
if not allow_dups:
self.axes[axis]._can_reindex(indexer)
if axis >= self.ndim:
raise IndexError("Requested axis not found in manager")
if axis == 0:
new_blocks = self._slice_take_blocks_ax0(indexer,
fill_tuple=(fill_value,))
else:
new_blocks = [blk.take_nd(indexer, axis=axis, fill_tuple=(
fill_value if fill_value is not None else blk.fill_value,))
for blk in self.blocks]
new_axes = list(self.axes)
new_axes[axis] = new_axis
return self.__class__(new_blocks, new_axes)
def _slice_take_blocks_ax0(self, slice_or_indexer, fill_tuple=None):
"""
Slice/take blocks along axis=0.
Overloaded for SingleBlock
Returns
-------
new_blocks : list of Block
"""
allow_fill = fill_tuple is not None
sl_type, slobj, sllen = _preprocess_slice_or_indexer(
slice_or_indexer, self.shape[0], allow_fill=allow_fill)
if self._is_single_block:
blk = self.blocks[0]
if sl_type in ('slice', 'mask'):
return [blk.getitem_block(slobj, new_mgr_locs=slice(0, sllen))]
elif not allow_fill or self.ndim == 1:
if allow_fill and fill_tuple[0] is None:
_, fill_value = maybe_promote(blk.dtype)
fill_tuple = (fill_value, )
return [blk.take_nd(slobj, axis=0,
new_mgr_locs=slice(0, sllen),
fill_tuple=fill_tuple)]
if sl_type in ('slice', 'mask'):
blknos = self._blknos[slobj]
blklocs = self._blklocs[slobj]
else:
blknos = algos.take_1d(self._blknos, slobj, fill_value=-1,
allow_fill=allow_fill)
blklocs = algos.take_1d(self._blklocs, slobj, fill_value=-1,
allow_fill=allow_fill)
# When filling blknos, make sure blknos is updated before appending to
# blocks list, that way new blkno is exactly len(blocks).
#
# FIXME: mgr_groupby_blknos must return mgr_locs in ascending order,
# pytables serialization will break otherwise.
blocks = []
for blkno, mgr_locs in _get_blkno_placements(blknos, len(self.blocks),
group=True):
if blkno == -1:
# If we've got here, fill_tuple was not None.
fill_value = fill_tuple[0]
blocks.append(self._make_na_block(placement=mgr_locs,
fill_value=fill_value))
else:
blk = self.blocks[blkno]
# Otherwise, slicing along items axis is necessary.
if not blk._can_consolidate:
# A non-consolidatable block, it's easy, because there's
# only one item and each mgr loc is a copy of that single
# item.
for mgr_loc in mgr_locs:
newblk = blk.copy(deep=True)
newblk.mgr_locs = slice(mgr_loc, mgr_loc + 1)
blocks.append(newblk)
else:
blocks.append(blk.take_nd(blklocs[mgr_locs.indexer],
axis=0, new_mgr_locs=mgr_locs,
fill_tuple=None))
return blocks
def _make_na_block(self, placement, fill_value=None):
# TODO: infer dtypes other than float64 from fill_value
if fill_value is None:
fill_value = np.nan
block_shape = list(self.shape)
block_shape[0] = len(placement)
dtype, fill_value = infer_dtype_from_scalar(fill_value)
block_values = np.empty(block_shape, dtype=dtype)
block_values.fill(fill_value)
return make_block(block_values, placement=placement)
def take(self, indexer, axis=1, verify=True, convert=True):
"""
Take items along any axis.
"""
self._consolidate_inplace()
indexer = (np.arange(indexer.start, indexer.stop, indexer.step,
dtype='int64')
if isinstance(indexer, slice)
else np.asanyarray(indexer, dtype='int64'))
n = self.shape[axis]
if convert:
indexer = maybe_convert_indices(indexer, n)
if verify:
if ((indexer == -1) | (indexer >= n)).any():
raise Exception('Indices must be nonnegative and less than '
'the axis length')
new_labels = self.axes[axis].take(indexer)
return self.reindex_indexer(new_axis=new_labels, indexer=indexer,
axis=axis, allow_dups=True)
def merge(self, other, lsuffix='', rsuffix=''):
if not self._is_indexed_like(other):
raise AssertionError('Must have same axes to merge managers')
l, r = items_overlap_with_suffix(left=self.items, lsuffix=lsuffix,
right=other.items, rsuffix=rsuffix)
new_items = _concat_indexes([l, r])
new_blocks = [blk.copy(deep=False) for blk in self.blocks]
offset = self.shape[0]
for blk in other.blocks:
blk = blk.copy(deep=False)
blk.mgr_locs = blk.mgr_locs.add(offset)
new_blocks.append(blk)
new_axes = list(self.axes)
new_axes[0] = new_items
return self.__class__(_consolidate(new_blocks), new_axes)
def _is_indexed_like(self, other):
"""
Check all axes except items
"""
if self.ndim != other.ndim:
raise AssertionError('Number of dimensions must agree '
'got %d and %d' % (self.ndim, other.ndim))
for ax, oax in zip(self.axes[1:], other.axes[1:]):
if not ax.equals(oax):
return False
return True
def equals(self, other):
self_axes, other_axes = self.axes, other.axes
if len(self_axes) != len(other_axes):
return False
if not all(ax1.equals(ax2) for ax1, ax2 in zip(self_axes, other_axes)):
return False
self._consolidate_inplace()
other._consolidate_inplace()
if len(self.blocks) != len(other.blocks):
return False
# canonicalize block order, using a tuple combining the type
# name and then mgr_locs because there might be unconsolidated
# blocks (say, Categorical) which can only be distinguished by
# the iteration order
def canonicalize(block):
return (block.dtype.name, block.mgr_locs.as_array.tolist())
self_blocks = sorted(self.blocks, key=canonicalize)
other_blocks = sorted(other.blocks, key=canonicalize)
return all(block.equals(oblock)
for block, oblock in zip(self_blocks, other_blocks))
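The canonicalization trick in `equals` can be illustrated standalone: blocks are sorted by a (dtype name, locations) key so that two managers whose blocks merely sit in a different order still compare equal. The tuples below are a stand-in for real Block objects:

```python
import numpy as np

# (dtype name, manager locations, values) stand-ins for blocks
blocks_a = [("float64", [0, 2], np.array([1.0, 2.0])),
            ("int64", [1], np.array([7]))]
blocks_b = [("int64", [1], np.array([7])),
            ("float64", [0, 2], np.array([1.0, 2.0]))]

def canonicalize(block):
    dtype_name, locs, _ = block
    return (dtype_name, locs)

# sorting makes pairwise comparison order-insensitive
sa = sorted(blocks_a, key=canonicalize)
sb = sorted(blocks_b, key=canonicalize)
equal = all(np.array_equal(x[2], y[2]) for x, y in zip(sa, sb))
print(equal)  # True
```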
class SingleBlockManager(BlockManager):
""" manage a single block with """
ndim = 1
_is_consolidated = True
_known_consolidated = True
__slots__ = ()
def __init__(self, block, axis, do_integrity_check=False, fastpath=False):
if isinstance(axis, list):
if len(axis) != 1:
raise ValueError("cannot create SingleBlockManager with more "
"than 1 axis")
axis = axis[0]
# passed from constructor, single block, single axis
if fastpath:
self.axes = [axis]
if isinstance(block, list):
# empty block
if len(block) == 0:
block = [np.array([])]
elif len(block) != 1:
raise ValueError('Cannot create SingleBlockManager with '
'more than 1 block')
block = block[0]
else:
self.axes = [_ensure_index(axis)]
# create the block here
if isinstance(block, list):
# provide consolidation to the interleaved_dtype
if len(block) > 1:
dtype = _interleaved_dtype(block)
block = [b.astype(dtype) for b in block]
block = _consolidate(block)
if len(block) != 1:
raise ValueError('Cannot create SingleBlockManager with '
'more than 1 block')
block = block[0]
if not isinstance(block, Block):
block = make_block(block, placement=slice(0, len(axis)), ndim=1,
fastpath=True)
self.blocks = [block]
def _post_setstate(self):
pass
@property
def _block(self):
return self.blocks[0]
@property
def _values(self):
return self._block.values
@property
def _blknos(self):
""" compat with BlockManager """
return None
@property
def _blklocs(self):
""" compat with BlockManager """
return None
def reindex(self, new_axis, indexer=None, method=None, fill_value=None,
limit=None, copy=True):
# if we are the same and don't copy, just return
if self.index.equals(new_axis):
if copy:
return self.copy(deep=True)
else:
return self
values = self._block.get_values()
if indexer is None:
indexer = self.items.get_indexer_for(new_axis)
if fill_value is None:
fill_value = np.nan
new_values = algos.take_1d(values, indexer, fill_value=fill_value)
# fill if needed
if method is not None or limit is not None:
new_values = missing.interpolate_2d(new_values,
method=method,
limit=limit,
fill_value=fill_value)
if self._block.is_sparse:
make_block = self._block.make_block_same_class
block = make_block(new_values, copy=copy,
placement=slice(0, len(new_axis)))
mgr = SingleBlockManager(block, new_axis)
mgr._consolidate_inplace()
return mgr
def get_slice(self, slobj, axis=0):
if axis >= self.ndim:
raise IndexError("Requested axis not found in manager")
return self.__class__(self._block._slice(slobj),
self.index[slobj], fastpath=True)
@property
def index(self):
return self.axes[0]
def convert(self, **kwargs):
""" convert the whole block as one """
kwargs['by_item'] = False
return self.apply('convert', **kwargs)
@property
def dtype(self):
return self._block.dtype
@property
def array_dtype(self):
return self._block.array_dtype
@property
def ftype(self):
return self._block.ftype
def get_dtype_counts(self):
return {self.dtype.name: 1}
def get_ftype_counts(self):
return {self.ftype: 1}
def get_dtypes(self):
return np.array([self._block.dtype])
def get_ftypes(self):
return np.array([self._block.ftype])
def external_values(self):
return self._block.external_values()
def internal_values(self):
return self._block.internal_values()
def get_values(self):
""" return a dense type view """
return np.array(self._block.to_dense(), copy=False)
@property
def asobject(self):
"""
return an object dtype array. datetime/timedelta like values are boxed
to Timestamp/Timedelta instances.
"""
return self._block.get_values(dtype=object)
@property
def itemsize(self):
return self._block.values.itemsize
@property
def _can_hold_na(self):
return self._block._can_hold_na
def is_consolidated(self):
return True
def _consolidate_check(self):
pass
def _consolidate_inplace(self):
pass
def delete(self, item):
"""
Delete single item from SingleBlockManager.
Ensures that self.blocks doesn't become empty.
"""
loc = self.items.get_loc(item)
self._block.delete(loc)
self.axes[0] = self.axes[0].delete(loc)
def fast_xs(self, loc):
"""
fast path for getting a cross-section
return a view of the data
"""
return self._block.values[loc]
def construction_error(tot_items, block_shape, axes, e=None):
""" raise a helpful message about our construction """
passed = tuple(map(int, [tot_items] + list(block_shape)))
implied = tuple(map(int, [len(ax) for ax in axes]))
if passed == implied and e is not None:
raise e
if block_shape[0] == 0:
raise ValueError("Empty data passed with indices specified.")
raise ValueError("Shape of passed values is {0}, indices imply {1}".format(
passed, implied))
def create_block_manager_from_blocks(blocks, axes):
try:
if len(blocks) == 1 and not isinstance(blocks[0], Block):
# if blocks[0] is of length 0, return empty blocks
if not len(blocks[0]):
blocks = []
else:
# It's OK if a single block is passed as values, its placement
# is basically "all items", but if there are many, don't bother
# converting, it's an error anyway.
blocks = [make_block(values=blocks[0],
placement=slice(0, len(axes[0])))]
mgr = BlockManager(blocks, axes)
mgr._consolidate_inplace()
return mgr
except (ValueError) as e:
blocks = [getattr(b, 'values', b) for b in blocks]
tot_items = sum(b.shape[0] for b in blocks)
construction_error(tot_items, blocks[0].shape[1:], axes, e)
def create_block_manager_from_arrays(arrays, names, axes):
try:
blocks = form_blocks(arrays, names, axes)
mgr = BlockManager(blocks, axes)
mgr._consolidate_inplace()
return mgr
except ValueError as e:
construction_error(len(arrays), arrays[0].shape, axes, e)
def form_blocks(arrays, names, axes):
# put "leftover" items in float bucket, where else?
# generalize?
float_items = []
complex_items = []
int_items = []
bool_items = []
object_items = []
sparse_items = []
datetime_items = []
datetime_tz_items = []
cat_items = []
extra_locs = []
names_idx = Index(names)
if names_idx.equals(axes[0]):
names_indexer = np.arange(len(names_idx))
else:
assert names_idx.intersection(axes[0]).is_unique
names_indexer = names_idx.get_indexer_for(axes[0])
for i, name_idx in enumerate(names_indexer):
if name_idx == -1:
extra_locs.append(i)
continue
k = names[name_idx]
v = arrays[name_idx]
if is_sparse(v):
sparse_items.append((i, k, v))
elif issubclass(v.dtype.type, np.floating):
float_items.append((i, k, v))
elif issubclass(v.dtype.type, np.complexfloating):
complex_items.append((i, k, v))
elif issubclass(v.dtype.type, np.datetime64):
if v.dtype != _NS_DTYPE:
v = tslib.cast_to_nanoseconds(v)
if is_datetimetz(v):
datetime_tz_items.append((i, k, v))
else:
datetime_items.append((i, k, v))
elif is_datetimetz(v):
datetime_tz_items.append((i, k, v))
elif issubclass(v.dtype.type, np.integer):
int_items.append((i, k, v))
elif v.dtype == np.bool_:
bool_items.append((i, k, v))
elif is_categorical(v):
cat_items.append((i, k, v))
else:
object_items.append((i, k, v))
blocks = []
if len(float_items):
float_blocks = _multi_blockify(float_items)
blocks.extend(float_blocks)
if len(complex_items):
complex_blocks = _multi_blockify(complex_items)
blocks.extend(complex_blocks)
if len(int_items):
int_blocks = _multi_blockify(int_items)
blocks.extend(int_blocks)
if len(datetime_items):
datetime_blocks = _simple_blockify(datetime_items, _NS_DTYPE)
blocks.extend(datetime_blocks)
if len(datetime_tz_items):
dttz_blocks = [make_block(array,
klass=DatetimeTZBlock,
fastpath=True,
placement=[i], )
for i, _, array in datetime_tz_items]
blocks.extend(dttz_blocks)
if len(bool_items):
bool_blocks = _simple_blockify(bool_items, np.bool_)
blocks.extend(bool_blocks)
if len(object_items) > 0:
object_blocks = _simple_blockify(object_items, np.object_)
blocks.extend(object_blocks)
if len(sparse_items) > 0:
sparse_blocks = _sparse_blockify(sparse_items)
blocks.extend(sparse_blocks)
if len(cat_items) > 0:
cat_blocks = [make_block(array, klass=CategoricalBlock, fastpath=True,
placement=[i])
for i, _, array in cat_items]
blocks.extend(cat_blocks)
if len(extra_locs):
shape = (len(extra_locs),) + tuple(len(x) for x in axes[1:])
# empty items -> dtype object
block_values = np.empty(shape, dtype=object)
block_values.fill(np.nan)
na_block = make_block(block_values, placement=extra_locs)
blocks.append(na_block)
return blocks
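The heart of `form_blocks` is the bucketing of columns by dtype family before each bucket is turned into a consolidated block. The helper below is a hypothetical, simplified sketch of that step using only NumPy dtype kinds, not the real pandas dispatch.

```python
import numpy as np

def bucket_by_dtype(named_arrays):
    """Group (position, name, array) triples into dtype-family buckets,
    mirroring the bucketing step of ``form_blocks`` (simplified sketch)."""
    buckets = {'float': [], 'int': [], 'bool': [], 'object': []}
    for i, (name, arr) in enumerate(named_arrays):
        if arr.dtype.kind == 'f':
            buckets['float'].append((i, name, arr))
        elif arr.dtype.kind in 'iu':
            buckets['int'].append((i, name, arr))
        elif arr.dtype.kind == 'b':
            buckets['bool'].append((i, name, arr))
        else:
            buckets['object'].append((i, name, arr))
    return buckets

arrays = [('a', np.array([1.0, 2.0])),
          ('b', np.array([1, 2])),
          ('c', np.array([True, False]))]
buckets = bucket_by_dtype(arrays)
```

Keeping the original position `i` alongside each array is what later lets blocks record their `placement` in the overall column order.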
def _simple_blockify(tuples, dtype):
""" return a single array of a block that has a single dtype; if dtype is
not None, coerce to this dtype
"""
values, placement = _stack_arrays(tuples, dtype)
# CHECK DTYPE?
if dtype is not None and values.dtype != dtype: # pragma: no cover
values = values.astype(dtype)
block = make_block(values, placement=placement)
return [block]
def _multi_blockify(tuples, dtype=None):
""" return an array of blocks that potentially have different dtypes """
# group by dtype
grouper = itertools.groupby(tuples, lambda x: x[2].dtype)
new_blocks = []
for dtype, tup_block in grouper:
values, placement = _stack_arrays(list(tup_block), dtype)
block = make_block(values, placement=placement)
new_blocks.append(block)
return new_blocks
def _sparse_blockify(tuples, dtype=None):
""" return an array of blocks that potentially have different dtypes (and
are sparse)
"""
new_blocks = []
for i, names, array in tuples:
array = _maybe_to_sparse(array)
block = make_block(array, klass=SparseBlock, fastpath=True,
placement=[i])
new_blocks.append(block)
return new_blocks
def _stack_arrays(tuples, dtype):
# fml
def _asarray_compat(x):
if isinstance(x, ABCSeries):
return x._values
else:
return np.asarray(x)
def _shape_compat(x):
if isinstance(x, ABCSeries):
return len(x),
else:
return x.shape
placement, names, arrays = zip(*tuples)
first = arrays[0]
shape = (len(arrays),) + _shape_compat(first)
stacked = np.empty(shape, dtype=dtype)
for i, arr in enumerate(arrays):
stacked[i] = _asarray_compat(arr)
return stacked, placement
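`_stack_arrays` packs same-dtype 1-D columns into a single 2-D array with one row per column; that 2-D array becomes the block's values. A minimal NumPy-only sketch of the same packing:

```python
import numpy as np

# Stack three same-dtype columns into one 2-D block, one row per
# column -- the core loop of ``_stack_arrays``.
cols = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
stacked = np.empty((len(cols),) + cols[0].shape, dtype=np.float64)
for i, arr in enumerate(cols):
    stacked[i] = arr
```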
def _interleaved_dtype(blocks):
if not len(blocks):
return None
dtype = find_common_type([b.dtype for b in blocks])
# only numpy compat
if isinstance(dtype, ExtensionDtype):
dtype = np.object_
return dtype
def _consolidate(blocks):
"""
Merge blocks having same dtype, exclude non-consolidating blocks
"""
# sort by _can_consolidate, dtype
gkey = lambda x: x._consolidate_key
grouper = itertools.groupby(sorted(blocks, key=gkey), gkey)
new_blocks = []
for (_can_consolidate, dtype), group_blocks in grouper:
merged_blocks = _merge_blocks(list(group_blocks), dtype=dtype,
_can_consolidate=_can_consolidate)
new_blocks = _extend_blocks(merged_blocks, new_blocks)
return new_blocks
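`_consolidate` relies on the sort-then-`itertools.groupby` idiom: blocks are sorted by their consolidation key, and each run of equal keys is merged. The sketch below shows the same pattern with hypothetical `(dtype_name, values)` pairs standing in for blocks and a trivial merge standing in for `_merge_blocks`.

```python
import itertools

# Sort by key, then merge each run of equal keys -- the pattern used by
# ``_consolidate``.  ``groupby`` only groups *consecutive* equal keys,
# which is why the sort must come first.
items = [('float', [1.0]), ('int', [1]), ('float', [2.0]), ('int', [2])]
key = lambda pair: pair[0]
merged = []
for dtype_name, group in itertools.groupby(sorted(items, key=key), key):
    values = [v for _, vals in group for v in vals]
    merged.append((dtype_name, values))
```

Because Python's sort is stable, the original relative order within each dtype group is preserved, just as block order is preserved within each consolidated block.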
def _merge_blocks(blocks, dtype=None, _can_consolidate=True):
if len(blocks) == 1:
return blocks[0]
if _can_consolidate:
if dtype is None:
if len(set([b.dtype for b in blocks])) != 1:
raise AssertionError("_merge_blocks are invalid!")
dtype = blocks[0].dtype
# FIXME: optimization potential in case all mgrs contain slices and
# combination of those slices is a slice, too.
new_mgr_locs = np.concatenate([b.mgr_locs.as_array for b in blocks])
new_values = _vstack([b.values for b in blocks], dtype)
argsort = np.argsort(new_mgr_locs)
new_values = new_values[argsort]
new_mgr_locs = new_mgr_locs[argsort]
return make_block(new_values, fastpath=True, placement=new_mgr_locs)
# no merge
return blocks
def _extend_blocks(result, blocks=None):
""" return a new extended blocks, givin the result """
if blocks is None:
blocks = []
if isinstance(result, list):
for r in result:
if isinstance(r, list):
blocks.extend(r)
else:
blocks.append(r)
elif isinstance(result, BlockManager):
blocks.extend(result.blocks)
else:
blocks.append(result)
return blocks
def _block_shape(values, ndim=1, shape=None):
""" guarantee the shape of the values to be at least 1 d """
if values.ndim < ndim:
if shape is None:
shape = values.shape
values = values.reshape(tuple((1, ) + shape))
return values
def _vstack(to_stack, dtype):
# work around NumPy 1.6 bug
if dtype == _NS_DTYPE or dtype == _TD_DTYPE:
new_values = np.vstack([x.view('i8') for x in to_stack])
return new_values.view(dtype)
else:
return np.vstack(to_stack)
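The datetime branch of `_vstack` sidesteps the old NumPy bug by stacking the integer (`i8`) views of the arrays and viewing the result back as `datetime64[ns]`. The round trip is lossless:

```python
import numpy as np

# datetime64 columns are stacked via their i8 view and viewed back --
# the workaround applied by ``_vstack``.
a = np.array(['2018-01-01', '2018-01-02'], dtype='datetime64[ns]')
b = np.array(['2018-02-01', '2018-02-02'], dtype='datetime64[ns]')
stacked = np.vstack([x.view('i8') for x in (a, b)]).view('datetime64[ns]')
```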
def _maybe_compare(a, b, op):
is_a_array = isinstance(a, np.ndarray)
is_b_array = isinstance(b, np.ndarray)
# numpy deprecation warning to have i8 vs integer comparisons
if is_datetimelike_v_numeric(a, b):
result = False
# numpy deprecation warning if comparing numeric vs string-like
elif is_numeric_v_string_like(a, b):
result = False
else:
result = op(a, b)
if is_scalar(result) and (is_a_array or is_b_array):
type_names = [type(a).__name__, type(b).__name__]
if is_a_array:
type_names[0] = 'ndarray(dtype=%s)' % a.dtype
if is_b_array:
type_names[1] = 'ndarray(dtype=%s)' % b.dtype
raise TypeError("Cannot compare types %r and %r" % tuple(type_names))
return result
def _concat_indexes(indexes):
return indexes[0].append(indexes[1:])
def _block2d_to_blocknd(values, placement, shape, labels, ref_items):
""" pivot to the labels shape """
from pandas.core.internals import make_block
panel_shape = (len(placement),) + shape
# TODO: lexsort depth needs to be 2!!
# Create observation selection vector using major and minor
# labels, for converting to panel format.
selector = _factor_indexer(shape[1:], labels)
mask = np.zeros(np.prod(shape), dtype=bool)
mask.put(selector, True)
if mask.all():
pvalues = np.empty(panel_shape, dtype=values.dtype)
else:
dtype, fill_value = maybe_promote(values.dtype)
pvalues = np.empty(panel_shape, dtype=dtype)
pvalues.fill(fill_value)
for i in range(len(placement)):
pvalues[i].flat[mask] = values[:, i]
return make_block(pvalues, placement=placement)
def _factor_indexer(shape, labels):
"""
given a tuple of shape and a list of Categorical labels, return the
expanded label indexer
"""
mult = np.array(shape)[::-1].cumprod()[::-1]
return _ensure_platform_int(
np.sum(np.array(labels).T * np.append(mult, [1]), axis=1).T)
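`_factor_indexer` computes row-major flat positions: with a minor shape of `(3,)`, a (major, minor) label pair maps to `major * 3 + minor`. The snippet below replays the same arithmetic on small hand-picked labels:

```python
import numpy as np

# Row-major flattening as done by ``_factor_indexer``: for trailing
# shape (3,), the flat position is major * 3 + minor.
shape = (3,)                      # lengths of the trailing dimensions
labels = [np.array([0, 0, 1]),    # major labels
          np.array([0, 2, 1])]    # minor labels
mult = np.array(shape)[::-1].cumprod()[::-1]
flat = np.sum(np.array(labels).T * np.append(mult, [1]), axis=1)
```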
def _get_blkno_placements(blknos, blk_count, group=True):
"""
Parameters
----------
blknos : array of int64
blk_count : int
group : bool
Returns
-------
iterator
yield (BlockPlacement, blkno)
"""
blknos = _ensure_int64(blknos)
# FIXME: blk_count is unused, but it may avoid the use of dicts in cython
for blkno, indexer in lib.get_blkno_indexers(blknos, group):
yield blkno, BlockPlacement(indexer)
def items_overlap_with_suffix(left, lsuffix, right, rsuffix):
"""
If two indices overlap, add suffixes to overlapping entries.
If corresponding suffix is empty, the entry is simply converted to string.
"""
to_rename = left.intersection(right)
if len(to_rename) == 0:
return left, right
else:
if not lsuffix and not rsuffix:
raise ValueError('columns overlap but no suffix specified: %s' %
to_rename)
def lrenamer(x):
if x in to_rename:
return '%s%s' % (x, lsuffix)
return x
def rrenamer(x):
if x in to_rename:
return '%s%s' % (x, rsuffix)
return x
return (_transform_index(left, lrenamer),
_transform_index(right, rrenamer))
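The renaming performed by `items_overlap_with_suffix` is easy to see on plain lists: only names present on both sides get a suffix. This sketch skips the Index machinery and error handling:

```python
# Suffixing of overlapping column names, as done by
# ``items_overlap_with_suffix`` (plain-list sketch, no Index objects).
left, right = ['a', 'b'], ['b', 'c']
overlap = set(left) & set(right)
lsuffix, rsuffix = '_x', '_y'
new_left = [c + lsuffix if c in overlap else c for c in left]
new_right = [c + rsuffix if c in overlap else c for c in right]
```

This is the behavior users see from `DataFrame.join(..., lsuffix=..., rsuffix=...)` when column names collide.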
def _safe_reshape(arr, new_shape):
"""
If possible, reshape `arr` to have shape `new_shape`,
with a couple of exceptions (see gh-13012):
1) If `arr` is a Categorical or Index, `arr` will be
returned as is.
2) If `arr` is a Series, the `_values` attribute will
be reshaped and returned.
Parameters
----------
arr : array-like, object to be reshaped
new_shape : int or tuple of ints, the new shape
"""
if isinstance(arr, ABCSeries):
arr = arr._values
if not isinstance(arr, Categorical):
arr = arr.reshape(new_shape)
return arr
def _transform_index(index, func, level=None):
"""
Apply function to all values found in index.
This includes transforming multiindex entries separately.
Only apply function to one level of the MultiIndex if level is specified.
"""
if isinstance(index, MultiIndex):
if level is not None:
items = [tuple(func(y) if i == level else y
for i, y in enumerate(x)) for x in index]
else:
items = [tuple(func(y) for y in x) for x in index]
return MultiIndex.from_tuples(items, names=index.names)
else:
items = [func(x) for x in index]
return Index(items, name=index.name)
def _putmask_smart(v, m, n):
"""
Return a new block, try to preserve dtype if possible.
Parameters
----------
v : `values`, updated in-place (array like)
m : `mask`, applies to both sides (array like)
n : `new values` either scalar or an array like aligned with `values`
"""
# n should be the length of the mask or a scalar here
if not is_list_like(n):
n = np.array([n] * len(m))
elif isinstance(n, np.ndarray) and n.ndim == 0: # numpy scalar
n = np.repeat(np.array(n, ndmin=1), len(m))
# see if we are only masking values that, if put,
# will work in the current dtype
try:
nn = n[m]
# make sure that we have a nullable type
# if we have nulls
if not _is_na_compat(v, nn[0]):
raise ValueError
nn_at = nn.astype(v.dtype)
# avoid invalid dtype comparisons
if not is_numeric_v_string_like(nn, nn_at):
comp = (nn == nn_at)
if is_list_like(comp) and comp.all():
nv = v.copy()
nv[m] = nn_at
return nv
except (ValueError, IndexError, TypeError):
pass
# change the dtype
dtype, _ = maybe_promote(n.dtype)
if is_extension_type(v.dtype) and is_object_dtype(dtype):
nv = v.get_values(dtype)
else:
nv = v.astype(dtype)
try:
nv[m] = n[m]
except ValueError:
idx, = np.where(np.squeeze(m))
for mask_index, new_val in zip(idx, n[m]):
nv[mask_index] = new_val
return nv
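The dtype-preservation strategy of `_putmask_smart` is: cast the masked-in values to the existing dtype, keep that dtype if the cast round-trips exactly, otherwise promote. The function below is a hypothetical, much simplified sketch of that decision, not the real implementation:

```python
import numpy as np

def putmask_keep_dtype(values, mask, new):
    """Simplified sketch of ``_putmask_smart``: keep the original dtype
    when the masked-in values survive a round-trip cast, else promote."""
    new = np.asarray(new)
    candidate = new[mask]
    cast = candidate.astype(values.dtype)
    if (cast == candidate).all():        # lossless cast -> keep dtype
        out = values.copy()
        out[mask] = cast
        return out
    # lossy cast -> promote to a common dtype first
    out = values.astype(np.result_type(values.dtype, new.dtype))
    out[mask] = candidate
    return out

ints = np.array([1, 2, 3])
mask = np.array([True, False, False])
kept = putmask_keep_dtype(ints, mask, np.array([7.0, 0.0, 0.0]))
promoted = putmask_keep_dtype(ints, mask, np.array([7.5, 0.0, 0.0]))
```

Putting `7.0` into an int array keeps it integer; putting `7.5` forces promotion to float.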
def concatenate_block_managers(mgrs_indexers, axes, concat_axis, copy):
"""
Concatenate block managers into one.
Parameters
----------
mgrs_indexers : list of (BlockManager, {axis: indexer,...}) tuples
axes : list of Index
concat_axis : int
copy : bool
"""
concat_plan = combine_concat_plans(
[get_mgr_concatenation_plan(mgr, indexers)
for mgr, indexers in mgrs_indexers], concat_axis)
blocks = [make_block(
concatenate_join_units(join_units, concat_axis, copy=copy),
placement=placement) for placement, join_units in concat_plan]
return BlockManager(blocks, axes)
def get_empty_dtype_and_na(join_units):
"""
Return dtype and N/A values to use when concatenating specified units.
Returned N/A value may be None which means there was no casting involved.
Returns
-------
dtype
na
"""
if len(join_units) == 1:
blk = join_units[0].block
if blk is None:
return np.float64, np.nan
has_none_blocks = False
dtypes = [None] * len(join_units)
for i, unit in enumerate(join_units):
if unit.block is None:
has_none_blocks = True
else:
dtypes[i] = unit.dtype
upcast_classes = defaultdict(list)
null_upcast_classes = defaultdict(list)
for dtype, unit in zip(dtypes, join_units):
if dtype is None:
continue
if is_categorical_dtype(dtype):
upcast_cls = 'category'
elif is_datetimetz(dtype):
upcast_cls = 'datetimetz'
elif issubclass(dtype.type, np.bool_):
upcast_cls = 'bool'
elif issubclass(dtype.type, np.object_):
upcast_cls = 'object'
elif is_datetime64_dtype(dtype):
upcast_cls = 'datetime'
elif is_timedelta64_dtype(dtype):
upcast_cls = 'timedelta'
elif is_float_dtype(dtype) or is_numeric_dtype(dtype):
upcast_cls = dtype.name
else:
upcast_cls = 'float'
# Null blocks should not influence upcast class selection, unless there
# are only null blocks, when same upcasting rules must be applied to
# null upcast classes.
if unit.is_null:
null_upcast_classes[upcast_cls].append(dtype)
else:
upcast_classes[upcast_cls].append(dtype)
if not upcast_classes:
upcast_classes = null_upcast_classes
# create the result
if 'object' in upcast_classes:
return np.dtype(np.object_), np.nan
elif 'bool' in upcast_classes:
if has_none_blocks:
return np.dtype(np.object_), np.nan
else:
return np.dtype(np.bool_), None
elif 'category' in upcast_classes:
return np.dtype(np.object_), np.nan
elif 'datetimetz' in upcast_classes:
dtype = upcast_classes['datetimetz']
return dtype[0], tslib.iNaT
elif 'datetime' in upcast_classes:
return np.dtype('M8[ns]'), tslib.iNaT
elif 'timedelta' in upcast_classes:
return np.dtype('m8[ns]'), tslib.iNaT
else: # pragma
g = np.find_common_type(upcast_classes, [])
if is_float_dtype(g):
return g, g.type(np.nan)
elif is_numeric_dtype(g):
if has_none_blocks:
return np.float64, np.nan
else:
return g, None
msg = "invalid dtype determination in get_concat_dtype"
raise AssertionError(msg)
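The numeric fallback at the end of `get_empty_dtype_and_na` boils down to NumPy's dtype promotion rules. `np.promote_types` shows the same upcasting (note that `np.find_common_type`, used above, is deprecated in recent NumPy releases):

```python
import numpy as np

# Concatenating int64 with float32 must upcast to float64 to preserve
# integer precision; int32 with int64 simply widens.
common = np.promote_types(np.int64, np.float32)
int_only = np.promote_types(np.int32, np.int64)
```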
def concatenate_join_units(join_units, concat_axis, copy):
"""
Concatenate values from several join units along selected axis.
"""
if concat_axis == 0 and len(join_units) > 1:
# Concatenating join units along ax0 is handled in _merge_blocks.
raise AssertionError("Concatenating join units along axis0")
empty_dtype, upcasted_na = get_empty_dtype_and_na(join_units)
to_concat = [ju.get_reindexed_values(empty_dtype=empty_dtype,
upcasted_na=upcasted_na)
for ju in join_units]
if len(to_concat) == 1:
# Only one block, nothing to concatenate.
concat_values = to_concat[0]
if copy and concat_values.base is not None:
concat_values = concat_values.copy()
else:
concat_values = _concat._concat_compat(to_concat, axis=concat_axis)
return concat_values
def get_mgr_concatenation_plan(mgr, indexers):
"""
Construct concatenation plan for given block manager and indexers.
Parameters
----------
mgr : BlockManager
indexers : dict of {axis: indexer}
Returns
-------
plan : list of (BlockPlacement, JoinUnit) tuples
"""
# Calculate post-reindex shape, save for item axis which will be separate
# for each block anyway.
mgr_shape = list(mgr.shape)
for ax, indexer in indexers.items():
mgr_shape[ax] = len(indexer)
mgr_shape = tuple(mgr_shape)
if 0 in indexers:
ax0_indexer = indexers.pop(0)
blknos = algos.take_1d(mgr._blknos, ax0_indexer, fill_value=-1)
blklocs = algos.take_1d(mgr._blklocs, ax0_indexer, fill_value=-1)
else:
if mgr._is_single_block:
blk = mgr.blocks[0]
return [(blk.mgr_locs, JoinUnit(blk, mgr_shape, indexers))]
ax0_indexer = None
blknos = mgr._blknos
blklocs = mgr._blklocs
plan = []
for blkno, placements in _get_blkno_placements(blknos, len(mgr.blocks),
group=False):
assert placements.is_slice_like
join_unit_indexers = indexers.copy()
shape = list(mgr_shape)
shape[0] = len(placements)
shape = tuple(shape)
if blkno == -1:
unit = JoinUnit(None, shape)
else:
blk = mgr.blocks[blkno]
ax0_blk_indexer = blklocs[placements.indexer]
unit_no_ax0_reindexing = (len(placements) == len(blk.mgr_locs) and
# Fastpath detection of join unit not
# needing to reindex its block: no ax0
# reindexing took place and block
# placement was sequential before.
((ax0_indexer is None and
blk.mgr_locs.is_slice_like and
blk.mgr_locs.as_slice.step == 1) or
# Slow-ish detection: all indexer locs
# are sequential (and length match is
# checked above).
(np.diff(ax0_blk_indexer) == 1).all()))
# Omit indexer if no item reindexing is required.
if unit_no_ax0_reindexing:
join_unit_indexers.pop(0, None)
else:
join_unit_indexers[0] = ax0_blk_indexer
unit = JoinUnit(blk, shape, join_unit_indexers)
plan.append((placements, unit))
return plan
def combine_concat_plans(plans, concat_axis):
"""
Combine multiple concatenation plans into one.
The input plans are consumed, and join units may be trimmed in place.
"""
if len(plans) == 1:
for p in plans[0]:
yield p[0], [p[1]]
elif concat_axis == 0:
offset = 0
for plan in plans:
last_plc = None
for plc, unit in plan:
yield plc.add(offset), [unit]
last_plc = plc
if last_plc is not None:
offset += last_plc.as_slice.stop
else:
num_ended = [0]
def _next_or_none(seq):
retval = next(seq, None)
if retval is None:
num_ended[0] += 1
return retval
plans = list(map(iter, plans))
next_items = list(map(_next_or_none, plans))
while num_ended[0] != len(next_items):
if num_ended[0] > 0:
raise ValueError("Plan shapes are not aligned")
placements, units = zip(*next_items)
lengths = list(map(len, placements))
min_len, max_len = min(lengths), max(lengths)
if min_len == max_len:
yield placements[0], units
next_items[:] = map(_next_or_none, plans)
else:
yielded_placement = None
yielded_units = [None] * len(next_items)
for i, (plc, unit) in enumerate(next_items):
yielded_units[i] = unit
if len(plc) > min_len:
# trim_join_unit updates unit in place, so only
# placement needs to be sliced to skip min_len.
next_items[i] = (plc[min_len:],
trim_join_unit(unit, min_len))
else:
yielded_placement = plc
next_items[i] = _next_or_none(plans[i])
yield yielded_placement, yielded_units
def trim_join_unit(join_unit, length):
"""
Reduce join_unit's shape along item axis to length.
Extra items that didn't fit are returned as a separate block.
"""
if 0 not in join_unit.indexers:
extra_indexers = join_unit.indexers
if join_unit.block is None:
extra_block = None
else:
extra_block = join_unit.block.getitem_block(slice(length, None))
join_unit.block = join_unit.block.getitem_block(slice(length))
else:
extra_block = join_unit.block
extra_indexers = copy.copy(join_unit.indexers)
extra_indexers[0] = extra_indexers[0][length:]
join_unit.indexers[0] = join_unit.indexers[0][:length]
extra_shape = (join_unit.shape[0] - length,) + join_unit.shape[1:]
join_unit.shape = (length,) + join_unit.shape[1:]
return JoinUnit(block=extra_block, indexers=extra_indexers,
shape=extra_shape)
class JoinUnit(object):
def __init__(self, block, shape, indexers=None):
# Passing shape explicitly is required for cases when block is None.
if indexers is None:
indexers = {}
self.block = block
self.indexers = indexers
self.shape = shape
def __repr__(self):
return '%s(%r, %s)' % (self.__class__.__name__, self.block,
self.indexers)
@cache_readonly
def needs_filling(self):
for indexer in self.indexers.values():
# FIXME: cache results of indexer == -1 checks.
if (indexer == -1).any():
return True
return False
@cache_readonly
def dtype(self):
if self.block is None:
raise AssertionError("Block is None, no dtype")
if not self.needs_filling:
return self.block.dtype
else:
return _get_dtype(maybe_promote(self.block.dtype,
self.block.fill_value)[0])
@cache_readonly
def is_null(self):
if self.block is None:
return True
if not self.block._can_hold_na:
return False
# Usually it's enough to check only a small fraction of values to see if
# a block is NOT null; chunks should help in such cases. The chunk size
# of 1000 was chosen rather arbitrarily.
values = self.block.values
if self.block.is_categorical:
values_flat = values.categories
elif self.block.is_sparse:
# fill_value is not NaN and the array has holes
if not values._null_fill_value and values.sp_index.ngaps > 0:
return False
values_flat = values.ravel(order='K')
else:
values_flat = values.ravel(order='K')
total_len = values_flat.shape[0]
chunk_len = max(total_len // 40, 1000)
for i in range(0, total_len, chunk_len):
if not isnull(values_flat[i:i + chunk_len]).all():
return False
return True
def get_reindexed_values(self, empty_dtype, upcasted_na):
if upcasted_na is None:
# No upcasting is necessary
fill_value = self.block.fill_value
values = self.block.get_values()
else:
fill_value = upcasted_na
if self.is_null:
if getattr(self.block, 'is_object', False):
# we want to avoid filling with np.nan if we are
# using None; we already know that we are all
# nulls
values = self.block.values.ravel(order='K')
if len(values) and values[0] is None:
fill_value = None
if getattr(self.block, 'is_datetimetz', False):
pass
elif getattr(self.block, 'is_categorical', False):
pass
elif getattr(self.block, 'is_sparse', False):
pass
else:
missing_arr = np.empty(self.shape, dtype=empty_dtype)
missing_arr.fill(fill_value)
return missing_arr
if not self.indexers:
if not self.block._can_consolidate:
# preserve these for validation in _concat_compat
return self.block.values
if self.block.is_bool:
# External code requested filling/upcasting, bool values must
# be upcasted to object to avoid being upcasted to numeric.
values = self.block.astype(np.object_).values
elif self.block.is_categorical:
values = self.block.values
else:
# No dtype upcasting is done here, it will be performed during
# concatenation itself.
values = self.block.get_values()
if not self.indexers:
# If there's no indexing to be done, we want to signal outside
# code that this array must be copied explicitly. This is done
# by returning a view and checking `retval.base`.
values = values.view()
else:
for ax, indexer in self.indexers.items():
values = algos.take_nd(values, indexer, axis=ax,
fill_value=fill_value)
return values
def _fast_count_smallints(arr):
"""Faster version of set(arr) for sequences of small numbers."""
if len(arr) == 0:
# Handle empty arr case separately: numpy 1.6 chokes on that.
return np.empty((0, 2), dtype=arr.dtype)
else:
counts = np.bincount(arr.astype(np.int_))
nz = counts.nonzero()[0]
return np.c_[nz, counts[nz]]
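For small non-negative integers, `np.bincount` gives per-value counts in one vectorized pass; stacking the nonzero positions against their counts reproduces the `(value, count)` pairs returned by `_fast_count_smallints`:

```python
import numpy as np

# Count occurrences of each small int, then keep only values that
# actually appear -- the same result ``_fast_count_smallints`` builds.
arr = np.array([1, 1, 2, 5, 5, 5])
counts = np.bincount(arr)
nz = counts.nonzero()[0]
pairs = np.c_[nz, counts[nz]]
```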
def _preprocess_slice_or_indexer(slice_or_indexer, length, allow_fill):
if isinstance(slice_or_indexer, slice):
return 'slice', slice_or_indexer, lib.slice_len(slice_or_indexer,
length)
elif (isinstance(slice_or_indexer, np.ndarray) and
slice_or_indexer.dtype == np.bool_):
return 'mask', slice_or_indexer, slice_or_indexer.sum()
else:
indexer = np.asanyarray(slice_or_indexer, dtype=np.int64)
if not allow_fill:
indexer = maybe_convert_indices(indexer, length)
return 'fancy', indexer, len(indexer)
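`_preprocess_slice_or_indexer` classifies an indexer into one of three kinds: a `slice`, a boolean mask, or a fancy integer indexer. The sketch below mirrors just the classification, without the length and fill handling:

```python
import numpy as np

def classify_indexer(obj):
    """Mirror the slice/mask/fancy classification of
    ``_preprocess_slice_or_indexer`` (sketch, no length handling)."""
    if isinstance(obj, slice):
        return 'slice'
    if isinstance(obj, np.ndarray) and obj.dtype == np.bool_:
        return 'mask'
    return 'fancy'

kinds = [classify_indexer(slice(0, 3)),
         classify_indexer(np.array([True, False, True])),
         classify_indexer(np.array([0, 2]))]
```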
| bsd-3-clause |
rabernat/xray | xarray/core/variable.py | 1 | 63679 | from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from datetime import timedelta
from collections import defaultdict
import functools
import itertools
from distutils.version import LooseVersion
import numpy as np
import pandas as pd
from . import common
from . import duck_array_ops
from . import dtypes
from . import indexing
from . import nputils
from . import ops
from . import utils
from .pycompat import (basestring, OrderedDict, zip, integer_types,
dask_array_type)
from .indexing import (PandasIndexAdapter, as_indexable, BasicIndexer,
OuterIndexer, VectorizedIndexer)
from .utils import OrderedSet
import xarray as xr # only for Dataset and DataArray
try:
import dask.array as da
except ImportError:
pass
NON_NUMPY_SUPPORTED_ARRAY_TYPES = (
indexing.ExplicitlyIndexed, pd.Index) + dask_array_type
BASIC_INDEXING_TYPES = integer_types + (slice,)
class MissingDimensionsError(ValueError):
"""Error class used when we can't safely guess a dimension name.
"""
# inherits from ValueError for backward compatibility
# TODO: move this to an xarray.exceptions module?
def as_variable(obj, name=None):
"""Convert an object into a Variable.
Parameters
----------
obj : object
Object to convert into a Variable.
- If the object is already a Variable, return a shallow copy.
- Otherwise, if the object has 'dims' and 'data' attributes, convert
it into a new Variable.
- If all else fails, attempt to convert the object into a Variable by
unpacking it into the arguments for creating a new Variable.
name : str, optional
If provided:
- `obj` can be a 1D array, which is assumed to label coordinate values
along a dimension of this given name.
- Variables with name matching one of their dimensions are converted
into `IndexVariable` objects.
Returns
-------
var : Variable
The newly created variable.
"""
# TODO: consider extending this method to automatically handle Iris and
# pandas objects.
if hasattr(obj, 'variable'):
# extract the primary Variable from DataArrays
obj = obj.variable
if isinstance(obj, Variable):
obj = obj.copy(deep=False)
elif hasattr(obj, 'dims') and (hasattr(obj, 'data') or
hasattr(obj, 'values')):
obj_data = getattr(obj, 'data', None)
if obj_data is None:
obj_data = getattr(obj, 'values')
obj = Variable(obj.dims, obj_data,
getattr(obj, 'attrs', None),
getattr(obj, 'encoding', None))
elif isinstance(obj, tuple):
try:
obj = Variable(*obj)
except TypeError:
# use .format() instead of % because it handles tuples consistently
raise TypeError('tuples to convert into variables must be of the '
'form (dims, data[, attrs, encoding]): '
'{}'.format(obj))
elif utils.is_scalar(obj):
obj = Variable([], obj)
elif isinstance(obj, (pd.Index, IndexVariable)) and obj.name is not None:
obj = Variable(obj.name, obj)
elif name is not None:
data = as_compatible_data(obj)
if data.ndim != 1:
raise MissingDimensionsError(
'cannot set variable %r with %r-dimensional data '
'without explicit dimension names. Pass a tuple of '
'(dims, data) instead.' % (name, data.ndim))
obj = Variable(name, obj, fastpath=True)
else:
raise TypeError('unable to convert object into a variable without an '
'explicit list of dimensions: %r' % obj)
if name is not None and name in obj.dims:
# convert the Variable into an Index
if obj.ndim != 1:
raise MissingDimensionsError(
'%r has more than 1-dimension and the same name as one of its '
'dimensions %r. xarray disallows such variables because they '
'conflict with the coordinates used to label '
'dimensions.' % (name, obj.dims))
obj = obj.to_index_variable()
return obj
def _maybe_wrap_data(data):
"""
Put pandas.Index and numpy.ndarray arguments in adapter objects to ensure
they can be indexed properly.
NumpyArrayAdapter, PandasIndexAdapter and LazilyIndexedArray should
all pass through unmodified.
"""
if isinstance(data, pd.Index):
return PandasIndexAdapter(data)
return data
def _possibly_convert_objects(values):
"""Convert arrays of datetime.datetime and datetime.timedelta objects into
datetime64 and timedelta64, according to the pandas convention.
"""
return np.asarray(pd.Series(values.ravel())).reshape(values.shape)
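The round trip through `pd.Series` is what performs the conversion: an object array of `datetime.datetime` values comes back as `datetime64[ns]`, following the pandas convention the docstring refers to. Assuming pandas is available:

```python
import datetime
import numpy as np
import pandas as pd

# Object-dtype datetimes are converted to datetime64[ns] by the
# Series round trip used in ``_possibly_convert_objects``.
objs = np.array([datetime.datetime(2018, 1, 1),
                 datetime.datetime(2018, 1, 2)], dtype=object)
converted = np.asarray(pd.Series(objs.ravel())).reshape(objs.shape)
```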
def as_compatible_data(data, fastpath=False):
"""Prepare and wrap data to put in a Variable.
- If data does not have the necessary attributes, convert it to ndarray.
- If data has dtype=datetime64, ensure that it has ns precision. If it's a
pandas.Timestamp, convert it to datetime64.
- If data is already a pandas or xarray object (other than an Index), just
use the values.
Finally, wrap it up with an adapter if necessary.
"""
if fastpath and getattr(data, 'ndim', 0) > 0:
# can't use fastpath (yet) for scalars
return _maybe_wrap_data(data)
if isinstance(data, Variable):
return data.data
if isinstance(data, NON_NUMPY_SUPPORTED_ARRAY_TYPES):
return _maybe_wrap_data(data)
if isinstance(data, tuple):
data = utils.to_0d_object_array(data)
if isinstance(data, pd.Timestamp):
# TODO: convert, handle datetime objects, too
data = np.datetime64(data.value, 'ns')
if isinstance(data, timedelta):
data = np.timedelta64(getattr(data, 'value', data), 'ns')
# we don't want nested self-described arrays
data = getattr(data, 'values', data)
if isinstance(data, np.ma.MaskedArray):
mask = np.ma.getmaskarray(data)
if mask.any():
dtype, fill_value = dtypes.maybe_promote(data.dtype)
data = np.asarray(data, dtype=dtype)
data[mask] = fill_value
else:
data = np.asarray(data)
# validate whether the data is valid data types
data = np.asarray(data)
if isinstance(data, np.ndarray):
if data.dtype.kind == 'O':
data = _possibly_convert_objects(data)
elif data.dtype.kind == 'M':
data = np.asarray(data, 'datetime64[ns]')
elif data.dtype.kind == 'm':
data = np.asarray(data, 'timedelta64[ns]')
return _maybe_wrap_data(data)
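The masked-array branch above can be illustrated standalone: the mask is dropped when converting to a plain ndarray, and masked positions are filled with a promoted fill value (NaN for floats):

```python
import numpy as np

masked = np.ma.array([1.0, 2.0, 3.0], mask=[False, True, False])
mask = np.ma.getmaskarray(masked)

# promote the dtype so it can hold a fill value, then fill the
# masked positions (mirrors the MaskedArray branch above)
data = np.asarray(masked, dtype=np.float64)
data[mask] = np.nan
```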
def _as_array_or_item(data):
"""Return the given values as a numpy array, or as an individual item if
it's a 0d datetime64 or timedelta64 array.
Importantly, this function does not copy data if it is already an ndarray -
otherwise, it will not be possible to update Variable values in place.
This function mostly exists because 0-dimensional ndarrays with
dtype=datetime64 are broken :(
https://github.com/numpy/numpy/issues/4337
https://github.com/numpy/numpy/issues/7619
TODO: remove this (replace with np.asarray) once these issues are fixed
"""
data = np.asarray(data)
if data.ndim == 0:
if data.dtype.kind == 'M':
data = np.datetime64(data, 'ns')
elif data.dtype.kind == 'm':
data = np.timedelta64(data, 'ns')
return data
class Variable(common.AbstractArray, utils.NdimSizeLenMixin):
"""A netcdf-like variable consisting of dimensions, data and attributes
which describe a single Array. A single Variable object is not fully
described outside the context of its parent Dataset (if you want such a
fully described object, use a DataArray instead).
The main functional difference between Variables and numpy arrays is that
numerical operations on Variables implement array broadcasting by dimension
name. For example, adding a Variable with dimensions `('time',)` to
another Variable with dimensions `('space',)` results in a new Variable
with dimensions `('time', 'space')`. Furthermore, numpy reduce operations
like ``mean`` or ``sum`` are overwritten to take a "dimension" argument
instead of an "axis".
Variables are light-weight objects used as the building block for datasets.
They are more primitive objects, so operations with them provide marginally
higher performance than using DataArrays. However, manipulating data in the
form of a Dataset or DataArray should almost always be preferred, because
they can use more complete metadata in context of coordinate labels.
"""
def __init__(self, dims, data, attrs=None, encoding=None, fastpath=False):
"""
Parameters
----------
dims : str or sequence of str
Name(s) of the data dimension(s). Must be either a string (only
for 1D data) or a sequence of strings with length equal to the
number of dimensions.
data : array_like
Data array which supports numpy-like data access.
attrs : dict_like or None, optional
Attributes to assign to the new variable. If None (default), an
empty attribute dictionary is initialized.
encoding : dict_like or None, optional
Dictionary specifying how to encode this array's data into a
serialized format like netCDF4. Currently used keys (for netCDF)
include '_FillValue', 'scale_factor', 'add_offset' and 'dtype'.
Well-behaved code to serialize a Variable should ignore
unrecognized encoding items.
"""
self._data = as_compatible_data(data, fastpath=fastpath)
self._dims = self._parse_dimensions(dims)
self._attrs = None
self._encoding = None
if attrs is not None:
self.attrs = attrs
if encoding is not None:
self.encoding = encoding
@property
def dtype(self):
return self._data.dtype
@property
def shape(self):
return self._data.shape
@property
def nbytes(self):
return self.size * self.dtype.itemsize
@property
def _in_memory(self):
return (isinstance(self._data, (np.ndarray, np.number, PandasIndexAdapter)) or
(isinstance(self._data, indexing.MemoryCachedArray) and
isinstance(self._data.array, indexing.NumpyIndexingAdapter)))
@property
def data(self):
if isinstance(self._data, dask_array_type):
return self._data
else:
return self.values
@data.setter
def data(self, data):
data = as_compatible_data(data)
if data.shape != self.shape:
raise ValueError(
"replacement data must match the Variable's shape")
self._data = data
@property
def _indexable_data(self):
return as_indexable(self._data)
def load(self, **kwargs):
"""Manually trigger loading of this variable's data from disk or a
remote source into memory and return this variable.
Normally, it should not be necessary to call this method in user code,
because all xarray functions should either work on deferred data or
load data automatically.
Parameters
----------
**kwargs : dict
Additional keyword arguments passed on to ``dask.array.compute``.
See Also
--------
dask.array.compute
"""
if isinstance(self._data, dask_array_type):
self._data = as_compatible_data(self._data.compute(**kwargs))
elif not isinstance(self._data, np.ndarray):
self._data = np.asarray(self._data)
return self
def compute(self, **kwargs):
"""Manually trigger loading of this variable's data from disk or a
remote source into memory and return a new variable. The original is
left unaltered.
Normally, it should not be necessary to call this method in user code,
because all xarray functions should either work on deferred data or
load data automatically.
Parameters
----------
**kwargs : dict
Additional keyword arguments passed on to ``dask.array.compute``.
See Also
--------
dask.array.compute
"""
new = self.copy(deep=False)
return new.load(**kwargs)
def __dask_graph__(self):
if isinstance(self._data, dask_array_type):
return self._data.__dask_graph__()
else:
return None
def __dask_keys__(self):
return self._data.__dask_keys__()
@property
def __dask_optimize__(self):
return self._data.__dask_optimize__
@property
def __dask_scheduler__(self):
return self._data.__dask_scheduler__
def __dask_postcompute__(self):
array_func, array_args = self._data.__dask_postcompute__()
return self._dask_finalize, (array_func, array_args, self._dims,
self._attrs, self._encoding)
def __dask_postpersist__(self):
array_func, array_args = self._data.__dask_postpersist__()
return self._dask_finalize, (array_func, array_args, self._dims,
self._attrs, self._encoding)
@staticmethod
def _dask_finalize(results, array_func, array_args, dims, attrs, encoding):
if isinstance(results, dict): # persist case
name = array_args[0]
results = {k: v for k, v in results.items() if k[0] == name} # cull
data = array_func(results, *array_args)
return Variable(dims, data, attrs=attrs, encoding=encoding)
@property
def values(self):
"""The variable's data as a numpy.ndarray"""
return _as_array_or_item(self._data)
@values.setter
def values(self, values):
self.data = values
def to_base_variable(self):
"""Return this variable as a base xarray.Variable"""
return Variable(self.dims, self._data, self._attrs,
encoding=self._encoding, fastpath=True)
to_variable = utils.alias(to_base_variable, 'to_variable')
def to_index_variable(self):
"""Return this variable as an xarray.IndexVariable"""
return IndexVariable(self.dims, self._data, self._attrs,
encoding=self._encoding, fastpath=True)
to_coord = utils.alias(to_index_variable, 'to_coord')
def to_index(self):
"""Convert this variable to a pandas.Index"""
return self.to_index_variable().to_index()
@property
def dims(self):
"""Tuple of dimension names with which this variable is associated.
"""
return self._dims
def _parse_dimensions(self, dims):
if isinstance(dims, basestring):
dims = (dims,)
dims = tuple(dims)
if len(dims) != self.ndim:
raise ValueError('dimensions %s must have the same length as the '
'number of data dimensions, ndim=%s'
% (dims, self.ndim))
return dims
@dims.setter
def dims(self, value):
self._dims = self._parse_dimensions(value)
def _item_key_to_tuple(self, key):
if utils.is_dict_like(key):
return tuple(key.get(dim, slice(None)) for dim in self.dims)
else:
return key
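A minimal illustration of the dict-to-tuple conversion above, with assumed dimensions `('x', 'y')`:

```python
dims = ('x', 'y')
key = {'y': 0}

# dimensions not named in the dict get a full slice
expanded = tuple(key.get(dim, slice(None)) for dim in dims)
```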
def _broadcast_indexes(self, key):
"""Prepare an indexing key for an indexing operation.
Parameters
----------
key: int, slice, array, dict or tuple of integer, slices and arrays
Any valid input for indexing.
Returns
-------
dims: tuple
Dimension of the resultant variable.
indexers: IndexingTuple subclass
Tuple of integer, array-like, or slices to use when indexing
self._data. The type of this argument indicates the type of
indexing to perform, either basic, outer or vectorized.
new_order : Optional[Sequence[int]]
Optional reordering to do on the result of indexing. If not None,
the first len(new_order) indexing should be moved to these
positions.
"""
key = self._item_key_to_tuple(key) # key is a tuple
# key is a tuple of full size
key = indexing.expanded_indexer(key, self.ndim)
# Convert a 0d Variable to a plain scalar
key = tuple(
k.data.item() if isinstance(k, Variable) and k.ndim == 0 else k
for k in key)
if all(isinstance(k, BASIC_INDEXING_TYPES) for k in key):
return self._broadcast_indexes_basic(key)
self._validate_indexers(key)
# Detect whether the key can be mapped as an outer indexer:
# if all keys are unlabeled arrays, the key can be mapped as an
# OuterIndexer.
if all(not isinstance(k, Variable) for k in key):
return self._broadcast_indexes_outer(key)
# If all keys are 1-dimensional and there are no duplicate labels,
# key can be mapped as an OuterIndexer.
dims = []
for k, d in zip(key, self.dims):
if isinstance(k, Variable):
if len(k.dims) > 1:
return self._broadcast_indexes_vectorized(key)
dims.append(k.dims[0])
elif not isinstance(k, integer_types):
dims.append(d)
if len(set(dims)) == len(dims):
return self._broadcast_indexes_outer(key)
return self._broadcast_indexes_vectorized(key)
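The outer-vs-vectorized distinction decided above has a direct NumPy analogue: outer (orthogonal) indexing corresponds to `np.ix_`, while vectorized indexing is NumPy's pointwise fancy indexing:

```python
import numpy as np

a = np.arange(12).reshape(3, 4)
rows = np.array([0, 2])
cols = np.array([1, 3])

outer = a[np.ix_(rows, cols)]   # orthogonal: every row x every col, shape (2, 2)
pointwise = a[rows, cols]       # vectorized: element-by-element pairs, shape (2,)
```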
def _broadcast_indexes_basic(self, key):
dims = tuple(dim for k, dim in zip(key, self.dims)
if not isinstance(k, integer_types))
return dims, BasicIndexer(key), None
def _validate_indexers(self, key):
""" Make sanity checks """
for dim, k in zip(self.dims, key):
if isinstance(k, BASIC_INDEXING_TYPES):
pass
else:
if not isinstance(k, Variable):
k = np.asarray(k)
if k.ndim > 1:
raise IndexError(
"Unlabeled multi-dimensional array cannot be "
"used for indexing: {}".format(k))
if k.dtype.kind == 'b':
if self.shape[self.get_axis_num(dim)] != len(k):
raise IndexError(
"Boolean array size {0:d} is used to index array "
"with shape {1:s}.".format(len(k),
str(self.shape)))
if k.ndim > 1:
raise IndexError("{}-dimensional boolean indexing is "
"not supported. ".format(k.ndim))
if getattr(k, 'dims', (dim, )) != (dim, ):
raise IndexError(
"Boolean indexer should be unlabeled or on the "
"same dimension to the indexed array. Indexer is "
"on {0:s} but the target dimension is "
"{1:s}.".format(str(k.dims), dim))
def _broadcast_indexes_outer(self, key):
dims = tuple(k.dims[0] if isinstance(k, Variable) else dim
for k, dim in zip(key, self.dims)
if not isinstance(k, integer_types))
new_key = []
for k in key:
if isinstance(k, Variable):
k = k.data
if not isinstance(k, BASIC_INDEXING_TYPES):
k = np.asarray(k)
if k.dtype.kind == 'b':
(k,) = np.nonzero(k)
new_key.append(k)
return dims, OuterIndexer(tuple(new_key)), None
def _nonzero(self):
""" Equivalent numpy's nonzero but returns a tuple of Varibles. """
# TODO we should replace dask's native nonzero
# after https://github.com/dask/dask/issues/1076 is implemented.
nonzeros = np.nonzero(self.data)
return tuple(Variable((dim,), nz) for nz, dim
in zip(nonzeros, self.dims))
def _broadcast_indexes_vectorized(self, key):
variables = []
out_dims_set = OrderedSet()
for dim, value in zip(self.dims, key):
if isinstance(value, slice):
out_dims_set.add(dim)
else:
variable = (value if isinstance(value, Variable) else
as_variable(value, name=dim))
if variable.dtype.kind == 'b': # boolean indexing case
(variable,) = variable._nonzero()
variables.append(variable)
out_dims_set.update(variable.dims)
variable_dims = set()
for variable in variables:
variable_dims.update(variable.dims)
slices = []
for i, (dim, value) in enumerate(zip(self.dims, key)):
if isinstance(value, slice):
if dim in variable_dims:
# We only convert slice objects to variables if they share
# a dimension with at least one other variable. Otherwise,
# we can equivalently leave them as slices and transpose
# the result. This is significantly faster/more efficient
# for most array backends.
values = np.arange(*value.indices(self.sizes[dim]))
variables.insert(i - len(slices), Variable((dim,), values))
else:
slices.append((i, value))
try:
variables = _broadcast_compat_variables(*variables)
except ValueError:
raise IndexError("Dimensions of indexers mismatch: {}".format(key))
out_key = [variable.data for variable in variables]
out_dims = tuple(out_dims_set)
slice_positions = set()
for i, value in slices:
out_key.insert(i, value)
new_position = out_dims.index(self.dims[i])
slice_positions.add(new_position)
if slice_positions:
new_order = [i for i in range(len(out_dims))
if i not in slice_positions]
else:
new_order = None
return out_dims, VectorizedIndexer(tuple(out_key)), new_order
def __getitem__(self, key):
"""Return a new Array object whose contents are consistent with
getting the provided key from the underlying data.
NB. __getitem__ and __setitem__ implement xarray-style indexing,
where if keys are unlabeled arrays, we index the array orthogonally
with them. If keys are labeled array (such as Variables), they are
broadcasted with our usual scheme and then the array is indexed with
the broadcasted key, like numpy's fancy indexing.
If you really want to do indexing like `x[x > 0]`, manipulate the numpy
array `x.values` directly.
"""
dims, index_tuple, new_order = self._broadcast_indexes(key)
data = self._indexable_data[index_tuple]
if new_order:
data = np.moveaxis(data, range(len(new_order)), new_order)
return type(self)(dims, data, self._attrs, self._encoding,
fastpath=True)
def __setitem__(self, key, value):
"""__setitem__ is overloaded to access the underlying numpy values with
orthogonal indexing.
See __getitem__ for more details.
"""
dims, index_tuple, new_order = self._broadcast_indexes(key)
if isinstance(value, Variable):
value = value.set_dims(dims).data
if new_order:
value = duck_array_ops.asarray(value)
if value.ndim > len(dims):
raise ValueError(
'shape mismatch: value array of shape %s could not be '
'broadcast to indexing result with %s dimensions'
% (value.shape, len(dims)))
value = value[(len(dims) - value.ndim) * (np.newaxis,) +
(Ellipsis,)]
value = np.moveaxis(value, new_order, range(len(new_order)))
self._indexable_data[index_tuple] = value
@property
def attrs(self):
"""Dictionary of local attributes on this variable.
"""
if self._attrs is None:
self._attrs = OrderedDict()
return self._attrs
@attrs.setter
def attrs(self, value):
self._attrs = OrderedDict(value)
@property
def encoding(self):
"""Dictionary of encodings on this variable.
"""
if self._encoding is None:
self._encoding = {}
return self._encoding
@encoding.setter
def encoding(self, value):
try:
self._encoding = dict(value)
except ValueError:
raise ValueError('encoding must be castable to a dictionary')
def copy(self, deep=True):
"""Returns a copy of this object.
If `deep=True`, the data array is loaded into memory and copied onto
the new object. Dimensions, attributes and encodings are always copied.
"""
data = self._data
if isinstance(data, indexing.MemoryCachedArray):
# don't share caching between copies
data = indexing.MemoryCachedArray(data.array)
if deep:
if isinstance(data, dask_array_type):
data = data.copy()
elif not isinstance(data, PandasIndexAdapter):
# pandas.Index is immutable
data = np.array(data)
# note:
# dims is already an immutable tuple
# attributes and encoding will be copied when the new Array is created
return type(self)(self.dims, data, self._attrs, self._encoding,
fastpath=True)
def __copy__(self):
return self.copy(deep=False)
def __deepcopy__(self, memo=None):
# memo does nothing but is required for compatibility with
# copy.deepcopy
return self.copy(deep=True)
# mutable objects should not be hashable
__hash__ = None
@property
def chunks(self):
"""Block dimensions for this array's data or None if it's not a dask
array.
"""
return getattr(self._data, 'chunks', None)
_array_counter = itertools.count()
def chunk(self, chunks=None, name=None, lock=False):
"""Coerce this array's data into a dask arrays with the given chunks.
If this variable is a non-dask array, it will be converted to dask
array. If it's a dask array, it will be rechunked to the given chunk
sizes.
If chunks is not provided for one or more dimensions, chunk
sizes along those dimensions will not be updated; non-dask arrays will be
converted into dask arrays with a single block.
Parameters
----------
chunks : int, tuple or dict, optional
Chunk sizes along each dimension, e.g., ``5``, ``(5, 5)`` or
``{'x': 5, 'y': 5}``.
name : str, optional
Used to generate the name for this array in the internal dask
graph. Does not need to be unique.
lock : optional
Passed on to :py:func:`dask.array.from_array`, if the array is not
already a dask array.
Returns
-------
chunked : xarray.Variable
"""
import dask.array as da
if utils.is_dict_like(chunks):
chunks = dict((self.get_axis_num(dim), chunk)
for dim, chunk in chunks.items())
if chunks is None:
chunks = self.chunks or self.shape
data = self._data
if isinstance(data, da.Array):
data = data.rechunk(chunks)
else:
if utils.is_dict_like(chunks):
chunks = tuple(chunks.get(n, s)
for n, s in enumerate(self.shape))
# da.from_array works by using lazily indexing with a tuple of
# slices. Using OuterIndexer is a pragmatic choice: dask does not
# yet handle different indexing types in an explicit way:
# https://github.com/dask/dask/issues/2883
data = indexing.ImplicitToExplicitIndexingAdapter(
data, indexing.OuterIndexer)
data = da.from_array(data, chunks, name=name, lock=lock)
return type(self)(self.dims, data, self._attrs, self._encoding,
fastpath=True)
def isel(self, **indexers):
"""Return a new array indexed along the specified dimension(s).
Parameters
----------
**indexers : {dim: indexer, ...}
Keyword arguments with names matching dimensions and values given
by integers, slice objects or arrays.
Returns
-------
obj : Array object
A new Array with the selected data and dimensions. In general,
the new variable's data will be a view of this variable's data,
unless numpy fancy indexing was triggered by using an array
indexer, in which case the data will be a copy.
"""
invalid = [k for k in indexers if k not in self.dims]
if invalid:
raise ValueError("dimensions %r do not exist" % invalid)
key = [slice(None)] * self.ndim
for i, dim in enumerate(self.dims):
if dim in indexers:
key[i] = indexers[dim]
return self[tuple(key)]
def squeeze(self, dim=None):
"""Return a new object with squeezed data.
Parameters
----------
dim : None or str or tuple of str, optional
Selects a subset of the length one dimensions. If a dimension is
selected with length greater than one, an error is raised. If
None, all length one dimensions are squeezed.
Returns
-------
squeezed : same type as caller
This object, but with all or a subset of the dimensions of
length 1 removed.
See Also
--------
numpy.squeeze
"""
dims = common.get_squeeze_dims(self, dim)
return self.isel(**{d: 0 for d in dims})
def _shift_one_dim(self, dim, count):
axis = self.get_axis_num(dim)
if count > 0:
keep = slice(None, -count)
elif count < 0:
keep = slice(-count, None)
else:
keep = slice(None)
trimmed_data = self[(slice(None),) * axis + (keep,)].data
dtype, fill_value = dtypes.maybe_promote(self.dtype)
shape = list(self.shape)
shape[axis] = min(abs(count), shape[axis])
if isinstance(trimmed_data, dask_array_type):
chunks = list(trimmed_data.chunks)
chunks[axis] = (shape[axis],)
full = functools.partial(da.full, chunks=chunks)
else:
full = np.full
nans = full(shape, fill_value, dtype=dtype)
if count > 0:
arrays = [nans, trimmed_data]
else:
arrays = [trimmed_data, nans]
data = duck_array_ops.concatenate(arrays, axis)
if isinstance(data, dask_array_type):
# chunked data should come out with the same chunks; this makes
# it feasible to combine shifted and unshifted data
# TODO: remove this once dask.array automatically aligns chunks
data = data.rechunk(self.data.chunks)
return type(self)(self.dims, data, self._attrs, fastpath=True)
def shift(self, **shifts):
"""
Return a new Variable with shifted data.
Parameters
----------
**shifts : keyword arguments of the form {dim: offset}
Integer offset to shift along each of the given dimensions.
Positive offsets shift to the right; negative offsets shift to the
left.
Returns
-------
shifted : Variable
Variable with the same dimensions and attributes but shifted data.
"""
result = self
for dim, count in shifts.items():
result = result._shift_one_dim(dim, count)
return result
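`_shift_one_dim` above trims `count` elements off one end and pads the other with a promoted fill value; a 1-D NumPy sketch (the helper name here is illustrative):

```python
import numpy as np

def shift_1d(arr, count):
    # fill with NaN, as a float array's promoted fill value
    out = np.full(arr.shape, np.nan, dtype=float)
    if count > 0:
        out[count:] = arr[:-count]   # shift right, pad the left
    elif count < 0:
        out[:count] = arr[-count:]   # shift left, pad the right
    else:
        out[:] = arr
    return out

shifted = shift_1d(np.array([1.0, 2.0, 3.0, 4.0]), 2)  # [nan, nan, 1., 2.]
```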
def _roll_one_dim(self, dim, count):
axis = self.get_axis_num(dim)
count %= self.shape[axis]
if count != 0:
indices = [slice(-count, None), slice(None, -count)]
else:
indices = [slice(None)]
arrays = [self[(slice(None),) * axis + (idx,)].data
for idx in indices]
data = duck_array_ops.concatenate(arrays, axis)
if isinstance(data, dask_array_type):
# chunked data should come out with the same chunks; this makes
# it feasible to combine shifted and unshifted data
# TODO: remove this once dask.array automatically aligns chunks
data = data.rechunk(self.data.chunks)
return type(self)(self.dims, data, self._attrs, fastpath=True)
def roll(self, **shifts):
"""
Return a new Variable with rolled data.
Parameters
----------
**shifts : keyword arguments of the form {dim: offset}
Integer offset to roll along each of the given dimensions.
Positive offsets roll to the right; negative offsets roll to the
left.
Returns
-------
rolled : Variable
Variable with the same dimensions and attributes but rolled data.
"""
result = self
for dim, count in shifts.items():
result = result._roll_one_dim(dim, count)
return result
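Rolling is shifting with wrap-around; note the `count %= self.shape[axis]` above, which makes offsets larger than the dimension size wrap. A 1-D sketch of the concatenation trick:

```python
import numpy as np

arr = np.array([1, 2, 3, 4])
count = 5 % arr.shape[0]   # an offset of 5 behaves like 1
# tail moves to the front, the rest follows
rolled = np.concatenate([arr[-count:], arr[:-count]])
```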
def transpose(self, *dims):
"""Return a new Variable object with transposed dimensions.
Parameters
----------
*dims : str, optional
By default, reverse the dimensions. Otherwise, reorder the
dimensions to this order.
Returns
-------
transposed : Variable
The returned object has transposed data and dimensions with the
same attributes as the original.
Notes
-----
Although this operation returns a view of this variable's data, it is
not lazy -- the data will be fully loaded.
See Also
--------
numpy.transpose
"""
if len(dims) == 0:
dims = self.dims[::-1]
axes = self.get_axis_num(dims)
if len(dims) < 2: # no need to transpose if only one dimension
return self.copy(deep=False)
data = duck_array_ops.transpose(self.data, axes)
return type(self)(dims, data, self._attrs, self._encoding,
fastpath=True)
def expand_dims(self, *args):
import warnings
warnings.warn('Variable.expand_dims is deprecated: use '
'Variable.set_dims instead', DeprecationWarning,
stacklevel=2)
return self.set_dims(*args)
def set_dims(self, dims, shape=None):
"""Return a new variable with given set of dimensions.
This method might be used to attach new dimension(s) to variable.
When possible, this operation does not copy this variable's data.
Parameters
----------
dims : str or sequence of str or dict
Dimensions to include on the new variable. If a dict, values are
used to provide the sizes of new dimensions; otherwise, new
dimensions are inserted with length 1.
Returns
-------
Variable
"""
if isinstance(dims, basestring):
dims = [dims]
if shape is None and utils.is_dict_like(dims):
shape = dims.values()
missing_dims = set(self.dims) - set(dims)
if missing_dims:
raise ValueError('new dimensions %r must be a superset of '
'existing dimensions %r' % (dims, self.dims))
self_dims = set(self.dims)
expanded_dims = tuple(
d for d in dims if d not in self_dims) + self.dims
if self.dims == expanded_dims:
# don't use broadcast_to unless necessary so the result remains
# writeable if possible
expanded_data = self.data
elif shape is not None:
dims_map = dict(zip(dims, shape))
tmp_shape = tuple(dims_map[d] for d in expanded_dims)
expanded_data = duck_array_ops.broadcast_to(self.data, tmp_shape)
else:
expanded_data = self.data[
(None,) * (len(expanded_dims) - self.ndim)]
expanded_var = Variable(expanded_dims, expanded_data, self._attrs,
self._encoding, fastpath=True)
return expanded_var.transpose(*dims)
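The broadcasting path of `set_dims` can be sketched with plain NumPy: new dimensions are prepended and the data is broadcast (as a read-only view, not a copy) to the requested shape:

```python
import numpy as np

data = np.arange(3)                       # imagine dims ('x',)
expanded = np.broadcast_to(data, (2, 3))  # add a leading dim of size 2, no copy
```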
def _stack_once(self, dims, new_dim):
if not set(dims) <= set(self.dims):
raise ValueError('invalid existing dimensions: %s' % dims)
if new_dim in self.dims:
raise ValueError('cannot create a new dimension with the same '
'name as an existing dimension')
if len(dims) == 0:
# don't stack
return self.copy(deep=False)
other_dims = [d for d in self.dims if d not in dims]
dim_order = other_dims + list(dims)
reordered = self.transpose(*dim_order)
new_shape = reordered.shape[:len(other_dims)] + (-1,)
new_data = reordered.data.reshape(new_shape)
new_dims = reordered.dims[:len(other_dims)] + (new_dim,)
return Variable(new_dims, new_data, self._attrs, self._encoding,
fastpath=True)
def stack(self, **dimensions):
"""
Stack any number of existing dimensions into a single new dimension.
New dimensions will be added at the end, and the order of the data
along each new dimension will be in contiguous (C) order.
Parameters
----------
**dimensions : keyword arguments of the form new_name=(dim1, dim2, ...)
Names of new dimensions, and the existing dimensions that they
replace.
Returns
-------
stacked : Variable
Variable with the same attributes but stacked data.
See also
--------
Variable.unstack
"""
result = self
for new_dim, dims in dimensions.items():
result = result._stack_once(dims, new_dim)
return result
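`_stack_once` is essentially a transpose followed by a reshape in contiguous (C) order; a NumPy sketch of stacking dims `('x', 'y')` into one dimension, and the inverse reshape:

```python
import numpy as np

data = np.arange(6).reshape(2, 3)   # dims ('x', 'y')
stacked = data.reshape(-1)          # stack ('x', 'y') -> one dim, C order
unstacked = stacked.reshape(2, 3)   # the inverse operation
```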
def _unstack_once(self, dims, old_dim):
new_dim_names = tuple(dims.keys())
new_dim_sizes = tuple(dims.values())
if old_dim not in self.dims:
raise ValueError('invalid existing dimension: %s' % old_dim)
if set(new_dim_names).intersection(self.dims):
raise ValueError('cannot create a new dimension with the same '
'name as an existing dimension')
if np.prod(new_dim_sizes) != self.sizes[old_dim]:
raise ValueError('the product of the new dimension sizes must '
'equal the size of the old dimension')
other_dims = [d for d in self.dims if d != old_dim]
dim_order = other_dims + [old_dim]
reordered = self.transpose(*dim_order)
new_shape = reordered.shape[:len(other_dims)] + new_dim_sizes
new_data = reordered.data.reshape(new_shape)
new_dims = reordered.dims[:len(other_dims)] + new_dim_names
return Variable(new_dims, new_data, self._attrs, self._encoding,
fastpath=True)
def unstack(self, **dimensions):
"""
Unstack an existing dimension into multiple new dimensions.
New dimensions will be added at the end, and the order of the data
along each new dimension will be in contiguous (C) order.
Parameters
----------
**dimensions : keyword arguments of the form old_dim={dim1: size1, ...}
Names of existing dimensions, and the new dimensions and sizes that they
map to.
Returns
-------
unstacked : Variable
Variable with the same attributes but unstacked data.
See also
--------
Variable.stack
"""
result = self
for old_dim, dims in dimensions.items():
result = result._unstack_once(dims, old_dim)
return result
def fillna(self, value):
return ops.fillna(self, value)
def where(self, cond, other=dtypes.NA):
return ops.where_method(self, cond, other)
def reduce(self, func, dim=None, axis=None, keep_attrs=False,
allow_lazy=False, **kwargs):
"""Reduce this array by applying `func` along some dimension(s).
Parameters
----------
func : function
Function which can be called in the form
`func(x, axis=axis, **kwargs)` to return the result of reducing an
np.ndarray over an integer valued axis.
dim : str or sequence of str, optional
Dimension(s) over which to apply `func`.
axis : int or sequence of int, optional
Axis(es) over which to apply `func`. Only one of the 'dim'
and 'axis' arguments can be supplied. If neither are supplied, then
the reduction is calculated over the flattened array (by calling
`func(x)` without an axis argument).
keep_attrs : bool, optional
If True, the variable's attributes (`attrs`) will be copied from
the original object to the new one. If False (default), the new
object will be returned without attributes.
**kwargs : dict
Additional keyword arguments passed on to `func`.
Returns
-------
reduced : Array
Array with summarized data and the indicated dimension(s)
removed.
"""
if dim is not None and axis is not None:
raise ValueError("cannot supply both 'axis' and 'dim' arguments")
if getattr(func, 'keep_dims', False):
if dim is None and axis is None:
raise ValueError("must supply either single 'dim' or 'axis' "
"argument to %s" % (func.__name__))
if dim is not None:
axis = self.get_axis_num(dim)
data = func(self.data if allow_lazy else self.values,
axis=axis, **kwargs)
if getattr(data, 'shape', ()) == self.shape:
dims = self.dims
else:
removed_axes = (range(self.ndim) if axis is None
else np.atleast_1d(axis) % self.ndim)
dims = [adim for n, adim in enumerate(self.dims)
if n not in removed_axes]
attrs = self._attrs if keep_attrs else None
return Variable(dims, data, attrs=attrs)
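A minimal standalone sketch of the dim-to-axis translation performed in `reduce`, assuming dims `('x', 'y')`:

```python
import numpy as np

dims = ('x', 'y')
data = np.arange(6).reshape(2, 3)

axis = dims.index('x')              # a stand-in for get_axis_num
reduced = np.sum(data, axis=axis)   # reduce over 'x'
result_dims = tuple(d for i, d in enumerate(dims) if i != axis)
```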
@classmethod
def concat(cls, variables, dim='concat_dim', positions=None,
shortcut=False):
"""Concatenate variables along a new or existing dimension.
Parameters
----------
variables : iterable of Array
Arrays to stack together. Each variable is expected to have
matching dimensions and shape except for along the stacked
dimension.
dim : str or DataArray, optional
Name of the dimension to stack along. This can either be a new
dimension name, in which case it is added along axis=0, or an
existing dimension name, in which case the location of the
dimension is unchanged. Where to insert the new dimension is
determined by the first variable.
positions : None or list of integer arrays, optional
List of integer arrays which specifies the integer positions to which
to assign each dataset along the concatenated dimension. If not
supplied, objects are concatenated in the provided order.
shortcut : bool, optional
This option is used internally to speed-up groupby operations.
If `shortcut` is True, some checks of internal consistency between
arrays to concatenate are skipped.
Returns
-------
stacked : Variable
Concatenated Variable formed by stacking all the supplied variables
along the given dimension.
"""
if not isinstance(dim, basestring):
dim, = dim.dims
# can't do this lazily: we need to loop through variables at least
# twice
variables = list(variables)
first_var = variables[0]
arrays = [v.data for v in variables]
# TODO: use our own type promotion rules to ensure that
# [str, float] -> object, not str like numpy
if dim in first_var.dims:
axis = first_var.get_axis_num(dim)
dims = first_var.dims
data = duck_array_ops.concatenate(arrays, axis=axis)
if positions is not None:
# TODO: deprecate this option -- we don't need it for groupby
# any more.
indices = nputils.inverse_permutation(
np.concatenate(positions))
data = duck_array_ops.take(data, indices, axis=axis)
else:
axis = 0
dims = (dim,) + first_var.dims
data = duck_array_ops.stack(arrays, axis=axis)
attrs = OrderedDict(first_var.attrs)
encoding = OrderedDict(first_var.encoding)
if not shortcut:
for var in variables:
if var.dims != first_var.dims:
raise ValueError('inconsistent dimensions')
utils.remove_incompatible_items(attrs, var.attrs)
return cls(dims, data, attrs, encoding)
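The two branches of `concat` above (new dimension vs existing dimension) map directly onto `np.stack` and `np.concatenate`:

```python
import numpy as np

a = np.arange(3)
b = np.arange(3, 6)

along_new = np.stack([a, b], axis=0)      # new dim: shape (2, 3)
along_existing = np.concatenate([a, b])   # existing dim: shape (6,)
```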
def equals(self, other, equiv=duck_array_ops.array_equiv):
"""True if two Variables have the same dimensions and values;
otherwise False.
Variables can still be equal (like pandas objects) if they have NaN
values in the same locations.
This method is necessary because `v1 == v2` for Variables
does element-wise comparisons (like numpy.ndarrays).
"""
other = getattr(other, 'variable', other)
try:
return (self.dims == other.dims and
(self._data is other._data or
equiv(self.data, other.data)))
except (TypeError, AttributeError):
return False
def broadcast_equals(self, other, equiv=duck_array_ops.array_equiv):
"""True if two Variables have the values after being broadcast against
each other; otherwise False.
Variables can still be equal (like pandas objects) if they have NaN
values in the same locations.
"""
try:
self, other = broadcast_variables(self, other)
except (ValueError, AttributeError):
return False
return self.equals(other, equiv=equiv)
def identical(self, other):
"""Like equals, but also checks attributes.
"""
try:
return (utils.dict_equiv(self.attrs, other.attrs) and
self.equals(other))
except (TypeError, AttributeError):
return False
def no_conflicts(self, other):
"""True if the intersection of two Variable's non-null data is
equal; otherwise false.
Variables can thus still be equal if there are locations where either,
or both, contain NaN values.
"""
return self.broadcast_equals(
other, equiv=duck_array_ops.array_notnull_equiv)
def quantile(self, q, dim=None, interpolation='linear'):
"""Compute the qth quantile of the data along the specified dimension.
Returns the qth quantile(s) of the array elements.
Parameters
----------
q : float in range of [0,1] (or sequence of floats)
Quantile to compute, which must be between 0 and 1
inclusive.
dim : str or sequence of str, optional
Dimension(s) over which to apply quantile.
interpolation : {'linear', 'lower', 'higher', 'midpoint', 'nearest'}
This optional parameter specifies the interpolation method to
use when the desired quantile lies between two data points
``i < j``:
* linear: ``i + (j - i) * fraction``, where ``fraction`` is
the fractional part of the index surrounded by ``i`` and
``j``.
* lower: ``i``.
* higher: ``j``.
* nearest: ``i`` or ``j``, whichever is nearest.
* midpoint: ``(i + j) / 2``.
Returns
-------
quantiles : Variable
If `q` is a single quantile, then the result
is a scalar. If multiple quantiles are given, the first axis of
the result corresponds to the quantile and a quantile dimension
is added to the return array. The other dimensions are the
dimensions that remain after the reduction of the array.
See Also
--------
numpy.nanpercentile, pandas.Series.quantile, Dataset.quantile,
DataArray.quantile
"""
if isinstance(self.data, dask_array_type):
raise TypeError("quantile does not work for arrays stored as dask "
"arrays. Load the data via .compute() or .load() "
"prior to calling this method.")
if LooseVersion(np.__version__) < LooseVersion('1.10.0'):
raise NotImplementedError(
'quantile requires numpy version 1.10.0 or later')
q = np.asarray(q, dtype=np.float64)
new_dims = list(self.dims)
if dim is not None:
axis = self.get_axis_num(dim)
if utils.is_scalar(dim):
new_dims.remove(dim)
else:
for d in dim:
new_dims.remove(d)
else:
axis = None
new_dims = []
# only add the quantile dimension if q is array like
if q.ndim != 0:
new_dims = ['quantile'] + new_dims
qs = np.nanpercentile(self.data, q * 100., axis=axis,
interpolation=interpolation)
return Variable(new_dims, qs)
@property
def real(self):
return type(self)(self.dims, self.data.real, self._attrs)
@property
def imag(self):
return type(self)(self.dims, self.data.imag, self._attrs)
def __array_wrap__(self, obj, context=None):
return Variable(self.dims, obj)
@staticmethod
def _unary_op(f):
@functools.wraps(f)
def func(self, *args, **kwargs):
with np.errstate(all='ignore'):
return self.__array_wrap__(f(self.data, *args, **kwargs))
return func
@staticmethod
def _binary_op(f, reflexive=False, **ignored_kwargs):
@functools.wraps(f)
def func(self, other):
if isinstance(other, (xr.DataArray, xr.Dataset)):
return NotImplemented
self_data, other_data, dims = _broadcast_compat_data(self, other)
with np.errstate(all='ignore'):
new_data = (f(self_data, other_data)
if not reflexive
else f(other_data, self_data))
result = Variable(dims, new_data)
return result
return func
@staticmethod
def _inplace_binary_op(f):
@functools.wraps(f)
def func(self, other):
if isinstance(other, xr.Dataset):
raise TypeError('cannot add a Dataset to a Variable in-place')
self_data, other_data, dims = _broadcast_compat_data(self, other)
if dims != self.dims:
raise ValueError('dimensions cannot change for in-place '
'operations')
with np.errstate(all='ignore'):
self.values = f(self_data, other_data)
return self
return func
ops.inject_all_ops_and_reduce_methods(Variable)
class IndexVariable(Variable):
"""Wrapper for accommodating a pandas.Index in an xarray.Variable.
IndexVariable objects preserve loaded values in the form of a pandas.Index
instead of a NumPy array. Hence, their values are immutable and must always
be one-dimensional.
They also have a name property, which is the name of their sole dimension
unless another name is given.
"""
def __init__(self, dims, data, attrs=None, encoding=None, fastpath=False):
super(IndexVariable, self).__init__(dims, data, attrs, encoding,
fastpath)
if self.ndim != 1:
raise ValueError('%s objects must be 1-dimensional' %
type(self).__name__)
# Unlike in Variable, always eagerly load values into memory
if not isinstance(self._data, PandasIndexAdapter):
self._data = PandasIndexAdapter(self._data)
def load(self):
# data is already loaded into memory for IndexVariable
return self
@Variable.data.setter
def data(self, data):
Variable.data.fset(self, data)
if not isinstance(self._data, PandasIndexAdapter):
self._data = PandasIndexAdapter(self._data)
def chunk(self, chunks=None, name=None, lock=False):
# Dummy - do not chunk. This method is invoked e.g. by Dataset.chunk()
return self.copy(deep=False)
def __getitem__(self, key):
dims, index_tuple, new_order = self._broadcast_indexes(key)
values = self._indexable_data[index_tuple]
if getattr(values, 'ndim', 0) != 1:
# returns Variable rather than IndexVariable if multi-dimensional
return Variable(dims, values, self._attrs, self._encoding)
else:
return type(self)(dims, values, self._attrs,
self._encoding, fastpath=True)
def __setitem__(self, key, value):
raise TypeError('%s values cannot be modified' % type(self).__name__)
@classmethod
def concat(cls, variables, dim='concat_dim', positions=None,
shortcut=False):
"""Specialized version of Variable.concat for IndexVariable objects.
This exists because we want to avoid converting Index objects to NumPy
arrays, if possible.
"""
if not isinstance(dim, basestring):
dim, = dim.dims
variables = list(variables)
first_var = variables[0]
if any(not isinstance(v, cls) for v in variables):
raise TypeError('IndexVariable.concat requires that all input '
'variables be IndexVariable objects')
indexes = [v._data.array for v in variables]
if not indexes:
data = []
else:
data = indexes[0].append(indexes[1:])
if positions is not None:
indices = nputils.inverse_permutation(
np.concatenate(positions))
data = data.take(indices)
attrs = OrderedDict(first_var.attrs)
if not shortcut:
for var in variables:
if var.dims != first_var.dims:
raise ValueError('inconsistent dimensions')
utils.remove_incompatible_items(attrs, var.attrs)
return cls(first_var.dims, data, attrs)
def copy(self, deep=True):
"""Returns a copy of this object.
`deep` is ignored since data is stored in the form of pandas.Index,
which is already immutable. Dimensions, attributes and encodings are
always copied.
"""
return type(self)(self.dims, self._data, self._attrs,
self._encoding, fastpath=True)
def equals(self, other, equiv=None):
# if equiv is specified, super up
if equiv is not None:
return super(IndexVariable, self).equals(other, equiv)
# otherwise use the native index equals, rather than looking at _data
other = getattr(other, 'variable', other)
try:
return (self.dims == other.dims and
self._data_equals(other))
except (TypeError, AttributeError):
return False
def _data_equals(self, other):
return self.to_index().equals(other.to_index())
def to_index_variable(self):
"""Return this variable as an xarray.IndexVariable"""
return self
to_coord = utils.alias(to_index_variable, 'to_coord')
def to_index(self):
"""Convert this variable to a pandas.Index"""
# n.b. creating a new pandas.Index from an old pandas.Index is
# basically free as pandas.Index objects are immutable
assert self.ndim == 1
index = self._data.array
if isinstance(index, pd.MultiIndex):
# set default names for multi-index unnamed levels so that
# we can safely rename dimension / coordinate later
valid_level_names = [name or '{}_level_{}'.format(self.dims[0], i)
for i, name in enumerate(index.names)]
index = index.set_names(valid_level_names)
else:
index = index.set_names(self.name)
return index
@property
def level_names(self):
"""Return MultiIndex level names or None if this IndexVariable has no
MultiIndex.
"""
index = self.to_index()
if isinstance(index, pd.MultiIndex):
return index.names
else:
return None
def get_level_variable(self, level):
"""Return a new IndexVariable from a given MultiIndex level."""
if self.level_names is None:
raise ValueError("IndexVariable %r has no MultiIndex" % self.name)
index = self.to_index()
return type(self)(self.dims, index.get_level_values(level))
@property
def name(self):
return self.dims[0]
@name.setter
def name(self, value):
raise AttributeError('cannot modify name of IndexVariable in-place')
# for backwards compatibility
Coordinate = utils.alias(IndexVariable, 'Coordinate')
def _unified_dims(variables):
# validate dimensions
all_dims = OrderedDict()
for var in variables:
var_dims = var.dims
if len(set(var_dims)) < len(var_dims):
raise ValueError('broadcasting cannot handle duplicate '
'dimensions: %r' % list(var_dims))
for d, s in zip(var_dims, var.shape):
if d not in all_dims:
all_dims[d] = s
elif all_dims[d] != s:
raise ValueError('operands cannot be broadcast together '
'with mismatched lengths for dimension %r: %s'
% (d, (all_dims[d], s)))
return all_dims
def _broadcast_compat_variables(*variables):
dims = tuple(_unified_dims(variables))
return tuple(var.set_dims(dims) if var.dims != dims else var
for var in variables)
def broadcast_variables(*variables):
"""Given any number of variables, return variables with matching dimensions
and broadcast data.
The data on the returned variables will be a view of the data on the
corresponding original arrays, but dimensions will be reordered and
inserted so that both broadcast arrays have the same dimensions. The new
dimensions are sorted in order of appearance in the first variable's
dimensions followed by the second variable's dimensions.
"""
dims_map = _unified_dims(variables)
dims_tuple = tuple(dims_map)
return tuple(var.set_dims(dims_map) if var.dims != dims_tuple else var
for var in variables)
def _broadcast_compat_data(self, other):
if all(hasattr(other, attr) for attr
in ['dims', 'data', 'shape', 'encoding']):
# `other` satisfies the necessary Variable API for broadcast_variables
new_self, new_other = _broadcast_compat_variables(self, other)
self_data = new_self.data
other_data = new_other.data
dims = new_self.dims
else:
# rely on numpy broadcasting rules
self_data = self.data
other_data = other
dims = self.dims
return self_data, other_data, dims
def concat(variables, dim='concat_dim', positions=None, shortcut=False):
"""Concatenate variables along a new or existing dimension.
Parameters
----------
variables : iterable of Array
Arrays to stack together. Each variable is expected to have
matching dimensions and shape except for along the stacked
dimension.
dim : str or DataArray, optional
Name of the dimension to stack along. This can either be a new
dimension name, in which case it is added along axis=0, or an
existing dimension name, in which case the location of the
dimension is unchanged. Where to insert the new dimension is
determined by the first variable.
positions : None or list of integer arrays, optional
List of integer arrays which specifies the integer positions to which
to assign each dataset along the concatenated dimension. If not
supplied, objects are concatenated in the provided order.
shortcut : bool, optional
This option is used internally to speed-up groupby operations.
If `shortcut` is True, some checks of internal consistency between
arrays to concatenate are skipped.
Returns
-------
stacked : Variable
Concatenated Variable formed by stacking all the supplied variables
along the given dimension.
"""
variables = list(variables)
if all(isinstance(v, IndexVariable) for v in variables):
return IndexVariable.concat(variables, dim, positions, shortcut)
else:
return Variable.concat(variables, dim, positions, shortcut)
def assert_unique_multiindex_level_names(variables):
"""Check for uniqueness of MultiIndex level names in all given
variables.
Not public API. Used for checking consistency of DataArray and Dataset
objects.
"""
level_names = defaultdict(list)
for var_name, var in variables.items():
if isinstance(var._data, PandasIndexAdapter):
idx_level_names = var.to_index_variable().level_names
if idx_level_names is not None:
for n in idx_level_names:
level_names[n].append('%r (%s)' % (n, var_name))
for k, v in level_names.items():
if k in variables:
v.append('(%s)' % k)
duplicate_names = [v for v in level_names.values() if len(v) > 1]
if duplicate_names:
conflict_str = '\n'.join([', '.join(v) for v in duplicate_names])
raise ValueError('conflicting MultiIndex level name(s):\n%s'
% conflict_str)
| apache-2.0 |
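A pure-Python sketch of the dimension-unification rule implemented by `_unified_dims` above. The `FakeVar` class here is a hypothetical stand-in for `xarray.Variable`, carrying only the `dims` and `shape` attributes that the rule actually inspects:

```python
from collections import OrderedDict

class FakeVar:
    # hypothetical stand-in for xarray.Variable: only dims + shape matter here
    def __init__(self, dims, shape):
        self.dims = dims
        self.shape = shape

def unified_dims(variables):
    # dimensions are kept in first-seen order; a dimension that appears in
    # several variables must have the same length everywhere
    all_dims = OrderedDict()
    for var in variables:
        if len(set(var.dims)) < len(var.dims):
            raise ValueError('duplicate dimensions: %r' % list(var.dims))
        for d, s in zip(var.dims, var.shape):
            if d not in all_dims:
                all_dims[d] = s
            elif all_dims[d] != s:
                raise ValueError('mismatched lengths for dimension %r' % d)
    return all_dims

a = FakeVar(('x',), (3,))
b = FakeVar(('y', 'x'), (2, 3))
print(list(unified_dims([a, b])))  # ['x', 'y']
```

Broadcasting then amounts to expanding each variable to this unified dimension ordering, which is what `_broadcast_compat_variables` does via `set_dims`.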
connorcoley/ochem_predict_nn | scripts/characterize_transforms.py | 1 | 3963 | from __future__ import print_function
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import rcParams
rcParams.update({'figure.autolayout': True})
import sys
import os
def get_counts(collection):
'''
Gets the 'count' field for all entries in a MongoDB collection
'''
counts = np.zeros((collection.find().count(), 1)).flatten()
n = len(counts)
for (i, doc) in enumerate(collection.find({}, ['count']).limit(n)):
counts[i] = doc['count'] if 'count' in doc else 0
print('Total counts: {}'.format(sum(counts)))
print('Total number of templates: {}'.format(sum(counts != 0)))
return counts[counts != 0]
def get_counts_templates(collection):
'''
Gets the 'counts' and 'reaction_smarts' field for all entries
in a MongoDB collection
'''
docs = list(range(collection.find({}, ['count', 'reaction_smarts']).count()))  # list so items can be reassigned (Python 3 range is immutable)
for (i, doc) in enumerate(collection.find({}, ['count', 'reaction_smarts']).limit(len(docs))):
docs[i] = (doc['count'] if 'count' in doc else 0, \
doc['reaction_smarts'] if 'reaction_smarts' in doc else '')
return [x for x in docs if x[0] != 0 and x[1] != '']
def probability_v_rank(counts, out = None):
'''
Plots the probability (normalized frequency) versus rank for an
arbitrary 1D vector of counts
'''
counts = np.sort(counts)
probs = counts / np.sum(counts)
ranks = list(range(1, len(probs) + 1))  # list so reverse() works on Python 3
ranks.reverse()
# Probability
fig = plt.figure(figsize=(6,4), dpi = 300)
ax = plt.gca()
ax.scatter(ranks, probs, alpha = 0.5)
ax.set_yscale('log')
ax.set_xscale('log')
ax.axis([1, np.power(10, np.ceil(np.log10(max(ranks)))), \
np.power(10, np.floor(np.log10(min(probs)))), \
np.power(10, np.ceil(np.log10(max(probs))))])
plt.xlabel('Rank')
plt.ylabel('Normalized frequency')
plt.title('Transform templates from {}'.format(collection.name))
plt.grid(True)
if out:
fig.savefig(out + ' prob_rank.png')
np.savetxt(out + ' probs.txt', sorted(probs, reverse = True))
# plt.show()
# Count
fig = plt.figure(figsize=(6,4), dpi = 300)
ax = plt.gca()
ax.scatter(ranks, counts, alpha = 0.5)
ax.set_yscale('log')
ax.set_xscale('log')
ax.axis([1, np.power(10, np.ceil(np.log10(max(ranks)))), \
np.power(10, np.floor(np.log10(min(counts)))), \
np.power(10, np.ceil(np.log10(max(counts))))])
plt.xlabel('Rank')
plt.ylabel('Counts')
plt.title('Transform templates from {}'.format(collection.name))
plt.grid(True)
if out:
fig.savefig(out + ' count_rank.png')
np.savetxt(out + ' counts.txt', sorted(counts, reverse = True))
# plt.show()
# Coverage
missing = np.ones_like(probs)
missing[0] = 0.0
for i in range(1, len(probs)):
missing[i] = missing[i-1] + probs[i]
missing = np.ones_like(missing) - missing
fig = plt.figure(figsize=(6,4), dpi = 300)
ax = plt.gca()
ax.scatter(ranks, missing, alpha = 0.5)
ax.set_xscale('log')
ax.axis([1, np.power(10, np.ceil(np.log10(max(ranks)))), \
0, 1])
plt.xlabel('Rank threshold for inclusion')
plt.ylabel('Estimated minimum coverage')
plt.title('Transform templates from {}'.format(collection.name))
plt.grid(True)
if out:
fig.savefig(out + ' missing_rank.png')
# plt.show()
return
def top_templates(docs, out, n = 10):
'''
Finds the top ten transformation templates
'''
sorted_docs = sorted(docs, key = lambda x: x[0], reverse = True)
with open(out + ' top_{}.txt'.format(n), 'w') as fid:
for i in range(n):
fid.write('{}\t{}\n'.format(sorted_docs[i][0], sorted_docs[i][1]))
return
if __name__ == '__main__':
out_folder = os.path.join(os.path.dirname(__file__), 'output')
if not os.path.isdir(out_folder):
os.mkdir(out_folder)
# DATABASE
from ochem_predict_nn.utils.database import collection_templates
collection = collection_templates()
probability_v_rank(get_counts(collection),
out = os.path.join(out_folder, '{}'.format(collection.name)))
top_templates(get_counts_templates(collection),
os.path.join(out_folder, '{}'.format(collection.name)), n = 500)
| mit |
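The rank/coverage computation in `probability_v_rank` can be isolated from MongoDB and matplotlib; a minimal, simplified pure-Python version of the frequency normalization and top-k coverage estimate (function names here are illustrative, not from the script):

```python
def prob_v_rank(counts):
    # rank 1 is the most frequent template; probabilities are counts
    # normalized by the total number of recorded template applications
    counts = sorted(counts, reverse=True)
    total = float(sum(counts))
    probs = [c / total for c in counts]
    ranks = list(range(1, len(probs) + 1))
    return ranks, probs

def min_coverage(probs, rank_threshold):
    # estimated fraction of reactions covered by the top-k templates
    return sum(probs[:rank_threshold])

ranks, probs = prob_v_rank([1, 1, 2])
print(probs)                   # [0.5, 0.25, 0.25]
print(min_coverage(probs, 2))  # 0.75
```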
timbennett/twitter-tools | get_recent_tweets.py | 1 | 1318 | '''
export user's last 3240 tweets to CSV (full structure)
usage: python get_recent_tweets.py screenname
requires pandas because why reinvent to_csv()?
'''
import tweepy #https://github.com/tweepy/tweepy
import csv
import sys
import json
import pandas as pd
# make sure twitter_auth.py exists with contents:
#
# access_key = ""
# access_secret = ""
# consumer_key = ""
# consumer_secret = ""
#
from twitter_auth import access_key, access_secret, consumer_key, consumer_secret
auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_key, access_secret)
api = tweepy.API(auth)
alltweets = []
screenname=sys.argv[1]
new_tweets = api.user_timeline(screen_name = screenname,count=200)
alltweets.extend(new_tweets)
oldest = alltweets[-1].id - 1
while len(new_tweets) > 0:
print "getting tweets before %s" % (oldest)
#all subsiquent requests use the max_id param to prevent duplicates
new_tweets = api.user_timeline(screen_name = screenname,count=200,max_id=oldest)
#save most recent tweets
alltweets.extend(new_tweets)
#update the id of the oldest tweet less one
oldest = alltweets[-1].id - 1
print "...%s tweets downloaded so far" % (len(alltweets))
json_strings = json.dumps([tweet._json for tweet in alltweets])
df = pd.read_json(json_strings)
df.to_csv('{}.csv'.format(screenname), encoding='utf-8')
| mit |
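The `max_id` pagination loop above can be reduced to a small, testable sketch. The `fetch_page` callable is a hypothetical stand-in for `api.user_timeline`: it returns up to one page of items newest-first, and ids decrease as tweets get older:

```python
def fetch_all(fetch_page):
    # fetch_page(max_id) returns up to one page of items with id <= max_id,
    # newest first; max_id=None means "the most recent page"
    all_items = list(fetch_page(None))
    while all_items:
        oldest = all_items[-1]['id'] - 1
        page = fetch_page(oldest)
        if not page:
            break
        all_items.extend(page)
    return all_items

# fake timeline of ids 10..1, served three at a time
TIMELINE = [{'id': i} for i in range(10, 0, -1)]

def fake_page(max_id, per_page=3):
    items = [t for t in TIMELINE if max_id is None or t['id'] <= max_id]
    return items[:per_page]

print([t['id'] for t in fetch_all(fake_page)])  # [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]
```

Subtracting one from the oldest seen id before the next request is what prevents the boundary tweet from being fetched twice.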
Designist/sympy | sympy/plotting/plot_implicit.py | 83 | 14400 | """Implicit plotting module for SymPy
The module implements a data series called ImplicitSeries which is used by
``Plot`` class to plot implicit plots for different backends. The module,
by default, implements plotting using interval arithmetic. It switches to a
fall back algorithm if the expression cannot be plotted using interval arithmetic.
It is also possible to specify to use the fall back algorithm for all plots.
Boolean combinations of expressions cannot be plotted by the fall back
algorithm.
See Also
========
sympy.plotting.plot
References
==========
- Jeffrey Allen Tupper. Reliable Two-Dimensional Graphing Methods for
Mathematical Formulae with Two Free Variables.
- Jeffrey Allen Tupper. Graphing Equations with Generalized Interval
Arithmetic. Master's thesis. University of Toronto, 1996
"""
from __future__ import print_function, division
from .plot import BaseSeries, Plot
from .experimental_lambdify import experimental_lambdify, vectorized_lambdify
from .intervalmath import interval
from sympy.core.relational import (Equality, GreaterThan, LessThan,
Relational, StrictLessThan, StrictGreaterThan)
from sympy import Eq, Tuple, sympify, Symbol, Dummy
from sympy.external import import_module
from sympy.logic.boolalg import BooleanFunction
from sympy.polys.polyutils import _sort_gens
from sympy.utilities.decorator import doctest_depends_on
from sympy.utilities.iterables import flatten
import warnings
class ImplicitSeries(BaseSeries):
""" Representation for Implicit plot """
is_implicit = True
def __init__(self, expr, var_start_end_x, var_start_end_y,
has_equality, use_interval_math, depth, nb_of_points,
line_color):
super(ImplicitSeries, self).__init__()
self.expr = sympify(expr)
self.var_x = sympify(var_start_end_x[0])
self.start_x = float(var_start_end_x[1])
self.end_x = float(var_start_end_x[2])
self.var_y = sympify(var_start_end_y[0])
self.start_y = float(var_start_end_y[1])
self.end_y = float(var_start_end_y[2])
self.get_points = self.get_raster
self.has_equality = has_equality # If the expression has equality, i.e.
#Eq, Greaterthan, LessThan.
self.nb_of_points = nb_of_points
self.use_interval_math = use_interval_math
self.depth = 4 + depth
self.line_color = line_color
def __str__(self):
return ('Implicit equation: %s for '
'%s over %s and %s over %s') % (
str(self.expr),
str(self.var_x),
str((self.start_x, self.end_x)),
str(self.var_y),
str((self.start_y, self.end_y)))
def get_raster(self):
func = experimental_lambdify((self.var_x, self.var_y), self.expr,
use_interval=True)
xinterval = interval(self.start_x, self.end_x)
yinterval = interval(self.start_y, self.end_y)
try:
temp = func(xinterval, yinterval)
except AttributeError:
if self.use_interval_math:
warnings.warn("Adaptive meshing could not be applied to the"
" expression. Using uniform meshing.")
self.use_interval_math = False
if self.use_interval_math:
return self._get_raster_interval(func)
else:
return self._get_meshes_grid()
def _get_raster_interval(self, func):
""" Uses interval math to adaptively mesh and obtain the plot"""
k = self.depth
interval_list = []
#Create initial 32 divisions
np = import_module('numpy')
xsample = np.linspace(self.start_x, self.end_x, 33)
ysample = np.linspace(self.start_y, self.end_y, 33)
#Add a small jitter so that there are no false positives for equality.
# Ex: y==x becomes True for x interval(1, 2) and y interval(1, 2)
#which will draw a rectangle.
jitterx = (np.random.rand(
len(xsample)) * 2 - 1) * (self.end_x - self.start_x) / 2**20
jittery = (np.random.rand(
len(ysample)) * 2 - 1) * (self.end_y - self.start_y) / 2**20
xsample += jitterx
ysample += jittery
xinter = [interval(x1, x2) for x1, x2 in zip(xsample[:-1],
xsample[1:])]
yinter = [interval(y1, y2) for y1, y2 in zip(ysample[:-1],
ysample[1:])]
interval_list = [[x, y] for x in xinter for y in yinter]
plot_list = []
#recursive call refinepixels which subdivides the intervals which are
#neither True nor False according to the expression.
def refine_pixels(interval_list):
""" Evaluates the intervals and subdivides the interval if the
expression is partially satisfied."""
temp_interval_list = []
plot_list = []
for intervals in interval_list:
#Convert the array indices to x and y values
intervalx = intervals[0]
intervaly = intervals[1]
func_eval = func(intervalx, intervaly)
#The expression is valid in the interval. Change the contour
#array values to 1.
if func_eval[1] is False or func_eval[0] is False:
pass
elif func_eval == (True, True):
plot_list.append([intervalx, intervaly])
elif func_eval[1] is None or func_eval[0] is None:
#Subdivide
avgx = intervalx.mid
avgy = intervaly.mid
a = interval(intervalx.start, avgx)
b = interval(avgx, intervalx.end)
c = interval(intervaly.start, avgy)
d = interval(avgy, intervaly.end)
temp_interval_list.append([a, c])
temp_interval_list.append([a, d])
temp_interval_list.append([b, c])
temp_interval_list.append([b, d])
return temp_interval_list, plot_list
while k >= 0 and len(interval_list):
interval_list, plot_list_temp = refine_pixels(interval_list)
plot_list.extend(plot_list_temp)
k = k - 1
#Check whether the expression represents an equality
#If it represents an equality, then none of the intervals
#would have satisfied the expression due to floating point
#differences. Add all the undecided values to the plot.
if self.has_equality:
for intervals in interval_list:
intervalx = intervals[0]
intervaly = intervals[1]
func_eval = func(intervalx, intervaly)
if func_eval[1] and func_eval[0] is not False:
plot_list.append([intervalx, intervaly])
return plot_list, 'fill'
def _get_meshes_grid(self):
"""Generates the mesh for generating a contour.
In the case of equality, ``contour`` function of matplotlib can
be used. In other cases, matplotlib's ``contourf`` is used.
"""
equal = False
if isinstance(self.expr, Equality):
expr = self.expr.lhs - self.expr.rhs
equal = True
elif isinstance(self.expr, (GreaterThan, StrictGreaterThan)):
expr = self.expr.lhs - self.expr.rhs
elif isinstance(self.expr, (LessThan, StrictLessThan)):
expr = self.expr.rhs - self.expr.lhs
else:
raise NotImplementedError("The expression is not supported for "
"plotting in uniform meshed plot.")
np = import_module('numpy')
xarray = np.linspace(self.start_x, self.end_x, self.nb_of_points)
yarray = np.linspace(self.start_y, self.end_y, self.nb_of_points)
x_grid, y_grid = np.meshgrid(xarray, yarray)
func = vectorized_lambdify((self.var_x, self.var_y), expr)
z_grid = func(x_grid, y_grid)
z_grid[np.ma.where(z_grid < 0)] = -1
z_grid[np.ma.where(z_grid > 0)] = 1
if equal:
return xarray, yarray, z_grid, 'contour'
else:
return xarray, yarray, z_grid, 'contourf'
@doctest_depends_on(modules=('matplotlib',))
def plot_implicit(expr, x_var=None, y_var=None, **kwargs):
"""A plot function to plot implicit equations / inequalities.
Arguments
=========
- ``expr`` : The equation / inequality that is to be plotted.
- ``x_var`` (optional) : symbol to plot on x-axis or tuple giving symbol
and range as ``(symbol, xmin, xmax)``
- ``y_var`` (optional) : symbol to plot on y-axis or tuple giving symbol
and range as ``(symbol, ymin, ymax)``
If neither ``x_var`` nor ``y_var`` are given then the free symbols in the
expression will be assigned in the order they are sorted.
The following keyword arguments can also be used:
- ``adaptive``. Boolean. The default value is set to True. It has to be
set to False if you want to use a mesh grid.
- ``depth`` integer. The depth of recursion for adaptive mesh grid.
Default value is 0. Takes value in the range (0, 4).
- ``points`` integer. The number of points if adaptive mesh grid is not
used. Default value is 200.
- ``title`` string .The title for the plot.
- ``xlabel`` string. The label for the x-axis
- ``ylabel`` string. The label for the y-axis
Aesthetics options:
- ``line_color``: float or string. Specifies the color for the plot.
See ``Plot`` to see how to set color for the plots.
plot_implicit, by default, uses interval arithmetic to plot functions. If
the expression cannot be plotted using interval arithmetic, it defaults to
generating a contour using a mesh grid with a fixed number of points. By
setting adaptive to False, you can force plot_implicit to use the mesh
grid. The mesh grid method can be effective when adaptive plotting using
interval arithmetic fails to plot with small line width.
Examples
========
Plot expressions:
>>> from sympy import plot_implicit, cos, sin, symbols, Eq, And
>>> x, y = symbols('x y')
Without any ranges for the symbols in the expression
>>> p1 = plot_implicit(Eq(x**2 + y**2, 5))
With the range for the symbols
>>> p2 = plot_implicit(Eq(x**2 + y**2, 3),
... (x, -3, 3), (y, -3, 3))
With depth of recursion as argument.
>>> p3 = plot_implicit(Eq(x**2 + y**2, 5),
... (x, -4, 4), (y, -4, 4), depth = 2)
Using mesh grid and not using adaptive meshing.
>>> p4 = plot_implicit(Eq(x**2 + y**2, 5),
... (x, -5, 5), (y, -2, 2), adaptive=False)
Using mesh grid with number of points as input.
>>> p5 = plot_implicit(Eq(x**2 + y**2, 5),
... (x, -5, 5), (y, -2, 2),
... adaptive=False, points=400)
Plotting regions.
>>> p6 = plot_implicit(y > x**2)
Plotting Using boolean conjunctions.
>>> p7 = plot_implicit(And(y > x, y > -x))
When plotting an expression with a single variable (y - 1, for example),
specify the x or the y variable explicitly:
>>> p8 = plot_implicit(y - 1, y_var=y)
>>> p9 = plot_implicit(x - 1, x_var=x)
"""
has_equality = False # Represents whether the expression contains an Equality,
#GreaterThan or LessThan
def arg_expand(bool_expr):
"""
Recursively expands the arguments of an Boolean Function
"""
for arg in bool_expr.args:
if isinstance(arg, BooleanFunction):
arg_expand(arg)
elif isinstance(arg, Relational):
arg_list.append(arg)
arg_list = []
if isinstance(expr, BooleanFunction):
arg_expand(expr)
#Check whether there is an equality in the expression provided.
if any(isinstance(e, (Equality, GreaterThan, LessThan))
for e in arg_list):
has_equality = True
elif not isinstance(expr, Relational):
expr = Eq(expr, 0)
has_equality = True
elif isinstance(expr, (Equality, GreaterThan, LessThan)):
has_equality = True
xyvar = [i for i in (x_var, y_var) if i is not None]
free_symbols = expr.free_symbols
range_symbols = Tuple(*flatten(xyvar)).free_symbols
undeclared = free_symbols - range_symbols
if len(free_symbols & range_symbols) > 2:
raise NotImplementedError("Implicit plotting is not implemented for "
"more than 2 variables")
#Create default ranges if the range is not provided.
default_range = Tuple(-5, 5)
def _range_tuple(s):
if isinstance(s, Symbol):
return Tuple(s) + default_range
if len(s) == 3:
return Tuple(*s)
raise ValueError('symbol or `(symbol, min, max)` expected but got %s' % s)
if len(xyvar) == 0:
xyvar = list(_sort_gens(free_symbols))
var_start_end_x = _range_tuple(xyvar[0])
x = var_start_end_x[0]
if len(xyvar) != 2:
if x in undeclared or not undeclared:
xyvar.append(Dummy('f(%s)' % x.name))
else:
xyvar.append(undeclared.pop())
var_start_end_y = _range_tuple(xyvar[1])
use_interval = kwargs.pop('adaptive', True)
nb_of_points = kwargs.pop('points', 300)
depth = kwargs.pop('depth', 0)
line_color = kwargs.pop('line_color', "blue")
#Check whether the depth is greater than 4 or less than 0.
if depth > 4:
depth = 4
elif depth < 0:
depth = 0
series_argument = ImplicitSeries(expr, var_start_end_x, var_start_end_y,
has_equality, use_interval, depth,
nb_of_points, line_color)
show = kwargs.pop('show', True)
#set the x and y limits
kwargs['xlim'] = tuple(float(x) for x in var_start_end_x[1:])
kwargs['ylim'] = tuple(float(y) for y in var_start_end_y[1:])
# set the x and y labels
kwargs.setdefault('xlabel', var_start_end_x[0].name)
kwargs.setdefault('ylabel', var_start_end_y[0].name)
p = Plot(series_argument, **kwargs)
if show:
p.show()
return p
| bsd-3-clause |
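The refinement step inside `_get_raster_interval` splits every undecided box into four quadrants at its midpoint (the `a`/`b`/`c`/`d` intervals above). The same geometric step, sketched with plain `(start, end)` tuples instead of the module's `interval` class:

```python
def subdivide(box):
    # split a 2-D box ((x0, x1), (y0, y1)) into four quadrants at its midpoint
    (x0, x1), (y0, y1) = box
    mx, my = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    return [((x0, mx), (y0, my)), ((x0, mx), (my, y1)),
            ((mx, x1), (y0, my)), ((mx, x1), (my, y1))]

print(subdivide(((0.0, 1.0), (0.0, 2.0))))
# four quadrants, each half the width and half the height of the input
```

Each refinement level quadruples the number of undecided boxes in the worst case, which is one reason the recursion `depth` is capped at 4.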
xodus7/tensorflow | tensorflow/contrib/timeseries/examples/predict_test.py | 80 | 2487 | # Copyright 2017 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Tests that the TensorFlow parts of the prediction example run."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from os import path
from tensorflow.contrib.timeseries.examples import predict
from tensorflow.python.platform import test
_MODULE_PATH = path.dirname(__file__)
_DATA_FILE = path.join(_MODULE_PATH, "data/period_trend.csv")
class PeriodTrendExampleTest(test.TestCase):
def test_shapes_and_variance_structural(self):
(times, observed, all_times, mean, upper_limit, lower_limit
) = predict.structural_ensemble_train_and_predict(_DATA_FILE)
# Just check that plotting will probably be OK. We can't actually run the
# plotting code since we don't want to pull in matplotlib as a dependency
# for this test.
self.assertAllEqual([500], times.shape)
self.assertAllEqual([500], observed.shape)
self.assertAllEqual([700], all_times.shape)
self.assertAllEqual([700], mean.shape)
self.assertAllEqual([700], upper_limit.shape)
self.assertAllEqual([700], lower_limit.shape)
# Check that variance hasn't blown up too much. This is a relatively good
# indication that training was successful.
self.assertLess(upper_limit[-1] - lower_limit[-1],
1.5 * (upper_limit[0] - lower_limit[0]))
def test_ar(self):
(times, observed, all_times, mean,
upper_limit, lower_limit) = predict.ar_train_and_predict(_DATA_FILE)
self.assertAllEqual(times.shape, observed.shape)
self.assertAllEqual(all_times.shape, mean.shape)
self.assertAllEqual(all_times.shape, upper_limit.shape)
self.assertAllEqual(all_times.shape, lower_limit.shape)
self.assertLess((upper_limit - lower_limit).mean(), 4.)
if __name__ == "__main__":
test.main()
| apache-2.0 |
chendaniely/spring_2016_cs_5854-PathLinker | src/setup_pl2.py | 1 | 20313 | #! /usr/env/python
import collections
import itertools
import random
import pandas as pd
import numpy as np
import networkx as nx
import matplotlib.pyplot as plt
def find_grouped_edges(edge_data):
"""Takes a dataframe of edges
returns a dictionary:
'edges' are tuples of edges and
'reverse_edges' are tuples of edges that are reverse of edges
for each row in the dataframe, we subset the columns that rep the edge
convert the edge into a tuple (so it can be a dict key)
"""
same_edges = []
duplicate_edges = {}
for row in edge_data.iterrows():
edge = tuple(row[1][['#tail', 'head']].tolist())
reverse_edge = tuple(row[1][['head', '#tail']].tolist())
if edge in same_edges:
print('duplicate edge: {}'.format(edge))
elif reverse_edge in same_edges:
print('duplicate edge as reverse: {} -> {}'.format(edge, reverse_edge))
else:
same_edges.append(edge)
if (reverse_edge in same_edges) and \
(reverse_edge not in duplicate_edges.keys()):
print('reverse edge found: {} -> {}'.format(edge, reverse_edge))
duplicate_edges[reverse_edge] = edge
for edge_idx, edge in enumerate(same_edges):
assert edge not in same_edges[edge_idx + 1:]
        assert (edge[1], edge[0]) not in same_edges[edge_idx + 1:]
return {'edges': same_edges, 'reverse_edges':duplicate_edges}
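The dedup-and-pair logic above can be sketched without pandas; plain tuples stand in for the DataFrame rows, and the data below is made up for illustration:

```python
def dedupe_edges(edge_list):
    """Keep each undirected pair once; map kept edges to their reversals."""
    edges, reverse_edges = [], {}
    for edge in edge_list:
        rev = (edge[1], edge[0])
        if edge in edges or rev in edges:
            # duplicate, possibly as a reversal of an already-kept edge
            if rev in edges and rev not in reverse_edges:
                reverse_edges[rev] = edge
            continue
        edges.append(edge)
    return {'edges': edges, 'reverse_edges': reverse_edges}

result = dedupe_edges([('a', 'b'), ('b', 'a'), ('a', 'b'), ('b', 'c')])
# result['edges'] keeps one copy of each pair;
# result['reverse_edges'] pairs ('a', 'b') with its reversal ('b', 'a')
```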
def num_parts_to_len_per_part(total_size, num_parts):
    # chunk size per fold; the +1 rounds up so that, for two folds,
    # the parts differ in length by at most one edge
    return (total_size + 1) // num_parts
def split_into_n_parts(edges, size_per_part):
    # the commented zip-based version only works when size_per_part divides
    # len(edges) evenly, because zip silently drops the ragged tail:
    # return [list(t) for t in zip(*[iter(edges)] * size_per_part)]
    l = [edges[i:(i + size_per_part)] for i in range(0, len(edges), size_per_part)]
    assert l[0] != l[1]
    return l
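Taken together, the two helpers behave like this; a stdlib-only sketch with toy numbers (for two folds, the +1 in the size computation keeps the part count at two):

```python
def chunk(seq, size):
    # consecutive slices; the last chunk absorbs any remainder
    return [seq[i:i + size] for i in range(0, len(seq), size)]

size = (5 + 1) // 2            # mirrors num_parts_to_len_per_part(5, 2) -> 3
parts = chunk(list(range(5)), size)
# two folds differing in length by one element
```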
def sample_edges_for_fold(grouped_edge_dict, num_parts):
"""Returns a 2d list of edges sampled for folds
"""
edges = grouped_edge_dict['edges']
print('length of edges: {}'.format(len(edges)))
for edge_idx, edge in enumerate(edges):
assert edge not in edges[edge_idx + 1:]
        assert (edge[1], edge[0]) not in edges[edge_idx + 1:]
random.shuffle(edges)
size_per_part = num_parts_to_len_per_part(len(edges), num_parts)
print('size per part: {}'.format(size_per_part))
sampled_edges = split_into_n_parts(edges, size_per_part)
print('len of sampled edges: {}'.format(len(sampled_edges)))
    for edge in sampled_edges[0]:
        assert edge not in sampled_edges[1]
        reverse = grouped_edge_dict['reverse_edges'].get(edge)
        if reverse is not None:
            assert reverse == (edge[1], edge[0])
            assert reverse not in sampled_edges[1]
return sampled_edges
def append_reverse_edges(sampled_edges, reverse_edges):
"""sampled_edges is a list where the first level is the folds
the second level contain tuples of the edges
for each edge in the fold, if a reverse edge exists,
the reverse edge is added to the fold
"""
# matched_reverse_edges = []
# for fold in sampled_edges:
# reverse_edge_fold = []
# for edge in reverse_edges:
# if edge in reverse_edges.keys():
# reverse_edge_fold.append(reverse_edges[edge])
# assert reverse_edges[edge] not in reverse_edges
# matched_reverse_edges.append(reverse_edge_fold)
# return matched_reverse_edges
    for edge in sampled_edges[0]:
        assert edge not in sampled_edges[1]
        if edge in reverse_edges:
            assert reverse_edges[edge] not in sampled_edges[1]
print(type(reverse_edges))
print("append reverse edges")
print("len sampled edges: {}".format(len(sampled_edges)))
print('len reverse_edges: {}'.format(len(reverse_edges)))
new_sampled_edges = []
for fold_idx, fold in enumerate(sampled_edges):
new_fold_values = []
# found_reverse_edges = []
for edge in fold:
new_fold_values.append(edge)
if edge in reverse_edges.keys():
print("found edge in reverse_edges {}, {}. fold: {}".\
format(edge, reverse_edges[edge], fold_idx))
assert edge != reverse_edges[edge]
new_fold_values.append(reverse_edges[edge])
new_sampled_edges.append(new_fold_values)
assert len(new_sampled_edges) == 2
print("len new sampled edges: {}".format(len(new_sampled_edges)))
print("len new sampled edges: {}".format(len(new_sampled_edges[0])))
print("len new sampled edges: {}".format(len(new_sampled_edges[1])))
for edge in new_sampled_edges[0]:
assert edge not in new_sampled_edges[1], "duplication error in append_reverse_edges() {}".format(edge)
return new_sampled_edges
def create_fold_data(data, num_parts, base_file_path, seed=None,
edge_from='#tail', edge_to='head'):
"""
For a given dataframe, it will be parsed into num_parts with a filename
based off the base_file_path as a tsv
"""
if seed is not None:
np.random.seed(seed)
random.seed(seed)
print('creating fold data')
edges_only = data[[edge_from, edge_to]]
print(edges_only.shape)
unique_edges = edges_only.drop_duplicates()
print(unique_edges.shape)
print(unique_edges.head())
print(type(unique_edges))
dict_edges_reverse = find_grouped_edges(unique_edges)
# print(dict_edges_reverse['reverse_edges'])
sampled_edges = sample_edges_for_fold(dict_edges_reverse, num_parts)
print("SAMPLED EDGES")
print(sampled_edges)
assert sampled_edges[0] != sampled_edges[1]
for e in sampled_edges:
print(len(e))
    for edge in sampled_edges[0]:
        assert edge not in sampled_edges[1]
        assert (edge[1], edge[0]) not in sampled_edges[1]
        if edge in dict_edges_reverse['reverse_edges']:
            assert dict_edges_reverse['reverse_edges'][edge] not in sampled_edges[1]
edges_for_fold = append_reverse_edges(sampled_edges,
dict_edges_reverse['reverse_edges'])
print("len edges_for_fold: {}".format(len(edges_for_fold)))
for fold in edges_for_fold:
print(len(fold))
# make sure there are no common edges in the folds
for edge in edges_for_fold[0]:
assert edge not in edges_for_fold[1], "Folds share common edge:{}".format(edge)
for edges_per_fold_idx, edges_per_fold in enumerate(edges_for_fold):
print('#'*10)
print(len(edges_per_fold))
filename = "{}_part_{}_of_{}.txt".format(base_file_path,
edges_per_fold_idx + 1,
num_parts)
df_edges = pd.DataFrame(edges_per_fold)
df_fold = pd.merge(data, df_edges,
left_on=['#tail', 'head'],
right_on=[0, 1])
print(df_fold.head())
df_fold['cv_part'] = edges_per_fold_idx
df_fold.to_csv(filename, index=False, sep='\t')
print("Fold data created: {}".format(filename))
# len_per_part = num_parts_to_len_per_part(len(data), num_parts)
# data_shuffled = data.iloc[np.random.permutation(len(data))]
# for idx, i in enumerate(range(0, len(data), len_per_part)):
# filename = "{}_part_{}_of_{}.txt".format(base_file_path,
# idx + 1,
# num_parts)
# df = data_shuffled[i:(i + len_per_part)]
# df['cv_part'] = idx
# df.to_csv(filename, index=False, sep='\t')
# print("Fold data created: {}".format(filename))
# # cv_data.append(df)
return None # cv_data
def generate_parameters():
transformation = collections.namedtuple('transformation_params',
'dataset fixed_pct value')
edges_not_in = (
# transformation('not_in', 'fixed', 0.10),
transformation('not_in', 'pct', 0.25),
# transformation('not_in', 'pct', 0.50),
# transformation('not_in', 'pct', 0.75),
# transformation('not_in', 'pct', 1.00),
# transformation('not_in', 'fixed', 0.0)
)
edges_in = (
# transformation('in', 'pct', 1.50),
# transformation('in', 'pct', 2.00),
# transformation('in', 'pct', 3.00),
# transformation('in', 'pct', 1.00),
transformation('in', 'fixed', 0.9),
# transformation('in', 'fixed', 1.0)
)
return itertools.product(edges_not_in, edges_in)
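The parameter grid this returns is just the cross product of the two namedtuple collections; a small self-contained illustration (the values here are chosen arbitrarily):

```python
import collections
import itertools

transformation = collections.namedtuple('transformation_params',
                                        'dataset fixed_pct value')
edges_not_in = (transformation('not_in', 'pct', 0.25),
                transformation('not_in', 'fixed', 0.10))
edges_in = (transformation('in', 'fixed', 0.9),)

grid = list(itertools.product(edges_not_in, edges_in))
# one (param_not_in, param_in) pair per combination of the two tuples
```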
def calculate_new_weight(original_value, mult_factor, min_value, max_value):
    if pd.isnull(original_value):
        return np.NaN
    # scale the weight, then clamp it into [min_value, max_value]
    new_value = original_value * mult_factor
    if new_value > max_value:
        return max_value
    elif new_value < min_value:
        return min_value
    else:
        return new_value
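A stdlib-only restatement of the scale-then-clamp rule this function implements (the name `clamp_scale` is made up for this sketch; NaN propagation is mirrored with `math.isnan` instead of `pd.isnull`):

```python
import math

def clamp_scale(value, factor, lo, hi):
    """Scale `value` by `factor`, then clamp the result into [lo, hi]."""
    if math.isnan(value):
        return float('nan')       # missing weights propagate as NaN
    return min(max(value * factor, lo), hi)

# scaling past the ceiling caps at hi; scaling below the floor raises to lo
```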
def transform_weights_io(df, param_not_in, param_in,
single_pathway, part_number, part_number_total,
minimum_weight=0, maximum_weight=0.9):
    # not in: rows with no pathway weight
    not_in_mask = pd.isnull(df['weight'])
    if param_not_in.fixed_pct == 'fixed':
        df.loc[not_in_mask, 'edge_weight'] = param_not_in.value
    elif param_not_in.fixed_pct == 'pct':
        df.loc[not_in_mask, 'edge_weight'] = \
            df.loc[not_in_mask, 'edge_weight'].\
            apply(calculate_new_weight,
                  mult_factor=param_not_in.value,
                  min_value=minimum_weight,
                  max_value=maximum_weight)
    # in: rows with a pathway weight
    in_mask = pd.notnull(df['weight'])
    if param_in.fixed_pct == 'fixed':
        df.loc[in_mask, 'edge_weight'] = param_in.value
    elif param_in.fixed_pct == 'pct':
        df.loc[in_mask, 'edge_weight'] = \
            df.loc[in_mask, 'edge_weight'].\
            apply(calculate_new_weight,
                  mult_factor=param_in.value,
                  min_value=minimum_weight,
                  max_value=maximum_weight)
    return df
def transform_weights(all_data, param_not_in, param_in,
# interactome_weight_col_name='edge_weight',
# pathway_weight_col_name='weight',
minimum_weight=0,
maximum_weight=0.9):
"""
If the 'weights' column in the full interactome is `nan`,
then the edge is "not in" the pathway of interest.
Additionally, for each unique value of cv_parts, if the value does not
match a cv_part, it will also be considered 'not in'
When the weights column in the full interactome is not `nan',
it represents an edge that is part of the pathway of interest.
we then use the cv_part vlues to set the `in` value
"""
cv_parts = all_data['cv_part'].unique()
cv_parts = cv_parts[~np.isnan(cv_parts)]
print(cv_parts)
transformed_dfs = []
for cv_part in cv_parts:
df = all_data.copy()
# create a new df based on the cv_parts
# the rows that are not in cv_parts will be the param_not_in
# the rows that are in the cv_parts will be the param_in
# if the weight is null, and if the cv_part is not the current value
# then consider the row as "not in"
print("NOT IN before")
print(df.ix[pd.isnull(df['weight']), [
'edge_weight', 'cv_part']].head())
print(df.ix[pd.notnull(df['weight']), [
'edge_weight', 'cv_part']].head())
# not_in
if param_not_in.fixed_pct == 'fixed':
df.ix[(pd.isnull(df['weight']) | (df['cv_part'] != cv_part)),
'edge_weight'] = param_not_in.value
elif param_not_in.fixed_pct == 'pct':
df.ix[(pd.isnull(df['weight']) | (df['cv_part'] != cv_part)),
'edge_weight'] = df.ix[(pd.isnull(df['weight']) | (df['cv_part'] != cv_part)),
'edge_weight'].\
apply(calculate_new_weight,
mult_factor=param_not_in.value,
min_value=minimum_weight,
max_value=maximum_weight)
else:
raise ValueError
print("NOT IN after")
print(df.ix[pd.isnull(df['weight']), [
'edge_weight', 'cv_part']].head())
print(df.ix[pd.notnull(df['weight']), [
'edge_weight', 'cv_part']].head())
# in
print('IN before')
print(df.ix[pd.notnull(df['weight']), [
'edge_weight', 'cv_part']].head())
# if the weight is not null and if
if param_in.fixed_pct == 'fixed':
df.ix[(pd.notnull(df['weight'])) & (df['cv_part'] == cv_part),
'edge_weight'] = param_in.value
elif param_in.fixed_pct == 'pct':
df.ix[(pd.notnull(df['weight'])) & (df['cv_part'] == cv_part),
'edge_weight'] = df.ix[(pd.notnull(df['weight'])) & (df['cv_part'] == cv_part),
'edge_weight'].\
apply(calculate_new_weight,
mult_factor=param_in.value,
min_value=minimum_weight,
max_value=maximum_weight)
else:
raise ValueError
print('IN after')
print(df.ix[pd.notnull(df['weight']), 'edge_weight'].head())
transformed_dfs.append(df)
return transformed_dfs
def main():
pathways = ["BDNF", "EGFR1",
"IL1",
"IL2",
"IL3",
"IL6",
"IL-7",
"KitReceptor",
"Leptin",
"Prolactin",
"RANKL",
"TCR",
"TGF_beta_Receptor",
"TNFalpha",
"Wnt"]
# pathways = ["IL3", "IL6"]
interactome = pd.read_csv(
'../data/pathlinker-signaling-children-reg-weighted.txt',
delimiter='\t')
k_folds = 2
# for each pathway
for single_pathway_idx, single_pathway in enumerate(pathways):
print("*"*80)
print("Generating data for pathway: {}".format(single_pathway))
pathway = pd.read_csv('../data/pathways/{}-edges.txt'.
format(single_pathway),
delimiter='\t')
edge_list = pathway[['#tail', 'head']]
        g = nx.from_pandas_edgelist(edge_list, '#tail', 'head',
                                    create_using=nx.DiGraph())
print('nx EDGES')
print(g.edges())
fold1 = []
fold2 = []
for edge in g.edges():
print((edge[0], edge[1]))
if (edge[1], edge[0]) in fold1 or (edge[0], edge[1]) in fold1:
fold1.append(edge)
print('fold 1 dup')
continue
if (edge[1], edge[0]) in fold2 or (edge[0], edge[1]) in fold2:
fold2.append(edge)
print('fold 2 dup')
continue
if len(fold1) <= len(fold2):
fold1.append(edge)
else:
fold2.append(edge)
assert set(fold1) != set(fold2)
        for edge in fold1:
            assert edge not in fold2 and (edge[1], edge[0]) not in fold2
        for edge in fold2:
            assert edge not in fold1 and (edge[1], edge[0]) not in fold1
print('~'*100)
print(sorted(fold1))
print('*'*80)
print(sorted(fold2))
        pathway_nodes = open('../data/pathways/{}-nodes.txt'.format(single_pathway))
f1h = open('../output/nodes/{}_fold1_holdout_nodes.txt'.format(single_pathway), 'w')
f2h = open('../output/nodes/{}_fold2_holdout_nodes.txt'.format(single_pathway), 'w')
f1h.write('#node\tnode_symbol\tnode_type\n')
f2h.write('#node\tnode_symbol\tnode_type\n')
nodes = []
fold1_nodes = []
fold2_nodes = []
fold1_holdout_nodes = []
fold2_holdout_nodes = []
for edge in g.edges():
if edge[0] not in nodes:
nodes.append(edge[0])
if edge[1] not in nodes:
nodes.append(edge[1])
for edge in fold1:
fold1_nodes.append(edge[0])
fold1_nodes.append(edge[1])
for edge in fold2:
fold2_nodes.append(edge[0])
fold2_nodes.append(edge[1])
for node in nodes:
if node not in fold1_nodes:
fold1_holdout_nodes.append(node)
if node not in fold2_nodes:
fold2_holdout_nodes.append(node)
print(len(nodes), len(fold1_holdout_nodes), len(fold2_holdout_nodes))
for line in pathway_nodes:
row = line.split('\t')
if row[0] in fold1_holdout_nodes:
f1h.write(line)
if row[0] in fold2_holdout_nodes:
f2h.write(line)
f1h.close()
f2h.close()
# assert False
# cv_data = create_fold_data(data=pathway, num_parts=k_folds,
# base_file_path='../output/{}'.
# format(single_pathway),
# seed=42)
cv_data = [fold1, fold2]
for fold_idx, fold in enumerate(cv_data):
part_number = fold_idx + 1
filename = '../output/{}_part_{}_of_{}.txt'.format(
single_pathway, part_number, k_folds)
print(filename)
df_edges = pd.DataFrame(fold)
df_fold = pd.merge(pathway, df_edges,
left_on=['#tail', 'head'],
right_on=[0, 1])
print(df_edges.head())
print(df_fold.head())
df_fold['cv_part'] = fold_idx
df_fold.to_csv(filename, index=False, sep='\t')
print(df_fold.head())
# for edges_per_fold_idx, edges_per_fold in enumerate(edges_for_fold):
# print('#'*10)
# print(len(edges_per_fold))
# filename = "{}_part_{}_of_{}.txt".format(
# edges_per_fold_idx + 1,
# num_parts)
# df_edges = pd.DataFrame(edges_per_fold)
# df_fold = pd.merge(data, df_edges,
# print(df_fold.head())
# df_fold['cv_part'] = edges_per_fold_idx
# df_fold.to_csv(filename, index=False, sep='\t')
# print("Fold data created: {}".format(filename))
        parameters = list(generate_parameters())
        print("Parameters: {}".format(parameters))
# for each parameter set
for param_idx, param in enumerate(parameters):
param_not_in, param_in = param
print(param_not_in)
print(param_in)
# for each fold
for fold_num in range(k_folds):
fold_num_name = fold_num + 1
data_file_for_k = pd.read_csv('../output/{}_part_{}_of_{}.txt'.
format(single_pathway,
fold_num_name,
k_folds),
delimiter='\t')
print(data_file_for_k.head())
fold_df = pd.merge(interactome, data_file_for_k,
on=['#tail', 'head'],
how='left')
print(fold_df['cv_part'].unique())
transformed_df = transform_weights_io(fold_df,
param_not_in, param_in,
single_pathway,
fold_num_name, k_folds)
filename = "../output/{}_data_{}_ni-{}-{}_i-{}-{}.txt".format(
single_pathway,
fold_num_name,
param_not_in.fixed_pct, param_not_in.value,
param_in.fixed_pct, param_in.value)
transformed_df.to_csv(filename, index=False, sep='\t')
if __name__ == '__main__':
main()
# license: gpl-3.0
# ---- JanNash/sms-tools :: lectures/04-STFT/plots-code/windows.py ----
import matplotlib.pyplot as plt
import numpy as np
import time, os, sys
sys.path.append(os.path.join(os.path.dirname(os.path.realpath(__file__)), '../../../software/models/'))
import dftModel as DF
import utilFunctions as UF
from scipy.fftpack import fft, ifft
import math
(fs, x) = UF.wavread('../../../sounds/oboe-A4.wav')
N = 512
pin = 5000
w = np.ones(501)
hM1 = int(math.floor((w.size+1)/2))
hM2 = int(math.floor(w.size/2))
x1 = x[pin-hM1:pin+hM2]
plt.figure(1, figsize=(9.5, 7))
plt.subplot(4,1,1)
plt.plot(np.arange(-hM1, hM2), x1, lw=1.5)
plt.axis([-hM1, hM2, min(x1), max(x1)])
plt.title('x (oboe-A4.wav)')
mX, pX = DF.dftAnal(x1, w, N)
mX = mX - max(mX)
plt.subplot(4,1,2)
plt.plot(np.arange(mX.size), mX, 'r', lw=1.5)
plt.axis([0,N/4,-70,0])
plt.title ('mX (rectangular window)')
w = np.hamming(501)
mX, pX = DF.dftAnal(x1, w, N)
mX = mX - max(mX)
plt.subplot(4,1,3)
plt.plot(np.arange(mX.size), mX, 'r', lw=1.5)
plt.axis([0,N/4,-70,0])
plt.title ('mX (hamming window)')
w = np.blackman(501)
mX, pX = DF.dftAnal(x1, w, N)
mX = mX - max(mX)
plt.subplot(4,1,4)
plt.plot(np.arange(mX.size), mX, 'r', lw=1.5)
plt.axis([0,N/4,-70,0])
plt.title ('mX (blackman window)')
plt.tight_layout()
plt.savefig('windows.png')
plt.show()
# license: agpl-3.0
# ---- gfyoung/scipy :: scipy/ndimage/filters.py ----
# Copyright (C) 2003-2005 Peter J. Verveer
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions
# are met:
#
# 1. Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following
# disclaimer in the documentation and/or other materials provided
# with the distribution.
#
# 3. The name of the author may not be used to endorse or promote
# products derived from this software without specific prior
# written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS
# OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY
# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE
# GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
# WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
from __future__ import division, print_function, absolute_import
import warnings
import math
import numpy
from . import _ni_support
from . import _nd_image
from . import _ni_docstrings
from scipy.misc import doccer
from scipy._lib._version import NumpyVersion
__all__ = ['correlate1d', 'convolve1d', 'gaussian_filter1d', 'gaussian_filter',
'prewitt', 'sobel', 'generic_laplace', 'laplace',
'gaussian_laplace', 'generic_gradient_magnitude',
'gaussian_gradient_magnitude', 'correlate', 'convolve',
'uniform_filter1d', 'uniform_filter', 'minimum_filter1d',
'maximum_filter1d', 'minimum_filter', 'maximum_filter',
'rank_filter', 'median_filter', 'percentile_filter',
'generic_filter1d', 'generic_filter']
def _invalid_origin(origin, lenw):
return (origin < -(lenw // 2)) or (origin > (lenw - 1) // 2)
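The bound check can be read as "the shifted origin must stay inside the kernel footprint"; a quick standalone probe of which origins pass for small kernels:

```python
def invalid_origin(origin, lenw):
    # same bound check as the internal helper above: the shifted kernel
    # centre must stay within the weights footprint of length `lenw`
    return (origin < -(lenw // 2)) or (origin > (lenw - 1) // 2)

valid_for_3 = [o for o in range(-3, 4) if not invalid_origin(o, 3)]
valid_for_4 = [o for o in range(-3, 4) if not invalid_origin(o, 4)]
# an even-length kernel admits one extra negative origin
```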
@_ni_docstrings.docfiller
def correlate1d(input, weights, axis=-1, output=None, mode="reflect",
cval=0.0, origin=0):
"""Calculate a one-dimensional correlation along the given axis.
The lines of the array along the given axis are correlated with the
given weights.
Parameters
----------
%(input)s
weights : array
One-dimensional sequence of numbers.
%(axis)s
%(output)s
%(mode)s
%(cval)s
%(origin)s
Examples
--------
>>> from scipy.ndimage import correlate1d
>>> correlate1d([2, 8, 0, 4, 1, 9, 9, 0], weights=[1, 3])
array([ 8, 26, 8, 12, 7, 28, 36, 9])
"""
input = numpy.asarray(input)
if numpy.iscomplexobj(input):
raise TypeError('Complex type not supported')
output = _ni_support._get_output(output, input)
weights = numpy.asarray(weights, dtype=numpy.float64)
if weights.ndim != 1 or weights.shape[0] < 1:
raise RuntimeError('no filter weights given')
if not weights.flags.contiguous:
weights = weights.copy()
axis = _ni_support._check_axis(axis, input.ndim)
if _invalid_origin(origin, len(weights)):
raise ValueError('Invalid origin; origin must satisfy '
'-(len(weights) // 2) <= origin <= '
'(len(weights)-1) // 2')
mode = _ni_support._extend_mode_to_code(mode)
_nd_image.correlate1d(input, weights, axis, output, mode, cval,
origin)
return output
@_ni_docstrings.docfiller
def convolve1d(input, weights, axis=-1, output=None, mode="reflect",
cval=0.0, origin=0):
"""Calculate a one-dimensional convolution along the given axis.
The lines of the array along the given axis are convolved with the
given weights.
Parameters
----------
%(input)s
weights : ndarray
One-dimensional sequence of numbers.
%(axis)s
%(output)s
%(mode)s
%(cval)s
%(origin)s
Returns
-------
convolve1d : ndarray
Convolved array with same shape as input
Examples
--------
>>> from scipy.ndimage import convolve1d
>>> convolve1d([2, 8, 0, 4, 1, 9, 9, 0], weights=[1, 3])
array([14, 24, 4, 13, 12, 36, 27, 0])
"""
weights = weights[::-1]
origin = -origin
if not len(weights) & 1:
origin -= 1
return correlate1d(input, weights, axis, output, mode, cval, origin)
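The flip-the-kernel trick used here (convolution equals correlation with reversed weights) can be modelled in plain Python. This toy version supports only odd-length kernels and the 'reflect' boundary, so no origin bookkeeping is needed; it is a sketch, not the scipy implementation:

```python
def correlate1d_reflect(x, w):
    # centered 1-D correlation with 'reflect' boundaries; odd-length kernel
    r = len(w) // 2
    n = len(x)

    def at(i):
        if i < 0:
            i = -i - 1                # reflect: (d c b a | a b c d)
        if i >= n:
            i = 2 * n - i - 1
        return x[i]

    return [sum(wj * at(i + j - r) for j, wj in enumerate(w))
            for i in range(n)]

x = [2, 8, 0, 4, 1, 9, 9, 0]
smoothed = correlate1d_reflect(x, [1, 3, 1])
# convolving with [1, 2, 3] is correlating with the flipped kernel:
flipped = correlate1d_reflect(x, [1, 2, 3][::-1])
```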
def _gaussian_kernel1d(sigma, order, radius):
"""
Computes a 1D Gaussian convolution kernel.
"""
if order < 0:
raise ValueError('order must be non-negative')
p = numpy.polynomial.Polynomial([0, 0, -0.5 / (sigma * sigma)])
x = numpy.arange(-radius, radius + 1)
phi_x = numpy.exp(p(x), dtype=numpy.double)
phi_x /= phi_x.sum()
if order > 0:
q = numpy.polynomial.Polynomial([1])
p_deriv = p.deriv()
for _ in range(order):
# f(x) = q(x) * phi(x) = q(x) * exp(p(x))
# f'(x) = (q'(x) + q(x) * p'(x)) * phi(x)
q = q.deriv() + q * p_deriv
phi_x *= q(x)
return phi_x
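For order 0 the kernel above reduces to normalised Gaussian samples; a dependency-free sketch (radius fixed at 4, matching `truncate=4.0` with `sigma=1`):

```python
import math

def gaussian_kernel1d(sigma, radius):
    # order-0 case only: sample exp(-x^2 / (2 sigma^2)) and normalise
    xs = range(-radius, radius + 1)
    phi = [math.exp(-0.5 * (x / sigma) ** 2) for x in xs]
    total = sum(phi)
    return [p / total for p in phi]

kernel = gaussian_kernel1d(1.0, 4)
# the kernel sums to 1, is symmetric, and peaks at the centre tap
```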
@_ni_docstrings.docfiller
def gaussian_filter1d(input, sigma, axis=-1, order=0, output=None,
mode="reflect", cval=0.0, truncate=4.0):
"""One-dimensional Gaussian filter.
Parameters
----------
%(input)s
sigma : scalar
standard deviation for Gaussian kernel
%(axis)s
order : int, optional
An order of 0 corresponds to convolution with a Gaussian
kernel. A positive order corresponds to convolution with
that derivative of a Gaussian.
%(output)s
%(mode)s
%(cval)s
truncate : float, optional
Truncate the filter at this many standard deviations.
Default is 4.0.
Returns
-------
gaussian_filter1d : ndarray
Examples
--------
>>> from scipy.ndimage import gaussian_filter1d
>>> gaussian_filter1d([1.0, 2.0, 3.0, 4.0, 5.0], 1)
array([ 1.42704095, 2.06782203, 3. , 3.93217797, 4.57295905])
>>> gaussian_filter1d([1.0, 2.0, 3.0, 4.0, 5.0], 4)
array([ 2.91948343, 2.95023502, 3. , 3.04976498, 3.08051657])
>>> import matplotlib.pyplot as plt
>>> np.random.seed(280490)
>>> x = np.random.randn(101).cumsum()
>>> y3 = gaussian_filter1d(x, 3)
>>> y6 = gaussian_filter1d(x, 6)
>>> plt.plot(x, 'k', label='original data')
>>> plt.plot(y3, '--', label='filtered, sigma=3')
>>> plt.plot(y6, ':', label='filtered, sigma=6')
>>> plt.legend()
>>> plt.grid()
>>> plt.show()
"""
sd = float(sigma)
# make the radius of the filter equal to truncate standard deviations
lw = int(truncate * sd + 0.5)
# Since we are calling correlate, not convolve, revert the kernel
weights = _gaussian_kernel1d(sigma, order, lw)[::-1]
return correlate1d(input, weights, axis, output, mode, cval, 0)
@_ni_docstrings.docfiller
def gaussian_filter(input, sigma, order=0, output=None,
mode="reflect", cval=0.0, truncate=4.0):
"""Multidimensional Gaussian filter.
Parameters
----------
%(input)s
sigma : scalar or sequence of scalars
Standard deviation for Gaussian kernel. The standard
deviations of the Gaussian filter are given for each axis as a
sequence, or as a single number, in which case it is equal for
all axes.
order : int or sequence of ints, optional
The order of the filter along each axis is given as a sequence
of integers, or as a single number. An order of 0 corresponds
to convolution with a Gaussian kernel. A positive order
corresponds to convolution with that derivative of a Gaussian.
%(output)s
%(mode_multiple)s
%(cval)s
truncate : float
Truncate the filter at this many standard deviations.
Default is 4.0.
Returns
-------
gaussian_filter : ndarray
Returned array of same shape as `input`.
Notes
-----
The multidimensional filter is implemented as a sequence of
one-dimensional convolution filters. The intermediate arrays are
stored in the same data type as the output. Therefore, for output
types with a limited precision, the results may be imprecise
because intermediate results may be stored with insufficient
precision.
Examples
--------
>>> from scipy.ndimage import gaussian_filter
>>> a = np.arange(50, step=2).reshape((5,5))
>>> a
array([[ 0, 2, 4, 6, 8],
[10, 12, 14, 16, 18],
[20, 22, 24, 26, 28],
[30, 32, 34, 36, 38],
[40, 42, 44, 46, 48]])
>>> gaussian_filter(a, sigma=1)
array([[ 4, 6, 8, 9, 11],
[10, 12, 14, 15, 17],
[20, 22, 24, 25, 27],
[29, 31, 33, 34, 36],
[35, 37, 39, 40, 42]])
>>> from scipy import misc
>>> import matplotlib.pyplot as plt
>>> fig = plt.figure()
>>> plt.gray() # show the filtered result in grayscale
>>> ax1 = fig.add_subplot(121) # left side
>>> ax2 = fig.add_subplot(122) # right side
>>> ascent = misc.ascent()
>>> result = gaussian_filter(ascent, sigma=5)
>>> ax1.imshow(ascent)
>>> ax2.imshow(result)
>>> plt.show()
"""
input = numpy.asarray(input)
output = _ni_support._get_output(output, input)
orders = _ni_support._normalize_sequence(order, input.ndim)
sigmas = _ni_support._normalize_sequence(sigma, input.ndim)
modes = _ni_support._normalize_sequence(mode, input.ndim)
axes = list(range(input.ndim))
axes = [(axes[ii], sigmas[ii], orders[ii], modes[ii])
for ii in range(len(axes)) if sigmas[ii] > 1e-15]
if len(axes) > 0:
for axis, sigma, order, mode in axes:
gaussian_filter1d(input, sigma, axis, order, output,
mode, cval, truncate)
input = output
else:
output[...] = input[...]
return output
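The Notes section above describes the N-D filter as a sequence of 1-D passes; that separability can be checked on a toy 4x4 array without scipy (zero padding instead of scipy's 'reflect' mode, and a hand-picked 3-tap kernel):

```python
def smooth1d(seq, kernel):
    # zero-padded, centered 1-D correlation with an odd-length kernel
    r = len(kernel) // 2
    n = len(seq)
    return [sum(w * (seq[i + j - r] if 0 <= i + j - r < n else 0.0)
                for j, w in enumerate(kernel))
            for i in range(n)]

kernel = [0.25, 0.5, 0.25]                    # separable blur kernel
img = [[float(r * 4 + c) for c in range(4)] for r in range(4)]

# pass 1: filter each row; pass 2: filter each column (via transposes)
rows = [smooth1d(row, kernel) for row in img]
cols = [smooth1d(list(col), kernel) for col in zip(*rows)]
separable = [list(row) for row in zip(*cols)]

# direct 2-D filtering with the outer-product kernel gives the same result
k2d = [[a * b for b in kernel] for a in kernel]
direct = [[sum(k2d[di][dj] * (img[r + di - 1][c + dj - 1]
                              if 0 <= r + di - 1 < 4 and 0 <= c + dj - 1 < 4
                              else 0.0)
               for di in range(3) for dj in range(3))
           for c in range(4)]
          for r in range(4)]
```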
@_ni_docstrings.docfiller
def prewitt(input, axis=-1, output=None, mode="reflect", cval=0.0):
"""Calculate a Prewitt filter.
Parameters
----------
%(input)s
%(axis)s
%(output)s
%(mode_multiple)s
%(cval)s
Examples
--------
>>> from scipy import ndimage, misc
>>> import matplotlib.pyplot as plt
>>> fig = plt.figure()
>>> plt.gray() # show the filtered result in grayscale
>>> ax1 = fig.add_subplot(121) # left side
>>> ax2 = fig.add_subplot(122) # right side
>>> ascent = misc.ascent()
>>> result = ndimage.prewitt(ascent)
>>> ax1.imshow(ascent)
>>> ax2.imshow(result)
>>> plt.show()
"""
input = numpy.asarray(input)
axis = _ni_support._check_axis(axis, input.ndim)
output = _ni_support._get_output(output, input)
modes = _ni_support._normalize_sequence(mode, input.ndim)
correlate1d(input, [-1, 0, 1], axis, output, modes[axis], cval, 0)
axes = [ii for ii in range(input.ndim) if ii != axis]
for ii in axes:
correlate1d(output, [1, 1, 1], ii, output, modes[ii], cval, 0,)
return output
@_ni_docstrings.docfiller
def sobel(input, axis=-1, output=None, mode="reflect", cval=0.0):
"""Calculate a Sobel filter.
Parameters
----------
%(input)s
%(axis)s
%(output)s
%(mode_multiple)s
%(cval)s
Examples
--------
>>> from scipy import ndimage, misc
>>> import matplotlib.pyplot as plt
>>> fig = plt.figure()
>>> plt.gray() # show the filtered result in grayscale
>>> ax1 = fig.add_subplot(121) # left side
>>> ax2 = fig.add_subplot(122) # right side
>>> ascent = misc.ascent()
>>> result = ndimage.sobel(ascent)
>>> ax1.imshow(ascent)
>>> ax2.imshow(result)
>>> plt.show()
"""
input = numpy.asarray(input)
axis = _ni_support._check_axis(axis, input.ndim)
output = _ni_support._get_output(output, input)
modes = _ni_support._normalize_sequence(mode, input.ndim)
correlate1d(input, [-1, 0, 1], axis, output, modes[axis], cval, 0)
axes = [ii for ii in range(input.ndim) if ii != axis]
for ii in axes:
correlate1d(output, [1, 2, 1], ii, output, modes[ii], cval, 0)
return output
@_ni_docstrings.docfiller
def generic_laplace(input, derivative2, output=None, mode="reflect",
cval=0.0,
extra_arguments=(),
extra_keywords=None):
"""
N-dimensional Laplace filter using a provided second derivative function.
Parameters
----------
%(input)s
derivative2 : callable
Callable with the following signature::
derivative2(input, axis, output, mode, cval,
*extra_arguments, **extra_keywords)
See `extra_arguments`, `extra_keywords` below.
%(output)s
%(mode_multiple)s
%(cval)s
%(extra_keywords)s
%(extra_arguments)s
"""
if extra_keywords is None:
extra_keywords = {}
input = numpy.asarray(input)
output = _ni_support._get_output(output, input)
axes = list(range(input.ndim))
if len(axes) > 0:
modes = _ni_support._normalize_sequence(mode, len(axes))
derivative2(input, axes[0], output, modes[0], cval,
*extra_arguments, **extra_keywords)
for ii in range(1, len(axes)):
tmp = derivative2(input, axes[ii], output.dtype, modes[ii], cval,
*extra_arguments, **extra_keywords)
output += tmp
else:
output[...] = input[...]
return output
@_ni_docstrings.docfiller
def laplace(input, output=None, mode="reflect", cval=0.0):
"""N-dimensional Laplace filter based on approximate second derivatives.
Parameters
----------
%(input)s
%(output)s
%(mode_multiple)s
%(cval)s
Examples
--------
>>> from scipy import ndimage, misc
>>> import matplotlib.pyplot as plt
>>> fig = plt.figure()
>>> plt.gray() # show the filtered result in grayscale
>>> ax1 = fig.add_subplot(121) # left side
>>> ax2 = fig.add_subplot(122) # right side
>>> ascent = misc.ascent()
>>> result = ndimage.laplace(ascent)
>>> ax1.imshow(ascent)
>>> ax2.imshow(result)
>>> plt.show()
"""
def derivative2(input, axis, output, mode, cval):
return correlate1d(input, [1, -2, 1], axis, output, mode, cval, 0)
return generic_laplace(input, derivative2, output, mode, cval)
@_ni_docstrings.docfiller
def gaussian_laplace(input, sigma, output=None, mode="reflect",
cval=0.0, **kwargs):
"""Multidimensional Laplace filter using gaussian second derivatives.
Parameters
----------
%(input)s
sigma : scalar or sequence of scalars
The standard deviations of the Gaussian filter are given for
each axis as a sequence, or as a single number, in which case
it is equal for all axes.
%(output)s
%(mode_multiple)s
%(cval)s
Extra keyword arguments will be passed to gaussian_filter().
Examples
--------
>>> from scipy import ndimage, misc
>>> import matplotlib.pyplot as plt
>>> ascent = misc.ascent()
>>> fig = plt.figure()
>>> plt.gray() # show the filtered result in grayscale
>>> ax1 = fig.add_subplot(121) # left side
>>> ax2 = fig.add_subplot(122) # right side
>>> result = ndimage.gaussian_laplace(ascent, sigma=1)
>>> ax1.imshow(result)
>>> result = ndimage.gaussian_laplace(ascent, sigma=3)
>>> ax2.imshow(result)
>>> plt.show()
"""
input = numpy.asarray(input)
def derivative2(input, axis, output, mode, cval, sigma, **kwargs):
order = [0] * input.ndim
order[axis] = 2
return gaussian_filter(input, sigma, order, output, mode, cval,
**kwargs)
return generic_laplace(input, derivative2, output, mode, cval,
extra_arguments=(sigma,),
extra_keywords=kwargs)
@_ni_docstrings.docfiller
def generic_gradient_magnitude(input, derivative, output=None,
mode="reflect", cval=0.0,
extra_arguments=(), extra_keywords=None):
"""Gradient magnitude using a provided gradient function.
Parameters
----------
%(input)s
derivative : callable
Callable with the following signature::
derivative(input, axis, output, mode, cval,
*extra_arguments, **extra_keywords)
See `extra_arguments`, `extra_keywords` below.
`derivative` can assume that `input` and `output` are ndarrays.
Note that the output from `derivative` is modified inplace;
be careful to copy important inputs before returning them.
%(output)s
%(mode_multiple)s
%(cval)s
%(extra_keywords)s
%(extra_arguments)s
"""
if extra_keywords is None:
extra_keywords = {}
input = numpy.asarray(input)
output = _ni_support._get_output(output, input)
axes = list(range(input.ndim))
if len(axes) > 0:
modes = _ni_support._normalize_sequence(mode, len(axes))
derivative(input, axes[0], output, modes[0], cval,
*extra_arguments, **extra_keywords)
numpy.multiply(output, output, output)
for ii in range(1, len(axes)):
tmp = derivative(input, axes[ii], output.dtype, modes[ii], cval,
*extra_arguments, **extra_keywords)
numpy.multiply(tmp, tmp, tmp)
output += tmp
# This allows the sqrt to work with a different default casting
numpy.sqrt(output, output, casting='unsafe')
else:
output[...] = input[...]
return output
@_ni_docstrings.docfiller
def gaussian_gradient_magnitude(input, sigma, output=None,
mode="reflect", cval=0.0, **kwargs):
"""Multidimensional gradient magnitude using Gaussian derivatives.
Parameters
----------
%(input)s
sigma : scalar or sequence of scalars
The standard deviations of the Gaussian filter are given for
each axis as a sequence, or as a single number, in which case
it is equal for all axes.
%(output)s
%(mode_multiple)s
%(cval)s
Extra keyword arguments will be passed to gaussian_filter().
Returns
-------
gaussian_gradient_magnitude : ndarray
Filtered array. Has the same shape as `input`.
Examples
--------
>>> from scipy import ndimage, misc
>>> import matplotlib.pyplot as plt
>>> fig = plt.figure()
>>> plt.gray() # show the filtered result in grayscale
>>> ax1 = fig.add_subplot(121) # left side
>>> ax2 = fig.add_subplot(122) # right side
>>> ascent = misc.ascent()
>>> result = ndimage.gaussian_gradient_magnitude(ascent, sigma=5)
>>> ax1.imshow(ascent)
>>> ax2.imshow(result)
>>> plt.show()
"""
input = numpy.asarray(input)
def derivative(input, axis, output, mode, cval, sigma, **kwargs):
order = [0] * input.ndim
order[axis] = 1
return gaussian_filter(input, sigma, order, output, mode,
cval, **kwargs)
return generic_gradient_magnitude(input, derivative, output, mode,
cval, extra_arguments=(sigma,),
extra_keywords=kwargs)
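As a sanity check (not from the library source), the Gaussian gradient magnitude can be reproduced from first-order `gaussian_filter` derivatives along each axis:

```python
import numpy as np
from scipy import ndimage

a = np.random.RandomState(0).rand(8, 8)
# First-order Gaussian derivative along each axis, then the root of the
# sum of squares -- the same computation gaussian_gradient_magnitude does.
gx = ndimage.gaussian_filter(a, sigma=2, order=(1, 0))
gy = ndimage.gaussian_filter(a, sigma=2, order=(0, 1))
manual = np.sqrt(gx ** 2 + gy ** 2)
assert np.allclose(ndimage.gaussian_gradient_magnitude(a, sigma=2), manual)
```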
def _correlate_or_convolve(input, weights, output, mode, cval, origin,
convolution):
input = numpy.asarray(input)
if numpy.iscomplexobj(input):
raise TypeError('Complex type not supported')
origins = _ni_support._normalize_sequence(origin, input.ndim)
weights = numpy.asarray(weights, dtype=numpy.float64)
wshape = [ii for ii in weights.shape if ii > 0]
if len(wshape) != input.ndim:
raise RuntimeError('filter weights array has incorrect shape.')
if convolution:
weights = weights[tuple([slice(None, None, -1)] * weights.ndim)]
for ii in range(len(origins)):
origins[ii] = -origins[ii]
if not weights.shape[ii] & 1:
origins[ii] -= 1
for origin, lenw in zip(origins, wshape):
if _invalid_origin(origin, lenw):
raise ValueError('Invalid origin; origin must satisfy '
'-(weights.shape[k] // 2) <= origin[k] <= '
'(weights.shape[k]-1) // 2')
if not weights.flags.contiguous:
weights = weights.copy()
output = _ni_support._get_output(output, input)
mode = _ni_support._extend_mode_to_code(mode)
_nd_image.correlate(input, weights, output, mode, cval, origins)
return output
@_ni_docstrings.docfiller
def correlate(input, weights, output=None, mode='reflect', cval=0.0,
origin=0):
"""
Multi-dimensional correlation.
The array is correlated with the given kernel.
Parameters
----------
%(input)s
weights : ndarray
array of weights, same number of dimensions as input
%(output)s
%(mode_multiple)s
%(cval)s
%(origin_multiple)s
See Also
--------
convolve : Convolve an image with a kernel.
"""
return _correlate_or_convolve(input, weights, output, mode, cval,
origin, False)
@_ni_docstrings.docfiller
def convolve(input, weights, output=None, mode='reflect', cval=0.0,
origin=0):
"""
Multidimensional convolution.
The array is convolved with the given kernel.
Parameters
----------
%(input)s
weights : array_like
Array of weights, same number of dimensions as input
%(output)s
%(mode_multiple)s
cval : scalar, optional
Value to fill past edges of input if `mode` is 'constant'. Default
is 0.0.
%(origin_multiple)s
Returns
-------
result : ndarray
The result of convolution of `input` with `weights`.
See Also
--------
correlate : Correlate an image with a kernel.
Notes
-----
Each value in result is :math:`C_i = \\sum_j{I_{i+k-j} W_j}`, where
W is the `weights` kernel,
j is the n-D spatial index over :math:`W`,
I is the `input` and k is the coordinate of the center of
W, specified by `origin` in the input parameters.
Examples
--------
Perhaps the simplest case to understand is ``mode='constant', cval=0.0``,
because in this case borders (i.e. where the `weights` kernel, centered
on any one value, extends beyond an edge of `input`) are treated as zeros.
>>> a = np.array([[1, 2, 0, 0],
... [5, 3, 0, 4],
... [0, 0, 0, 7],
... [9, 3, 0, 0]])
>>> k = np.array([[1,1,1],[1,1,0],[1,0,0]])
>>> from scipy import ndimage
>>> ndimage.convolve(a, k, mode='constant', cval=0.0)
array([[11, 10, 7, 4],
[10, 3, 11, 11],
[15, 12, 14, 7],
[12, 3, 7, 0]])
Setting ``cval=1.0`` is equivalent to padding the outer edge of `input`
with 1.0's (and then extracting only the original region of the result).
>>> ndimage.convolve(a, k, mode='constant', cval=1.0)
array([[13, 11, 8, 7],
[11, 3, 11, 14],
[16, 12, 14, 10],
[15, 6, 10, 5]])
With ``mode='reflect'`` (the default), outer values are reflected at the
edge of `input` to fill in missing values.
>>> b = np.array([[2, 0, 0],
... [1, 0, 0],
... [0, 0, 0]])
>>> k = np.array([[0,1,0], [0,1,0], [0,1,0]])
>>> ndimage.convolve(b, k, mode='reflect')
array([[5, 0, 0],
[3, 0, 0],
[1, 0, 0]])
This includes diagonally at the corners.
>>> k = np.array([[1,0,0],[0,1,0],[0,0,1]])
>>> ndimage.convolve(b, k)
array([[4, 2, 0],
[3, 2, 0],
[1, 1, 0]])
With ``mode='nearest'``, the single nearest value at the edge of
`input` is repeated as many times as needed to match the overlapping
`weights`.
>>> c = np.array([[2, 0, 1],
... [1, 0, 0],
... [0, 0, 0]])
>>> k = np.array([[0, 1, 0],
... [0, 1, 0],
... [0, 1, 0],
... [0, 1, 0],
... [0, 1, 0]])
>>> ndimage.convolve(c, k, mode='nearest')
array([[7, 0, 3],
[5, 0, 2],
[3, 0, 1]])
"""
return _correlate_or_convolve(input, weights, output, mode, cval,
origin, True)
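A small check of the relationship implemented in `_correlate_or_convolve` (an illustration, not library code): for an odd-sized kernel, convolution is exactly correlation with the kernel flipped along every axis; even sizes additionally shift the origin.

```python
import numpy as np
from scipy import ndimage

a = np.array([[1., 2., 0., 0.],
              [5., 3., 0., 4.],
              [0., 0., 0., 7.],
              [9., 3., 0., 0.]])
k = np.array([[1., 1., 1.],
              [1., 1., 0.],
              [1., 0., 0.]])
# Odd-sized kernel: convolution == correlation with the flipped kernel.
assert np.allclose(ndimage.convolve(a, k, mode='constant'),
                   ndimage.correlate(a, k[::-1, ::-1], mode='constant'))
```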
@_ni_docstrings.docfiller
def uniform_filter1d(input, size, axis=-1, output=None,
mode="reflect", cval=0.0, origin=0):
"""Calculate a one-dimensional uniform filter along the given axis.
The lines of the array along the given axis are filtered with a
uniform filter of given size.
Parameters
----------
%(input)s
size : int
length of uniform filter
%(axis)s
%(output)s
%(mode)s
%(cval)s
%(origin)s
Examples
--------
>>> from scipy.ndimage import uniform_filter1d
>>> uniform_filter1d([2, 8, 0, 4, 1, 9, 9, 0], size=3)
array([4, 3, 4, 1, 4, 6, 6, 3])
"""
input = numpy.asarray(input)
if numpy.iscomplexobj(input):
raise TypeError('Complex type not supported')
axis = _ni_support._check_axis(axis, input.ndim)
if size < 1:
raise RuntimeError('incorrect filter size')
output = _ni_support._get_output(output, input)
if (size // 2 + origin < 0) or (size // 2 + origin >= size):
raise ValueError('invalid origin')
mode = _ni_support._extend_mode_to_code(mode)
_nd_image.uniform_filter1d(input, size, axis, output, mode, cval,
origin)
return output
@_ni_docstrings.docfiller
def uniform_filter(input, size=3, output=None, mode="reflect",
cval=0.0, origin=0):
"""Multi-dimensional uniform filter.
Parameters
----------
%(input)s
size : int or sequence of ints, optional
The sizes of the uniform filter are given for each axis as a
sequence, or as a single number, in which case the size is
equal for all axes.
%(output)s
%(mode_multiple)s
%(cval)s
%(origin_multiple)s
Returns
-------
uniform_filter : ndarray
Filtered array. Has the same shape as `input`.
Notes
-----
The multi-dimensional filter is implemented as a sequence of
one-dimensional uniform filters. The intermediate arrays are stored
in the same data type as the output. Therefore, for output types
with a limited precision, the results may be imprecise because
intermediate results may be stored with insufficient precision.
Examples
--------
>>> from scipy import ndimage, misc
>>> import matplotlib.pyplot as plt
>>> fig = plt.figure()
>>> plt.gray() # show the filtered result in grayscale
>>> ax1 = fig.add_subplot(121) # left side
>>> ax2 = fig.add_subplot(122) # right side
>>> ascent = misc.ascent()
>>> result = ndimage.uniform_filter(ascent, size=20)
>>> ax1.imshow(ascent)
>>> ax2.imshow(result)
>>> plt.show()
"""
input = numpy.asarray(input)
output = _ni_support._get_output(output, input)
sizes = _ni_support._normalize_sequence(size, input.ndim)
origins = _ni_support._normalize_sequence(origin, input.ndim)
modes = _ni_support._normalize_sequence(mode, input.ndim)
axes = list(range(input.ndim))
axes = [(axes[ii], sizes[ii], origins[ii], modes[ii])
for ii in range(len(axes)) if sizes[ii] > 1]
if len(axes) > 0:
for axis, size, origin, mode in axes:
uniform_filter1d(input, int(size), axis, output, mode,
cval, origin)
input = output
else:
output[...] = input[...]
return output
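The separability described in the Notes can be verified directly (a demonstration, not library code); with floating-point intermediates there is no precision loss from the sequential passes:

```python
import numpy as np
from scipy import ndimage

a = np.random.RandomState(0).rand(6, 6)
# Apply the 1-D uniform filter along each axis in turn; this is exactly
# what the N-D uniform_filter does internally.
step = ndimage.uniform_filter1d(a, size=3, axis=0)
sep = ndimage.uniform_filter1d(step, size=3, axis=1)
assert np.allclose(sep, ndimage.uniform_filter(a, size=3))
```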
@_ni_docstrings.docfiller
def minimum_filter1d(input, size, axis=-1, output=None,
mode="reflect", cval=0.0, origin=0):
"""Calculate a one-dimensional minimum filter along the given axis.
The lines of the array along the given axis are filtered with a
minimum filter of given size.
Parameters
----------
%(input)s
size : int
length along which to calculate 1D minimum
%(axis)s
%(output)s
%(mode)s
%(cval)s
%(origin)s
Notes
-----
This function implements the MINLIST algorithm [1]_, as described by
Richard Harter [2]_, and has a guaranteed O(n) performance, `n` being
the `input` length, regardless of filter size.
References
----------
.. [1] http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.42.2777
.. [2] http://www.richardhartersworld.com/cri/2001/slidingmin.html
Examples
--------
>>> from scipy.ndimage import minimum_filter1d
>>> minimum_filter1d([2, 8, 0, 4, 1, 9, 9, 0], size=3)
array([2, 0, 0, 0, 1, 1, 0, 0])
"""
input = numpy.asarray(input)
if numpy.iscomplexobj(input):
raise TypeError('Complex type not supported')
axis = _ni_support._check_axis(axis, input.ndim)
if size < 1:
raise RuntimeError('incorrect filter size')
output = _ni_support._get_output(output, input)
if (size // 2 + origin < 0) or (size // 2 + origin >= size):
raise ValueError('invalid origin')
mode = _ni_support._extend_mode_to_code(mode)
_nd_image.min_or_max_filter1d(input, size, axis, output, mode, cval,
origin, 1)
return output
@_ni_docstrings.docfiller
def maximum_filter1d(input, size, axis=-1, output=None,
mode="reflect", cval=0.0, origin=0):
"""Calculate a one-dimensional maximum filter along the given axis.
The lines of the array along the given axis are filtered with a
maximum filter of given size.
Parameters
----------
%(input)s
size : int
Length along which to calculate the 1-D maximum.
%(axis)s
%(output)s
%(mode)s
%(cval)s
%(origin)s
Returns
-------
maximum1d : ndarray, None
Maximum-filtered array with same shape as input.
None if `output` is not None
Notes
-----
This function implements the MAXLIST algorithm [1]_, as described by
Richard Harter [2]_, and has a guaranteed O(n) performance, `n` being
the `input` length, regardless of filter size.
References
----------
.. [1] http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.42.2777
.. [2] http://www.richardhartersworld.com/cri/2001/slidingmin.html
Examples
--------
>>> from scipy.ndimage import maximum_filter1d
>>> maximum_filter1d([2, 8, 0, 4, 1, 9, 9, 0], size=3)
array([8, 8, 8, 4, 9, 9, 9, 9])
"""
input = numpy.asarray(input)
if numpy.iscomplexobj(input):
raise TypeError('Complex type not supported')
axis = _ni_support._check_axis(axis, input.ndim)
if size < 1:
raise RuntimeError('incorrect filter size')
output = _ni_support._get_output(output, input)
if (size // 2 + origin < 0) or (size // 2 + origin >= size):
raise ValueError('invalid origin')
mode = _ni_support._extend_mode_to_code(mode)
_nd_image.min_or_max_filter1d(input, size, axis, output, mode, cval,
origin, 0)
return output
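Minimum and maximum filters are duals of one another, which gives a handy consistency check (an illustration, not library code):

```python
import numpy as np
from scipy.ndimage import minimum_filter1d, maximum_filter1d

x = np.array([2, 8, 0, 4, 1, 9, 9, 0])
# The max filter is the min filter of the negated signal, negated back.
assert np.array_equal(maximum_filter1d(x, 3),
                      -minimum_filter1d(-x, 3))
```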
def _min_or_max_filter(input, size, footprint, structure, output, mode,
cval, origin, minimum):
if (size is not None) and (footprint is not None):
warnings.warn("ignoring size because footprint is set", UserWarning, stacklevel=3)
if structure is None:
if footprint is None:
if size is None:
raise RuntimeError("no footprint provided")
separable = True
else:
footprint = numpy.asarray(footprint, dtype=bool)
if not footprint.any():
raise ValueError("All-zero footprint is not supported.")
if footprint.all():
size = footprint.shape
footprint = None
separable = True
else:
separable = False
else:
structure = numpy.asarray(structure, dtype=numpy.float64)
separable = False
if footprint is None:
footprint = numpy.ones(structure.shape, bool)
else:
footprint = numpy.asarray(footprint, dtype=bool)
input = numpy.asarray(input)
if numpy.iscomplexobj(input):
raise TypeError('Complex type not supported')
output = _ni_support._get_output(output, input)
origins = _ni_support._normalize_sequence(origin, input.ndim)
if separable:
sizes = _ni_support._normalize_sequence(size, input.ndim)
modes = _ni_support._normalize_sequence(mode, input.ndim)
axes = list(range(input.ndim))
axes = [(axes[ii], sizes[ii], origins[ii], modes[ii])
for ii in range(len(axes)) if sizes[ii] > 1]
if minimum:
filter_ = minimum_filter1d
else:
filter_ = maximum_filter1d
if len(axes) > 0:
for axis, size, origin, mode in axes:
filter_(input, int(size), axis, output, mode, cval, origin)
input = output
else:
output[...] = input[...]
else:
fshape = [ii for ii in footprint.shape if ii > 0]
if len(fshape) != input.ndim:
raise RuntimeError('footprint array has incorrect shape.')
for origin, lenf in zip(origins, fshape):
if (lenf // 2 + origin < 0) or (lenf // 2 + origin >= lenf):
raise ValueError('invalid origin')
if not footprint.flags.contiguous:
footprint = footprint.copy()
if structure is not None:
if len(structure.shape) != input.ndim:
raise RuntimeError('structure array has incorrect shape')
if not structure.flags.contiguous:
structure = structure.copy()
mode = _ni_support._extend_mode_to_code(mode)
_nd_image.min_or_max_filter(input, footprint, structure, output,
mode, cval, origins, minimum)
return output
@_ni_docstrings.docfiller
def minimum_filter(input, size=None, footprint=None, output=None,
mode="reflect", cval=0.0, origin=0):
"""Calculate a multi-dimensional minimum filter.
Parameters
----------
%(input)s
%(size_foot)s
%(output)s
%(mode_multiple)s
%(cval)s
%(origin_multiple)s
Returns
-------
minimum_filter : ndarray
Filtered array. Has the same shape as `input`.
Examples
--------
>>> from scipy import ndimage, misc
>>> import matplotlib.pyplot as plt
>>> fig = plt.figure()
>>> plt.gray() # show the filtered result in grayscale
>>> ax1 = fig.add_subplot(121) # left side
>>> ax2 = fig.add_subplot(122) # right side
>>> ascent = misc.ascent()
>>> result = ndimage.minimum_filter(ascent, size=20)
>>> ax1.imshow(ascent)
>>> ax2.imshow(result)
>>> plt.show()
"""
return _min_or_max_filter(input, size, footprint, None, output, mode,
cval, origin, 1)
@_ni_docstrings.docfiller
def maximum_filter(input, size=None, footprint=None, output=None,
mode="reflect", cval=0.0, origin=0):
"""Calculate a multi-dimensional maximum filter.
Parameters
----------
%(input)s
%(size_foot)s
%(output)s
%(mode_multiple)s
%(cval)s
%(origin_multiple)s
Returns
-------
maximum_filter : ndarray
Filtered array. Has the same shape as `input`.
Examples
--------
>>> from scipy import ndimage, misc
>>> import matplotlib.pyplot as plt
>>> fig = plt.figure()
>>> plt.gray() # show the filtered result in grayscale
>>> ax1 = fig.add_subplot(121) # left side
>>> ax2 = fig.add_subplot(122) # right side
>>> ascent = misc.ascent()
>>> result = ndimage.maximum_filter(ascent, size=20)
>>> ax1.imshow(ascent)
>>> ax2.imshow(result)
>>> plt.show()
"""
return _min_or_max_filter(input, size, footprint, None, output, mode,
cval, origin, 0)
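The footprint handling in `_min_or_max_filter` can be seen from outside (a sketch, not library code): an all-True footprint is detected and routed to the fast separable path, so it matches the plain `size=` form exactly, while an arbitrary footprint restricts which neighbours are considered.

```python
import numpy as np
from scipy import ndimage

a = np.arange(16.).reshape(4, 4)
# Full (all-True) footprint reduces to the separable size= case.
full = ndimage.maximum_filter(a, footprint=np.ones((3, 3), bool))
assert np.array_equal(full, ndimage.maximum_filter(a, size=3))
# A plus-shaped footprint: only the four neighbours and the centre.
plus = np.array([[0, 1, 0],
                 [1, 1, 1],
                 [0, 1, 0]], dtype=bool)
plus_max = ndimage.maximum_filter(a, footprint=plus)
```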
def _rank_filter(input, rank, size=None, footprint=None, output=None,
mode="reflect", cval=0.0, origin=0, operation='rank'):
if (size is not None) and (footprint is not None):
warnings.warn("ignoring size because footprint is set", UserWarning, stacklevel=3)
input = numpy.asarray(input)
if numpy.iscomplexobj(input):
raise TypeError('Complex type not supported')
origins = _ni_support._normalize_sequence(origin, input.ndim)
if footprint is None:
if size is None:
raise RuntimeError("no footprint or filter size provided")
sizes = _ni_support._normalize_sequence(size, input.ndim)
footprint = numpy.ones(sizes, dtype=bool)
else:
footprint = numpy.asarray(footprint, dtype=bool)
fshape = [ii for ii in footprint.shape if ii > 0]
if len(fshape) != input.ndim:
raise RuntimeError('filter footprint array has incorrect shape.')
for origin, lenf in zip(origins, fshape):
if (lenf // 2 + origin < 0) or (lenf // 2 + origin >= lenf):
raise ValueError('invalid origin')
if not footprint.flags.contiguous:
footprint = footprint.copy()
filter_size = numpy.where(footprint, 1, 0).sum()
if operation == 'median':
rank = filter_size // 2
elif operation == 'percentile':
percentile = rank
if percentile < 0.0:
percentile += 100.0
if percentile < 0 or percentile > 100:
raise RuntimeError('invalid percentile')
if percentile == 100.0:
rank = filter_size - 1
else:
rank = int(float(filter_size) * percentile / 100.0)
if rank < 0:
rank += filter_size
if rank < 0 or rank >= filter_size:
raise RuntimeError('rank not within filter footprint size')
if rank == 0:
return minimum_filter(input, None, footprint, output, mode, cval,
origins)
elif rank == filter_size - 1:
return maximum_filter(input, None, footprint, output, mode, cval,
origins)
else:
output = _ni_support._get_output(output, input)
mode = _ni_support._extend_mode_to_code(mode)
_nd_image.rank_filter(input, rank, footprint, output, mode, cval,
origins)
return output
@_ni_docstrings.docfiller
def rank_filter(input, rank, size=None, footprint=None, output=None,
mode="reflect", cval=0.0, origin=0):
"""Calculate a multi-dimensional rank filter.
Parameters
----------
%(input)s
rank : int
The rank parameter may be less than zero, i.e., rank = -1
indicates the largest element.
%(size_foot)s
%(output)s
%(mode_multiple)s
%(cval)s
%(origin_multiple)s
Returns
-------
rank_filter : ndarray
Filtered array. Has the same shape as `input`.
Examples
--------
>>> from scipy import ndimage, misc
>>> import matplotlib.pyplot as plt
>>> fig = plt.figure()
>>> plt.gray() # show the filtered result in grayscale
>>> ax1 = fig.add_subplot(121) # left side
>>> ax2 = fig.add_subplot(122) # right side
>>> ascent = misc.ascent()
>>> result = ndimage.rank_filter(ascent, rank=42, size=20)
>>> ax1.imshow(ascent)
>>> ax2.imshow(result)
>>> plt.show()
"""
return _rank_filter(input, rank, size, footprint, output, mode, cval,
origin, 'rank')
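The rank endpoints dispatch to the dedicated min/max filters, as `_rank_filter` above shows; this is easy to confirm (an illustration, not library code):

```python
import numpy as np
from scipy import ndimage

a = np.random.RandomState(0).randint(0, 50, (5, 5))
# Rank 0 selects the minimum of each footprint, rank -1 the maximum.
assert np.array_equal(ndimage.rank_filter(a, 0, size=3),
                      ndimage.minimum_filter(a, size=3))
assert np.array_equal(ndimage.rank_filter(a, -1, size=3),
                      ndimage.maximum_filter(a, size=3))
```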
@_ni_docstrings.docfiller
def median_filter(input, size=None, footprint=None, output=None,
mode="reflect", cval=0.0, origin=0):
"""
Calculate a multidimensional median filter.
Parameters
----------
%(input)s
%(size_foot)s
%(output)s
%(mode_multiple)s
%(cval)s
%(origin_multiple)s
Returns
-------
median_filter : ndarray
Filtered array. Has the same shape as `input`.
Examples
--------
>>> from scipy import ndimage, misc
>>> import matplotlib.pyplot as plt
>>> fig = plt.figure()
>>> plt.gray() # show the filtered result in grayscale
>>> ax1 = fig.add_subplot(121) # left side
>>> ax2 = fig.add_subplot(122) # right side
>>> ascent = misc.ascent()
>>> result = ndimage.median_filter(ascent, size=20)
>>> ax1.imshow(ascent)
>>> ax2.imshow(result)
>>> plt.show()
"""
return _rank_filter(input, 0, size, footprint, output, mode, cval,
origin, 'median')
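As `_rank_filter` shows, the median is just the middle rank of the footprint (`filter_size // 2`); a quick check of that equivalence (an illustration, not library code):

```python
import numpy as np
from scipy import ndimage

a = np.random.RandomState(1).randint(0, 50, (5, 5))
# A 3x3 median is the rank-4 element of the 9 values in each window.
assert np.array_equal(ndimage.median_filter(a, size=3),
                      ndimage.rank_filter(a, 9 // 2, size=3))
```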
@_ni_docstrings.docfiller
def percentile_filter(input, percentile, size=None, footprint=None,
output=None, mode="reflect", cval=0.0, origin=0):
"""Calculate a multi-dimensional percentile filter.
Parameters
----------
%(input)s
percentile : scalar
The percentile parameter may be less than zero, i.e.,
percentile = -20 equals percentile = 80.
%(size_foot)s
%(output)s
%(mode_multiple)s
%(cval)s
%(origin_multiple)s
Returns
-------
percentile_filter : ndarray
Filtered array. Has the same shape as `input`.
Examples
--------
>>> from scipy import ndimage, misc
>>> import matplotlib.pyplot as plt
>>> fig = plt.figure()
>>> plt.gray() # show the filtered result in grayscale
>>> ax1 = fig.add_subplot(121) # left side
>>> ax2 = fig.add_subplot(122) # right side
>>> ascent = misc.ascent()
>>> result = ndimage.percentile_filter(ascent, percentile=20, size=20)
>>> ax1.imshow(ascent)
>>> ax2.imshow(result)
>>> plt.show()
"""
return _rank_filter(input, percentile, size, footprint, output, mode,
cval, origin, 'percentile')
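The negative-percentile convention described in the docstring follows from the `percentile += 100.0` branch in `_rank_filter`; a quick check (an illustration, not library code):

```python
import numpy as np
from scipy import ndimage

a = np.random.RandomState(2).randint(0, 50, (5, 5))
# Negative percentiles count down from 100: -20 is the 80th percentile.
assert np.array_equal(ndimage.percentile_filter(a, -20, size=3),
                      ndimage.percentile_filter(a, 80, size=3))
```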
@_ni_docstrings.docfiller
def generic_filter1d(input, function, filter_size, axis=-1,
output=None, mode="reflect", cval=0.0, origin=0,
extra_arguments=(), extra_keywords=None):
"""Calculate a one-dimensional filter along the given axis.
`generic_filter1d` iterates over the lines of the array, calling the
given function at each line. The arguments of the line are the
input line, and the output line. The input and output lines are 1D
double arrays. The input line is extended appropriately according
to the filter size and origin. The output line must be modified
in-place with the result.
Parameters
----------
%(input)s
function : {callable, scipy.LowLevelCallable}
Function to apply along given axis.
filter_size : scalar
Length of the filter.
%(axis)s
%(output)s
%(mode)s
%(cval)s
%(origin)s
%(extra_arguments)s
%(extra_keywords)s
Notes
-----
This function also accepts low-level callback functions with one of
the following signatures and wrapped in `scipy.LowLevelCallable`:
.. code:: c
int function(double *input_line, npy_intp input_length,
double *output_line, npy_intp output_length,
void *user_data)
int function(double *input_line, intptr_t input_length,
double *output_line, intptr_t output_length,
void *user_data)
The calling function iterates over the lines of the input and output
arrays, calling the callback function at each line. The current line
is extended according to the border conditions set by the calling
function, and the result is copied into the array that is passed
through ``input_line``. The length of the input line (after extension)
is passed through ``input_length``. The callback function should apply
the filter and store the result in the array passed through
``output_line``. The length of the output line is passed through
``output_length``. ``user_data`` is the data pointer provided
to `scipy.LowLevelCallable` as-is.
The callback function must return an integer error status that is zero
if something went wrong and one otherwise. If an error occurs, you should
normally set the Python error status with an informative message
before returning, otherwise a default error message is set by the
calling function.
In addition, some other low-level function pointer specifications
are accepted, but these are for backward compatibility only and should
not be used in new code.
"""
if extra_keywords is None:
extra_keywords = {}
input = numpy.asarray(input)
if numpy.iscomplexobj(input):
raise TypeError('Complex type not supported')
output = _ni_support._get_output(output, input)
if filter_size < 1:
raise RuntimeError('invalid filter size')
axis = _ni_support._check_axis(axis, input.ndim)
if (filter_size // 2 + origin < 0) or (filter_size // 2 + origin >=
filter_size):
raise ValueError('invalid origin')
mode = _ni_support._extend_mode_to_code(mode)
_nd_image.generic_filter1d(input, function, filter_size, axis, output,
mode, cval, origin, extra_arguments,
extra_keywords)
return output
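A pure-Python callable works too (a sketch, not library code). The callable receives the extended input line and must fill the output line in place; for `filter_size=3` with `origin=0` the line is padded by one element on each side, so a windowed sum over three shifted slices reproduces the 1-D uniform filter:

```python
import numpy as np
from scipy import ndimage

def mean3(in_line, out_line):
    # in_line is the boundary-extended line (one pad element each side
    # for filter_size=3); average each window of three in place.
    out_line[...] = (in_line[:-2] + in_line[1:-1] + in_line[2:]) / 3.0

x = np.arange(8.)
res = ndimage.generic_filter1d(x, mean3, filter_size=3)
assert np.allclose(res, ndimage.uniform_filter1d(x, size=3))
```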
@_ni_docstrings.docfiller
def generic_filter(input, function, size=None, footprint=None,
output=None, mode="reflect", cval=0.0, origin=0,
extra_arguments=(), extra_keywords=None):
"""Calculate a multi-dimensional filter using the given function.
At each element the provided function is called. The input values
within the filter footprint at that element are passed to the function
as a 1D array of double values.
Parameters
----------
%(input)s
function : {callable, scipy.LowLevelCallable}
Function to apply at each element.
%(size_foot)s
%(output)s
%(mode_multiple)s
%(cval)s
%(origin_multiple)s
%(extra_arguments)s
%(extra_keywords)s
Notes
-----
This function also accepts low-level callback functions with one of
the following signatures and wrapped in `scipy.LowLevelCallable`:
.. code:: c
int callback(double *buffer, npy_intp filter_size,
double *return_value, void *user_data)
int callback(double *buffer, intptr_t filter_size,
double *return_value, void *user_data)
The calling function iterates over the elements of the input and
output arrays, calling the callback function at each element. The
elements within the footprint of the filter at the current element are
passed through the ``buffer`` parameter, and the number of elements
within the footprint through ``filter_size``. The calculated value is
returned in ``return_value``. ``user_data`` is the data pointer provided
to `scipy.LowLevelCallable` as-is.
The callback function must return an integer error status that is zero
if something went wrong and one otherwise. If an error occurs, you should
normally set the Python error status with an informative message
before returning, otherwise a default error message is set by the
calling function.
In addition, some other low-level function pointer specifications
are accepted, but these are for backward compatibility only and should
not be used in new code.
"""
if (size is not None) and (footprint is not None):
warnings.warn("ignoring size because footprint is set", UserWarning, stacklevel=2)
if extra_keywords is None:
extra_keywords = {}
input = numpy.asarray(input)
if numpy.iscomplexobj(input):
raise TypeError('Complex type not supported')
origins = _ni_support._normalize_sequence(origin, input.ndim)
if footprint is None:
if size is None:
raise RuntimeError("no footprint or filter size provided")
sizes = _ni_support._normalize_sequence(size, input.ndim)
footprint = numpy.ones(sizes, dtype=bool)
else:
footprint = numpy.asarray(footprint, dtype=bool)
fshape = [ii for ii in footprint.shape if ii > 0]
if len(fshape) != input.ndim:
raise RuntimeError('filter footprint array has incorrect shape.')
for origin, lenf in zip(origins, fshape):
if (lenf // 2 + origin < 0) or (lenf // 2 + origin >= lenf):
raise ValueError('invalid origin')
if not footprint.flags.contiguous:
footprint = footprint.copy()
output = _ni_support._get_output(output, input)
mode = _ni_support._extend_mode_to_code(mode)
_nd_image.generic_filter(input, function, footprint, output, mode,
cval, origins, extra_arguments, extra_keywords)
return output
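Any callable that maps a flat 1-D array of footprint values to a scalar can be used (an illustration, not library code); `np.ptp` gives a peak-to-peak filter, which must equal the difference of the max and min filters:

```python
import numpy as np
from scipy import ndimage

a = np.random.RandomState(0).rand(5, 5)
# The callable receives each 3x3 neighbourhood as a flat 1-D array.
res = ndimage.generic_filter(a, np.ptp, size=3)
assert np.allclose(res, ndimage.maximum_filter(a, size=3)
                        - ndimage.minimum_filter(a, size=3))
```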
| bsd-3-clause |
CIFASIS/pylearn2 | pylearn2/models/tests/test_s3c_inference.py | 44 | 14386 | from __future__ import print_function
from pylearn2.models.s3c import S3C
from pylearn2.models.s3c import E_Step_Scan
from pylearn2.models.s3c import Grad_M_Step
from pylearn2.models.s3c import E_Step
from pylearn2.utils import contains_nan
from theano import function
import numpy as np
from theano.compat.six.moves import xrange
import theano.tensor as T
from theano import config
#from pylearn2.utils import serial
def broadcast(mat, shape_0):
rval = mat
if mat.shape[0] != shape_0:
assert mat.shape[0] == 1
rval = np.zeros((shape_0, mat.shape[1]),dtype=mat.dtype)
for i in xrange(shape_0):
rval[i,:] = mat[0,:]
return rval
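The `broadcast` helper above just tiles a one-row matrix to `shape_0` rows with an explicit loop; a NumPy-native equivalent (shown only for comparison, not used by the tests) is:

```python
import numpy as np

mat = np.array([[1., 2., 3.]])
# np.repeat materialises the tiled copy; np.broadcast_to creates a
# read-only broadcast view with the same values.
tiled = np.repeat(mat, 4, axis=0)
assert tiled.shape == (4, 3)
assert np.array_equal(tiled, np.broadcast_to(mat, (4, 3)))
```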
class Test_S3C_Inference:
def setUp(self):
# Temporarily change config.floatX to float64, as s3c inference
# tests currently fail due to numerical issues for float32.
self.prev_floatX = config.floatX
config.floatX = 'float64'
def tearDown(self):
# Restore previous value of floatX
config.floatX = self.prev_floatX
def __init__(self):
""" gets a small batch of data
sets up an S3C model
"""
# We also have to change the value of config.floatX in __init__.
self.prev_floatX = config.floatX
config.floatX = 'float64'
try:
self.tol = 1e-5
#dataset = serial.load('${PYLEARN2_DATA_PATH}/stl10/stl10_patches/data.pkl')
#X = dataset.get_batch_design(1000)
#X = X[:,0:5]
X = np.random.RandomState([1,2,3]).randn(1000,5)
X -= X.mean()
X /= X.std()
m, D = X.shape
N = 5
#don't give the model an e_step or learning rate so it won't spend years compiling a learn_func
self.model = S3C(nvis = D,
nhid = N,
irange = .1,
init_bias_hid = 0.,
init_B = 3.,
min_B = 1e-8,
max_B = 1000.,
init_alpha = 1., min_alpha = 1e-8, max_alpha = 1000.,
init_mu = 1., e_step = None,
m_step = Grad_M_Step(),
min_bias_hid = -1e30, max_bias_hid = 1e30,
)
self.model.make_pseudoparams()
self.h_new_coeff_schedule = [.1, .2, .3, .4, .5, .6, .7, .8, .9, 1. ]
self.e_step = E_Step_Scan(h_new_coeff_schedule = self.h_new_coeff_schedule)
self.e_step.register_model(self.model)
self.X = X
self.N = N
self.m = m
finally:
config.floatX = self.prev_floatX
def test_match_unrolled(self):
""" tests that inference with scan matches result using unrolled loops """
unrolled_e_step = E_Step(h_new_coeff_schedule = self.h_new_coeff_schedule)
unrolled_e_step.register_model(self.model)
V = T.matrix()
scan_result = self.e_step.infer(V)
unrolled_result = unrolled_e_step.infer(V)
outputs = []
for key in scan_result:
outputs.append(scan_result[key])
outputs.append(unrolled_result[key])
f = function([V], outputs)
outputs = f(self.X)
assert len(outputs) % 2 == 0
for i in xrange(0,len(outputs),2):
assert np.allclose(outputs[i],outputs[i+1])
def test_grad_s(self):
"tests that the gradients with respect to s_i are 0 after doing a mean field update of s_i "
model = self.model
e_step = self.e_step
X = self.X
assert X.shape[0] == self.m
model.test_batch_size = X.shape[0]
init_H = e_step.init_H_hat(V = X)
init_Mu1 = e_step.init_S_hat(V = X)
prev_setting = config.compute_test_value
config.compute_test_value= 'off'
H, Mu1 = function([], outputs=[init_H, init_Mu1])()
config.compute_test_value = prev_setting
H = broadcast(H, self.m)
Mu1 = broadcast(Mu1, self.m)
H = np.cast[config.floatX](self.model.rng.uniform(0.,1.,H.shape))
Mu1 = np.cast[config.floatX](self.model.rng.uniform(-5.,5.,Mu1.shape))
H_var = T.matrix(name='H_var')
H_var.tag.test_value = H
Mu1_var = T.matrix(name='Mu1_var')
Mu1_var.tag.test_value = Mu1
idx = T.iscalar()
idx.tag.test_value = 0
S = e_step.infer_S_hat(V = X, H_hat = H_var, S_hat = Mu1_var)
s_idx = S[:,idx]
s_i_func = function([H_var,Mu1_var,idx],s_idx)
sigma0 = 1. / model.alpha
Sigma1 = e_step.infer_var_s1_hat()
mu0 = T.zeros_like(model.mu)
#by truncated KL, I mean that I am dropping terms that don't depend on H and Mu1
# (they don't affect the outcome of this test and some of them are intractable )
trunc_kl = - model.entropy_hs(H_hat = H_var, var_s0_hat = sigma0, var_s1_hat = Sigma1) + \
model.expected_energy_vhs(V = X, H_hat = H_var, S_hat = Mu1_var, var_s0_hat = sigma0, var_s1_hat = Sigma1)
grad_Mu1 = T.grad(trunc_kl.sum(), Mu1_var)
grad_Mu1_idx = grad_Mu1[:,idx]
grad_func = function([H_var, Mu1_var, idx], grad_Mu1_idx)
for i in xrange(self.N):
Mu1[:,i] = s_i_func(H, Mu1, i)
g = grad_func(H,Mu1,i)
assert not contains_nan(g)
g_abs_max = np.abs(g).max()
if g_abs_max > self.tol:
raise Exception('after mean field step, gradient of kl divergence wrt mean field parameter should be 0, but here the max magnitude of a gradient element is '+str(g_abs_max)+' after updating s_'+str(i))
def test_value_s(self):
"tests that the value of the kl divergence decreases with each update to s_i "
model = self.model
e_step = self.e_step
X = self.X
assert X.shape[0] == self.m
init_H = e_step.init_H_hat(V = X)
init_Mu1 = e_step.init_S_hat(V = X)
prev_setting = config.compute_test_value
config.compute_test_value= 'off'
H, Mu1 = function([], outputs=[init_H, init_Mu1])()
config.compute_test_value = prev_setting
H = broadcast(H, self.m)
Mu1 = broadcast(Mu1, self.m)
H = np.cast[config.floatX](self.model.rng.uniform(0.,1.,H.shape))
Mu1 = np.cast[config.floatX](self.model.rng.uniform(-5.,5.,Mu1.shape))
H_var = T.matrix(name='H_var')
H_var.tag.test_value = H
Mu1_var = T.matrix(name='Mu1_var')
Mu1_var.tag.test_value = Mu1
idx = T.iscalar()
idx.tag.test_value = 0
S = e_step.infer_S_hat( V = X, H_hat = H_var, S_hat = Mu1_var)
s_idx = S[:,idx]
s_i_func = function([H_var,Mu1_var,idx],s_idx)
sigma0 = 1. / model.alpha
Sigma1 = e_step.infer_var_s1_hat()
mu0 = T.zeros_like(model.mu)
#by truncated KL, I mean that I am dropping terms that don't depend on H and Mu1
# (they don't affect the outcome of this test and some of them are intractable )
trunc_kl = - model.entropy_hs(H_hat = H_var, var_s0_hat = sigma0, var_s1_hat = Sigma1) + \
model.expected_energy_vhs(V = X, H_hat = H_var, S_hat = Mu1_var, var_s0_hat = sigma0, var_s1_hat = Sigma1)
trunc_kl_func = function([H_var, Mu1_var], trunc_kl)
for i in xrange(self.N):
prev_kl = trunc_kl_func(H,Mu1)
Mu1[:,i] = s_i_func(H, Mu1, i)
new_kl = trunc_kl_func(H,Mu1)
increase = new_kl - prev_kl
mx = increase.max()
if mx > 1e-3:
raise Exception('after mean field step in s, kl divergence should decrease, but some elements increased by as much as '+str(mx)+' after updating s_'+str(i))
def test_grad_h(self):
"tests that the gradients with respect to h_i are 0 after doing a mean field update of h_i "
model = self.model
e_step = self.e_step
X = self.X
assert X.shape[0] == self.m
init_H = e_step.init_H_hat(V = X)
init_Mu1 = e_step.init_S_hat(V = X)
prev_setting = config.compute_test_value
        config.compute_test_value = 'off'
H, Mu1 = function([], outputs=[init_H, init_Mu1])()
config.compute_test_value = prev_setting
H = broadcast(H, self.m)
Mu1 = broadcast(Mu1, self.m)
H = np.cast[config.floatX](self.model.rng.uniform(0.,1.,H.shape))
Mu1 = np.cast[config.floatX](self.model.rng.uniform(-5.,5.,Mu1.shape))
H_var = T.matrix(name='H_var')
H_var.tag.test_value = H
Mu1_var = T.matrix(name='Mu1_var')
Mu1_var.tag.test_value = Mu1
idx = T.iscalar()
idx.tag.test_value = 0
new_H = e_step.infer_H_hat(V = X, H_hat = H_var, S_hat = Mu1_var)
h_idx = new_H[:,idx]
updates_func = function([H_var,Mu1_var,idx], h_idx)
sigma0 = 1. / model.alpha
Sigma1 = e_step.infer_var_s1_hat()
mu0 = T.zeros_like(model.mu)
#by truncated KL, I mean that I am dropping terms that don't depend on H and Mu1
# (they don't affect the outcome of this test and some of them are intractable )
trunc_kl = - model.entropy_hs(H_hat = H_var, var_s0_hat = sigma0, var_s1_hat = Sigma1) + \
model.expected_energy_vhs(V = X, H_hat = H_var, S_hat = Mu1_var, var_s0_hat = sigma0,
var_s1_hat = Sigma1)
grad_H = T.grad(trunc_kl.sum(), H_var)
assert len(grad_H.type.broadcastable) == 2
#from theano.printing import min_informative_str
#print min_informative_str(grad_H)
#grad_H = Print('grad_H')(grad_H)
#grad_H_idx = grad_H[:,idx]
grad_func = function([H_var, Mu1_var], grad_H)
failed = False
for i in xrange(self.N):
rval = updates_func(H, Mu1, i)
H[:,i] = rval
g = grad_func(H,Mu1)[:,i]
assert not contains_nan(g)
g_abs_max = np.abs(g).max()
if g_abs_max > self.tol:
#print "new values of H"
#print H[:,i]
#print "gradient on new values of H"
#print g
failed = True
print('iteration ',i)
#print 'max value of new H: ',H[:,i].max()
#print 'H for failing g: '
failing_h = H[np.abs(g) > self.tol, i]
#print failing_h
#from matplotlib import pyplot as plt
#plt.scatter(H[:,i],g)
#plt.show()
#ignore failures extremely close to h=1
high_mask = failing_h > .001
low_mask = failing_h < .999
mask = high_mask * low_mask
print('masked failures: ',mask.shape[0],' err ',g_abs_max)
if mask.sum() > 0:
print('failing h passing the range mask')
print(failing_h[ mask.astype(bool) ])
raise Exception('after mean field step, gradient of kl divergence'
' wrt freshly updated variational parameter should be 0, '
'but here the max magnitude of a gradient element is '
+str(g_abs_max)+' after updating h_'+str(i))
#assert not failed
def test_value_h(self):
"tests that the value of the kl divergence decreases with each update to h_i "
model = self.model
e_step = self.e_step
X = self.X
assert X.shape[0] == self.m
init_H = e_step.init_H_hat(V = X)
init_Mu1 = e_step.init_S_hat(V = X)
prev_setting = config.compute_test_value
        config.compute_test_value = 'off'
H, Mu1 = function([], outputs=[init_H, init_Mu1])()
config.compute_test_value = prev_setting
H = broadcast(H, self.m)
Mu1 = broadcast(Mu1, self.m)
H = np.cast[config.floatX](self.model.rng.uniform(0.,1.,H.shape))
Mu1 = np.cast[config.floatX](self.model.rng.uniform(-5.,5.,Mu1.shape))
H_var = T.matrix(name='H_var')
H_var.tag.test_value = H
Mu1_var = T.matrix(name='Mu1_var')
Mu1_var.tag.test_value = Mu1
idx = T.iscalar()
idx.tag.test_value = 0
newH = e_step.infer_H_hat(V = X, H_hat = H_var, S_hat = Mu1_var)
h_idx = newH[:,idx]
h_i_func = function([H_var,Mu1_var,idx],h_idx)
sigma0 = 1. / model.alpha
Sigma1 = e_step.infer_var_s1_hat()
mu0 = T.zeros_like(model.mu)
#by truncated KL, I mean that I am dropping terms that don't depend on H and Mu1
# (they don't affect the outcome of this test and some of them are intractable )
trunc_kl = - model.entropy_hs(H_hat = H_var, var_s0_hat = sigma0, var_s1_hat = Sigma1) + \
model.expected_energy_vhs(V = X, H_hat = H_var, S_hat = Mu1_var, var_s0_hat = sigma0, var_s1_hat = Sigma1)
trunc_kl_func = function([H_var, Mu1_var], trunc_kl)
for i in xrange(self.N):
prev_kl = trunc_kl_func(H,Mu1)
H[:,i] = h_i_func(H, Mu1, i)
#we don't update mu, the whole point of the split e step is we don't have to
new_kl = trunc_kl_func(H,Mu1)
increase = new_kl - prev_kl
print('failures after iteration ',i,': ',(increase > self.tol).sum())
mx = increase.max()
if mx > 1e-4:
print('increase amounts of failing examples:')
print(increase[increase > self.tol])
print('failing H:')
print(H[increase > self.tol,:])
print('failing Mu1:')
print(Mu1[increase > self.tol,:])
print('failing V:')
print(X[increase > self.tol,:])
raise Exception('after mean field step in h, kl divergence should decrease, but some elements increased by as much as '+str(mx)+' after updating h_'+str(i))
if __name__ == '__main__':
obj = Test_S3C_Inference()
#obj.test_grad_h()
#obj.test_grad_s()
#obj.test_value_s()
obj.test_value_h()
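The coordinate-wise tests above all check the same two properties of a mean field (coordinate descent) step: the objective never increases, and the gradient in the freshly updated coordinate is zero. A minimal self-contained sketch of why both hold, using an assumed toy quadratic objective rather than the S3C truncated KL:

```python
# Toy illustration (assumed quadratic objective, not the S3C KL) of the
# properties the tests above check: an exact coordinate-wise minimization
# never increases the objective and zeros the gradient in that coordinate.
import numpy as np

A = np.array([[2.0, 0.5], [0.5, 1.0]])   # positive definite
b = np.array([1.0, -1.0])
f = lambda s: 0.5 * s @ A @ s - b @ s    # objective
grad = lambda s: A @ s - b               # its gradient

s = np.zeros(2)
for i in range(2):
    prev = f(s)
    # exact minimizer of f along coordinate i, holding the others fixed
    s[i] = (b[i] - A[i] @ s + A[i, i] * s[i]) / A[i, i]
    assert f(s) <= prev + 1e-12          # objective value never increases
    assert abs(grad(s)[i]) < 1e-12       # coordinate gradient is zero
```

The mean field updates play the role of the exact per-coordinate minimizer here, which is what makes the zero-gradient and monotone-decrease assertions reasonable things to test.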
| bsd-3-clause |
tawsifkhan/scikit-learn | examples/svm/plot_separating_hyperplane_unbalanced.py | 329 | 1850 | """
=================================================
SVM: Separating hyperplane for unbalanced classes
=================================================
Find the optimal separating hyperplane using an SVC for classes that
are unbalanced.
We first find the separating plane with a plain SVC and then plot
(dashed) the separating hyperplane with automatic correction for
unbalanced classes.
.. currentmodule:: sklearn.linear_model
.. note::
This example will also work by replacing ``SVC(kernel="linear")``
with ``SGDClassifier(loss="hinge")``. Setting the ``loss`` parameter
of the :class:`SGDClassifier` equal to ``hinge`` will yield behaviour
such as that of an SVC with a linear kernel.
For example try instead of the ``SVC``::
   clf = SGDClassifier(loss="hinge", n_iter=100, alpha=0.01)
"""
print(__doc__)
import numpy as np
import matplotlib.pyplot as plt
from sklearn import svm
#from sklearn.linear_model import SGDClassifier
# we create 40 separable points
rng = np.random.RandomState(0)
n_samples_1 = 1000
n_samples_2 = 100
X = np.r_[1.5 * rng.randn(n_samples_1, 2),
0.5 * rng.randn(n_samples_2, 2) + [2, 2]]
y = [0] * (n_samples_1) + [1] * (n_samples_2)
# fit the model and get the separating hyperplane
clf = svm.SVC(kernel='linear', C=1.0)
clf.fit(X, y)
w = clf.coef_[0]
a = -w[0] / w[1]
xx = np.linspace(-5, 5)
yy = a * xx - clf.intercept_[0] / w[1]
# get the separating hyperplane using weighted classes
wclf = svm.SVC(kernel='linear', class_weight={1: 10})
wclf.fit(X, y)
ww = wclf.coef_[0]
wa = -ww[0] / ww[1]
wyy = wa * xx - wclf.intercept_[0] / ww[1]
# plot separating hyperplanes and samples
h0 = plt.plot(xx, yy, 'k-', label='no weights')
h1 = plt.plot(xx, wyy, 'k--', label='with weights')
plt.scatter(X[:, 0], X[:, 1], c=y, cmap=plt.cm.Paired)
plt.legend()
plt.axis('tight')
plt.show()
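The plotting lines above turn each fitted hyperplane `w·x + b = 0` into a line `y = a*x - b/w[1]`. A short sketch of that algebra with assumed toy coefficients (not taken from the fitted models):

```python
# Sketch of the decision-boundary algebra used above, with assumed toy
# coefficients: points on the hyperplane satisfy w[0]*x + w[1]*y + b = 0,
# so y = -(w[0]*x + b)/w[1], i.e. slope a = -w[0]/w[1], intercept -b/w[1].
import numpy as np

w = np.array([0.5, 1.0])   # assumed coefficient vector
b = -1.0                   # assumed intercept term
xx = np.array([-5.0, 0.0, 5.0])
a = -w[0] / w[1]
yy = a * xx - b / w[1]
# equivalently: yy = -(w[0] * xx + b) / w[1]
assert np.allclose(yy, -(w[0] * xx + b) / w[1])
```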
| bsd-3-clause |
joernhees/scikit-learn | examples/ensemble/plot_adaboost_multiclass.py | 354 | 4124 | """
=====================================
Multi-class AdaBoosted Decision Trees
=====================================
This example reproduces Figure 1 of Zhu et al. [1] and shows how boosting can
improve prediction accuracy on a multi-class problem. The classification
dataset is constructed by taking a ten-dimensional standard normal distribution
and defining three classes separated by nested concentric ten-dimensional
spheres such that roughly equal numbers of samples are in each class (quantiles
of the :math:`\chi^2` distribution).
The performance of the SAMME and SAMME.R [1] algorithms is compared. SAMME.R
uses the probability estimates to update the additive model, while SAMME uses
the classifications only. As the example illustrates, the SAMME.R algorithm
typically converges faster than SAMME, achieving a lower test error with fewer
boosting iterations. The error of each algorithm on the test set after each
boosting iteration is shown on the left, the classification error on the test
set of each tree is shown in the middle, and the boost weight of each tree is
shown on the right. All trees have a weight of one in the SAMME.R algorithm and
therefore are not shown.
.. [1] J. Zhu, H. Zou, S. Rosset, T. Hastie, "Multi-class AdaBoost", 2009.
"""
print(__doc__)
# Author: Noel Dawe <noel.dawe@gmail.com>
#
# License: BSD 3 clause
from sklearn.externals.six.moves import zip
import matplotlib.pyplot as plt
from sklearn.datasets import make_gaussian_quantiles
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import accuracy_score
from sklearn.tree import DecisionTreeClassifier
X, y = make_gaussian_quantiles(n_samples=13000, n_features=10,
n_classes=3, random_state=1)
n_split = 3000
X_train, X_test = X[:n_split], X[n_split:]
y_train, y_test = y[:n_split], y[n_split:]
bdt_real = AdaBoostClassifier(
DecisionTreeClassifier(max_depth=2),
n_estimators=600,
learning_rate=1)
bdt_discrete = AdaBoostClassifier(
DecisionTreeClassifier(max_depth=2),
n_estimators=600,
learning_rate=1.5,
algorithm="SAMME")
bdt_real.fit(X_train, y_train)
bdt_discrete.fit(X_train, y_train)
real_test_errors = []
discrete_test_errors = []
for real_test_predict, discrete_train_predict in zip(
bdt_real.staged_predict(X_test), bdt_discrete.staged_predict(X_test)):
real_test_errors.append(
1. - accuracy_score(real_test_predict, y_test))
discrete_test_errors.append(
1. - accuracy_score(discrete_train_predict, y_test))
n_trees_discrete = len(bdt_discrete)
n_trees_real = len(bdt_real)
# Boosting might terminate early, but the following arrays are always
# n_estimators long. We crop them to the actual number of trees here:
discrete_estimator_errors = bdt_discrete.estimator_errors_[:n_trees_discrete]
real_estimator_errors = bdt_real.estimator_errors_[:n_trees_real]
discrete_estimator_weights = bdt_discrete.estimator_weights_[:n_trees_discrete]
plt.figure(figsize=(15, 5))
plt.subplot(131)
plt.plot(range(1, n_trees_discrete + 1),
discrete_test_errors, c='black', label='SAMME')
plt.plot(range(1, n_trees_real + 1),
real_test_errors, c='black',
linestyle='dashed', label='SAMME.R')
plt.legend()
plt.ylim(0.18, 0.62)
plt.ylabel('Test Error')
plt.xlabel('Number of Trees')
plt.subplot(132)
plt.plot(range(1, n_trees_discrete + 1), discrete_estimator_errors,
"b", label='SAMME', alpha=.5)
plt.plot(range(1, n_trees_real + 1), real_estimator_errors,
"r", label='SAMME.R', alpha=.5)
plt.legend()
plt.ylabel('Error')
plt.xlabel('Number of Trees')
plt.ylim((.2,
max(real_estimator_errors.max(),
discrete_estimator_errors.max()) * 1.2))
plt.xlim((-20, len(bdt_discrete) + 20))
plt.subplot(133)
plt.plot(range(1, n_trees_discrete + 1), discrete_estimator_weights,
"b", label='SAMME')
plt.legend()
plt.ylabel('Weight')
plt.xlabel('Number of Trees')
plt.ylim((0, discrete_estimator_weights.max() * 1.2))
plt.xlim((-20, n_trees_discrete + 20))
# prevent overlapping y-axis labels
plt.subplots_adjust(wspace=0.25)
plt.show()
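The staged-error loop above records `1 - accuracy` after each boosting stage. A tiny pure-Python sketch of that bookkeeping with made-up staged predictions (not real AdaBoost output):

```python
# Tiny sketch of the staged test-error bookkeeping above, with made-up
# per-stage predictions: each stage's error is 1 - accuracy against the
# true labels, and it typically shrinks as stages are added.
staged_preds = [[0, 1, 1], [0, 0, 1], [0, 0, 2]]  # assumed per-stage predictions
y_true = [0, 0, 2]

errors = [sum(p != t for p, t in zip(stage, y_true)) / len(y_true)
          for stage in staged_preds]
# errors -> [2/3, 1/3, 0.0]
```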
| bsd-3-clause |
cactusbin/nyt | matplotlib/lib/matplotlib/afm.py | 4 | 16093 | """
This is a python interface to Adobe Font Metrics Files. Although a
number of other python implementations exist, and may be more complete
than this, it was decided not to go with them because they were
either:
1) copyrighted or used a non-BSD compatible license
2) had too many dependencies and a free standing lib was needed
3) Did more than needed and it was easier to write afresh rather than
figure out how to get just what was needed.
It is pretty easy to use, and requires only built-in python libs:
>>> from matplotlib import rcParams
>>> import os.path
>>> afm_fname = os.path.join(rcParams['datapath'],
... 'fonts', 'afm', 'ptmr8a.afm')
>>>
>>> from matplotlib.afm import AFM
>>> afm = AFM(open(afm_fname))
>>> afm.string_width_height('What the heck?')
(6220.0, 694)
>>> afm.get_fontname()
'Times-Roman'
>>> afm.get_kern_dist('A', 'f')
0
>>> afm.get_kern_dist('A', 'y')
-92.0
>>> afm.get_bbox_char('!')
[130, -9, 238, 676]
"""
from __future__ import print_function
import sys
import os
import re
from _mathtext_data import uni2type1
# Convert string to a python type
# some afm files have floats where we are expecting ints -- there is
# probably a better way to handle this (support floats, round rather
# than truncate). But I don't know what the best approach is now and
# this change to _to_int should at least prevent mpl from crashing on
# these JDH (2009-11-06)
def _to_int(x):
return int(float(x))
_to_float = float
if sys.version_info[0] >= 3:
def _to_str(x):
return x.decode('utf8')
else:
_to_str = str
def _to_list_of_ints(s):
s = s.replace(b',', b' ')
return [_to_int(val) for val in s.split()]
def _to_list_of_floats(s):
return [_to_float(val) for val in s.split()]
def _to_bool(s):
if s.lower().strip() in (b'false', b'0', b'no'):
return False
else:
return True
def _sanity_check(fh):
"""
Check if the file at least looks like AFM.
If not, raise :exc:`RuntimeError`.
"""
# Remember the file position in case the caller wants to
# do something else with the file.
pos = fh.tell()
try:
line = fh.readline()
finally:
fh.seek(pos, 0)
# AFM spec, Section 4: The StartFontMetrics keyword [followed by a
# version number] must be the first line in the file, and the
# EndFontMetrics keyword must be the last non-empty line in the
# file. We just check the first line.
if not line.startswith(b'StartFontMetrics'):
raise RuntimeError('Not an AFM file')
def _parse_header(fh):
"""
Reads the font metrics header (up to the char metrics) and returns
a dictionary mapping *key* to *val*. *val* will be converted to the
appropriate python type as necessary; e.g.:
* 'False'->False
* '0'->0
* '-168 -218 1000 898'-> [-168, -218, 1000, 898]
Dictionary keys are
StartFontMetrics, FontName, FullName, FamilyName, Weight,
ItalicAngle, IsFixedPitch, FontBBox, UnderlinePosition,
UnderlineThickness, Version, Notice, EncodingScheme, CapHeight,
XHeight, Ascender, Descender, StartCharMetrics
"""
headerConverters = {
b'StartFontMetrics': _to_float,
b'FontName': _to_str,
b'FullName': _to_str,
b'FamilyName': _to_str,
b'Weight': _to_str,
b'ItalicAngle': _to_float,
b'IsFixedPitch': _to_bool,
b'FontBBox': _to_list_of_ints,
b'UnderlinePosition': _to_int,
b'UnderlineThickness': _to_int,
b'Version': _to_str,
b'Notice': _to_str,
b'EncodingScheme': _to_str,
b'CapHeight': _to_float, # Is the second version a mistake, or
b'Capheight': _to_float, # do some AFM files contain 'Capheight'? -JKS
b'XHeight': _to_float,
b'Ascender': _to_float,
b'Descender': _to_float,
b'StdHW': _to_float,
b'StdVW': _to_float,
b'StartCharMetrics': _to_int,
b'CharacterSet': _to_str,
b'Characters': _to_int,
}
d = {}
while 1:
line = fh.readline()
if not line:
break
line = line.rstrip()
if line.startswith(b'Comment'):
continue
lst = line.split(b' ', 1)
#print '%-s\t%-d line :: %-s' % ( fh.name, len(lst), line )
key = lst[0]
if len(lst) == 2:
val = lst[1]
else:
val = b''
#key, val = line.split(' ', 1)
try:
d[key] = headerConverters[key](val)
except ValueError:
print('Value error parsing header in AFM:',
key, val, file=sys.stderr)
continue
except KeyError:
print('Found an unknown keyword in AFM header (was %s)' % key,
file=sys.stderr)
continue
if key == b'StartCharMetrics':
return d
raise RuntimeError('Bad parse')
def _parse_char_metrics(fh):
"""
Return a character metric dictionary. Keys are the ASCII num of
the character, values are a (*wx*, *name*, *bbox*) tuple, where
*wx* is the character width, *name* is the postscript language
name, and *bbox* is a (*llx*, *lly*, *urx*, *ury*) tuple.
This function is incomplete per the standard, but thus far parses
all the sample afm files tried.
"""
ascii_d = {}
name_d = {}
while 1:
line = fh.readline()
if not line:
break
line = line.rstrip()
if line.startswith(b'EndCharMetrics'):
return ascii_d, name_d
vals = line.split(b';')[:4]
if len(vals) != 4:
raise RuntimeError('Bad char metrics line: %s' % line)
num = _to_int(vals[0].split()[1])
wx = _to_float(vals[1].split()[1])
name = vals[2].split()[1]
name = name.decode('ascii')
bbox = _to_list_of_floats(vals[3][2:])
bbox = map(int, bbox)
# Workaround: If the character name is 'Euro', give it the
# corresponding character code, according to WinAnsiEncoding (see PDF
# Reference).
if name == 'Euro':
num = 128
if num != -1:
ascii_d[num] = (wx, name, bbox)
name_d[name] = (wx, bbox)
raise RuntimeError('Bad parse')
def _parse_kern_pairs(fh):
"""
Return a kern pairs dictionary; keys are (*char1*, *char2*) tuples and
values are the kern pair value. For example, a kern pairs line like
``KPX A y -50``
will be represented as::
d[ ('A', 'y') ] = -50
"""
line = fh.readline()
if not line.startswith(b'StartKernPairs'):
raise RuntimeError('Bad start of kern pairs data: %s' % line)
d = {}
while 1:
line = fh.readline()
if not line:
break
line = line.rstrip()
if len(line) == 0:
continue
if line.startswith(b'EndKernPairs'):
fh.readline() # EndKernData
return d
vals = line.split()
if len(vals) != 4 or vals[0] != b'KPX':
raise RuntimeError('Bad kern pairs line: %s' % line)
c1, c2, val = _to_str(vals[1]), _to_str(vals[2]), _to_float(vals[3])
d[(c1, c2)] = val
raise RuntimeError('Bad kern pairs parse')
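A minimal standalone sketch of the `KPX c1 c2 val` record shape that `_parse_kern_pairs` reads, using assumed sample bytes rather than a real AFM file:

```python
# Minimal standalone sketch (assumed sample bytes, not a real AFM file)
# of parsing "KPX c1 c2 val" records into a {(char1, char2): kern_value}
# dictionary, as _parse_kern_pairs above does.
import io

def parse_kpx(fh):
    d = {}
    for raw in fh:
        vals = raw.split()
        if len(vals) == 4 and vals[0] == b'KPX':
            d[(vals[1].decode('ascii'), vals[2].decode('ascii'))] = float(vals[3])
    return d

sample = io.BytesIO(b"KPX A y -92\nKPX A f 0\n")
pairs = parse_kpx(sample)
# pairs -> {('A', 'y'): -92.0, ('A', 'f'): 0.0}
```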
def _parse_composites(fh):
"""
Return a composites dictionary. Keys are the names of the
composites. Values are a num parts list of composite information,
with each element being a (*name*, *dx*, *dy*) tuple. Thus a
composites line reading:
CC Aacute 2 ; PCC A 0 0 ; PCC acute 160 170 ;
will be represented as::
d['Aacute'] = [ ('A', 0, 0), ('acute', 160, 170) ]
"""
d = {}
while 1:
line = fh.readline()
if not line:
break
line = line.rstrip()
if len(line) == 0:
continue
if line.startswith(b'EndComposites'):
return d
vals = line.split(b';')
cc = vals[0].split()
name, numParts = cc[1], _to_int(cc[2])
pccParts = []
for s in vals[1:-1]:
pcc = s.split()
name, dx, dy = pcc[1], _to_float(pcc[2]), _to_float(pcc[3])
pccParts.append((name, dx, dy))
d[name] = pccParts
raise RuntimeError('Bad composites parse')
def _parse_optional(fh):
"""
Parse the optional fields for kern pair data and composites
return value is a (*kernDict*, *compositeDict*) which are the
return values from :func:`_parse_kern_pairs`, and
:func:`_parse_composites` if the data exists, or empty dicts
otherwise
"""
optional = {
b'StartKernData': _parse_kern_pairs,
b'StartComposites': _parse_composites,
}
d = {b'StartKernData': {}, b'StartComposites': {}}
while 1:
line = fh.readline()
if not line:
break
line = line.rstrip()
if len(line) == 0:
continue
key = line.split()[0]
if key in optional:
d[key] = optional[key](fh)
l = (d[b'StartKernData'], d[b'StartComposites'])
return l
def parse_afm(fh):
"""
    Parse the Adobe Font Metrics file in file handle *fh*. Return value
is a (*dhead*, *dcmetrics*, *dkernpairs*, *dcomposite*) tuple where
*dhead* is a :func:`_parse_header` dict, *dcmetrics* is a
:func:`_parse_composites` dict, *dkernpairs* is a
:func:`_parse_kern_pairs` dict (possibly {}), and *dcomposite* is a
:func:`_parse_composites` dict (possibly {})
"""
_sanity_check(fh)
dhead = _parse_header(fh)
dcmetrics_ascii, dcmetrics_name = _parse_char_metrics(fh)
doptional = _parse_optional(fh)
return dhead, dcmetrics_ascii, dcmetrics_name, doptional[0], doptional[1]
class AFM(object):
def __init__(self, fh):
"""
Parse the AFM file in file object *fh*
"""
(dhead, dcmetrics_ascii, dcmetrics_name, dkernpairs, dcomposite) = \
parse_afm(fh)
self._header = dhead
self._kern = dkernpairs
self._metrics = dcmetrics_ascii
self._metrics_by_name = dcmetrics_name
self._composite = dcomposite
def get_bbox_char(self, c, isord=False):
if not isord:
c = ord(c)
wx, name, bbox = self._metrics[c]
return bbox
def string_width_height(self, s):
"""
Return the string width (including kerning) and string height
as a (*w*, *h*) tuple.
"""
if not len(s):
return 0, 0
totalw = 0
namelast = None
miny = 1e9
maxy = 0
for c in s:
if c == '\n':
continue
wx, name, bbox = self._metrics[ord(c)]
l, b, w, h = bbox
# find the width with kerning
try:
kp = self._kern[(namelast, name)]
except KeyError:
kp = 0
totalw += wx + kp
# find the max y
thismax = b + h
if thismax > maxy:
maxy = thismax
# find the min y
thismin = b
if thismin < miny:
miny = thismin
namelast = name
return totalw, maxy - miny
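The width accumulation in `string_width_height` advances each glyph by its width plus the kern adjustment for the (previous, current) pair. A toy sketch with invented widths and kern values (not real font metrics):

```python
# Toy sketch (invented widths and kern pairs, not real font metrics) of
# the kerned-width accumulation performed by string_width_height above.
metrics = {'A': 722.0, 'y': 500.0}   # assumed WX advance widths
kern = {('A', 'y'): -92.0}           # assumed kern pairs

def kerned_width(s):
    total, last = 0.0, None
    for c in s:
        # width of c plus kerning against the previous glyph (0 if none)
        total += metrics[c] + kern.get((last, c), 0.0)
        last = c
    return total

# kerned_width('Ay') -> 722.0 + 500.0 - 92.0 == 1130.0
```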
def get_str_bbox_and_descent(self, s):
"""
Return the string bounding box
"""
if not len(s):
return 0, 0, 0, 0
totalw = 0
namelast = None
miny = 1e9
maxy = 0
left = 0
if not isinstance(s, unicode):
s = s.decode('ascii')
for c in s:
if c == '\n':
continue
name = uni2type1.get(ord(c), 'question')
try:
wx, bbox = self._metrics_by_name[name]
except KeyError:
name = 'question'
wx, bbox = self._metrics_by_name[name]
l, b, w, h = bbox
if l < left:
left = l
# find the width with kerning
try:
kp = self._kern[(namelast, name)]
except KeyError:
kp = 0
totalw += wx + kp
# find the max y
thismax = b + h
if thismax > maxy:
maxy = thismax
# find the min y
thismin = b
if thismin < miny:
miny = thismin
namelast = name
return left, miny, totalw, maxy - miny, -miny
def get_str_bbox(self, s):
"""
Return the string bounding box
"""
return self.get_str_bbox_and_descent(s)[:4]
def get_name_char(self, c, isord=False):
"""
        Get the name of the character, i.e., ';' is 'semicolon'
"""
if not isord:
c = ord(c)
wx, name, bbox = self._metrics[c]
return name
def get_width_char(self, c, isord=False):
"""
Get the width of the character from the character metric WX
field
"""
if not isord:
c = ord(c)
wx, name, bbox = self._metrics[c]
return wx
def get_width_from_char_name(self, name):
"""
Get the width of the character from a type1 character name
"""
wx, bbox = self._metrics_by_name[name]
return wx
def get_height_char(self, c, isord=False):
"""
Get the height of character *c* from the bounding box. This
is the ink height (space is 0)
"""
if not isord:
c = ord(c)
wx, name, bbox = self._metrics[c]
return bbox[-1]
def get_kern_dist(self, c1, c2):
"""
Return the kerning pair distance (possibly 0) for chars *c1*
and *c2*
"""
name1, name2 = self.get_name_char(c1), self.get_name_char(c2)
return self.get_kern_dist_from_name(name1, name2)
def get_kern_dist_from_name(self, name1, name2):
"""
Return the kerning pair distance (possibly 0) for chars
*name1* and *name2*
"""
try:
return self._kern[(name1, name2)]
        except KeyError:
return 0
def get_fontname(self):
"Return the font name, e.g., 'Times-Roman'"
return self._header[b'FontName']
def get_fullname(self):
"Return the font full name, e.g., 'Times-Roman'"
name = self._header.get(b'FullName')
if name is None: # use FontName as a substitute
name = self._header[b'FontName']
return name
def get_familyname(self):
"Return the font family name, e.g., 'Times'"
name = self._header.get(b'FamilyName')
if name is not None:
return name
# FamilyName not specified so we'll make a guess
name = self.get_fullname()
extras = br'(?i)([ -](regular|plain|italic|oblique|bold|semibold|light|ultralight|extra|condensed))+$'
return re.sub(extras, '', name)
def get_weight(self):
"Return the font weight, e.g., 'Bold' or 'Roman'"
return self._header[b'Weight']
def get_angle(self):
"Return the fontangle as float"
return self._header[b'ItalicAngle']
def get_capheight(self):
"Return the cap height as float"
return self._header[b'CapHeight']
def get_xheight(self):
"Return the xheight as float"
return self._header[b'XHeight']
def get_underline_thickness(self):
"Return the underline thickness as float"
return self._header[b'UnderlineThickness']
def get_horizontal_stem_width(self):
"""
Return the standard horizontal stem width as float, or *None* if
not specified in AFM file.
"""
return self._header.get(b'StdHW', None)
def get_vertical_stem_width(self):
"""
Return the standard vertical stem width as float, or *None* if
not specified in AFM file.
"""
return self._header.get(b'StdVW', None) | unlicense |
wholmgren/pvlib-python | pvlib/test/test_midc.py | 2 | 2266 | import inspect
import os
import pandas as pd
from pandas.util.testing import network
import pytest
import pytz
from pvlib.iotools import midc
@pytest.fixture
def test_mapping():
return {
'Direct Normal [W/m^2]': 'dni',
'Global PSP [W/m^2]': 'ghi',
'Rel Humidity [%]': 'relative_humidity',
'Temperature @ 2m [deg C]': 'temp_air',
'Non Existant': 'variable',
}
test_dir = os.path.dirname(
os.path.abspath(inspect.getfile(inspect.currentframe())))
midc_testfile = os.path.join(test_dir, '../data/midc_20181014.txt')
midc_raw_testfile = os.path.join(test_dir, '../data/midc_raw_20181018.txt')
midc_network_testfile = ('https://midcdmz.nrel.gov/apps/data_api.pl'
'?site=UAT&begin=20181018&end=20181019')
def test_midc_format_index():
data = pd.read_csv(midc_testfile)
data = midc.format_index(data)
start = pd.Timestamp("20181014 00:00")
start = start.tz_localize("MST")
end = pd.Timestamp("20181014 23:59")
end = end.tz_localize("MST")
assert type(data.index) == pd.DatetimeIndex
assert data.index[0] == start
assert data.index[-1] == end
def test_midc_format_index_tz_conversion():
data = pd.read_csv(midc_testfile)
data = data.rename(columns={'MST': 'PST'})
data = midc.format_index(data)
assert data.index[0].tz == pytz.timezone('Etc/GMT+8')
def test_midc_format_index_raw():
data = pd.read_csv(midc_raw_testfile)
data = midc.format_index_raw(data)
start = pd.Timestamp('20181018 00:00')
start = start.tz_localize('MST')
end = pd.Timestamp('20181018 23:59')
end = end.tz_localize('MST')
assert data.index[0] == start
assert data.index[-1] == end
def test_read_midc_var_mapping_as_arg(test_mapping):
data = midc.read_midc(midc_testfile, variable_map=test_mapping)
assert 'ghi' in data.columns
assert 'temp_air' in data.columns
@network
def test_read_midc_raw_data_from_nrel():
start_ts = pd.Timestamp('20181018')
end_ts = pd.Timestamp('20181019')
var_map = midc.MIDC_VARIABLE_MAP['UAT']
data = midc.read_midc_raw_data_from_nrel('UAT', start_ts, end_ts, var_map)
for k, v in var_map.items():
assert v in data.columns
assert data.index.size == 2880
| bsd-3-clause |
Titan-C/scikit-learn | examples/linear_model/plot_multi_task_lasso_support.py | 77 | 2319 | #!/usr/bin/env python
"""
=============================================
Joint feature selection with multi-task Lasso
=============================================
The multi-task lasso allows fitting multiple regression problems
jointly while enforcing the selected features to be the same across
tasks. This example simulates sequential measurements: each task
is a time instant, and the relevant features vary in amplitude
over time while remaining the same. The multi-task lasso imposes that
features selected at one time point are selected for all time
points. This makes feature selection by the Lasso more stable.
"""
print(__doc__)
# Author: Alexandre Gramfort <alexandre.gramfort@inria.fr>
# License: BSD 3 clause
import matplotlib.pyplot as plt
import numpy as np
from sklearn.linear_model import MultiTaskLasso, Lasso
rng = np.random.RandomState(42)
# Generate some 2D coefficients with sine waves with random frequency and phase
n_samples, n_features, n_tasks = 100, 30, 40
n_relevant_features = 5
coef = np.zeros((n_tasks, n_features))
times = np.linspace(0, 2 * np.pi, n_tasks)
for k in range(n_relevant_features):
coef[:, k] = np.sin((1. + rng.randn(1)) * times + 3 * rng.randn(1))
X = rng.randn(n_samples, n_features)
Y = np.dot(X, coef.T) + rng.randn(n_samples, n_tasks)
coef_lasso_ = np.array([Lasso(alpha=0.5).fit(X, y).coef_ for y in Y.T])
coef_multi_task_lasso_ = MultiTaskLasso(alpha=1.).fit(X, Y).coef_
# #############################################################################
# Plot support and time series
fig = plt.figure(figsize=(8, 5))
plt.subplot(1, 2, 1)
plt.spy(coef_lasso_)
plt.xlabel('Feature')
plt.ylabel('Time (or Task)')
plt.text(10, 5, 'Lasso')
plt.subplot(1, 2, 2)
plt.spy(coef_multi_task_lasso_)
plt.xlabel('Feature')
plt.ylabel('Time (or Task)')
plt.text(10, 5, 'MultiTaskLasso')
fig.suptitle('Coefficient non-zero location')
feature_to_plot = 0
plt.figure()
lw = 2
plt.plot(coef[:, feature_to_plot], color='seagreen', linewidth=lw,
label='Ground truth')
plt.plot(coef_lasso_[:, feature_to_plot], color='cornflowerblue', linewidth=lw,
label='Lasso')
plt.plot(coef_multi_task_lasso_[:, feature_to_plot], color='gold', linewidth=lw,
label='MultiTaskLasso')
plt.legend(loc='upper center')
plt.axis('tight')
plt.ylim([-1.1, 1.1])
plt.show()
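The "shared support" property the example describes can be checked directly on a coefficient matrix: a feature is either active in every task's row or zero in all of them. A small numpy sketch with a toy coefficient matrix (not the fitted one above):

```python
# Small numpy sketch (toy coefficient matrix, not the fitted one above) of
# the shared-support property the MultiTaskLasso enforces.
import numpy as np

coef_toy = np.array([[0.0, 1.2, 0.0],
                     [0.0, 0.9, 0.0]])       # (n_tasks, n_features), assumed
active_per_task = coef_toy != 0
shared = np.all(active_per_task, axis=0)     # active in every task
never = np.all(~active_per_task, axis=0)     # zero in every task
assert np.all(shared | never)                # no feature is "half selected"
```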
| bsd-3-clause |
bennlich/scikit-image | doc/examples/plot_censure.py | 23 | 1079 | """
========================
CENSURE feature detector
========================
The CENSURE feature detector is a scale-invariant center-surround detector
(CENSURE) that claims to outperform other detectors and is capable of real-time
implementation.
"""
from skimage import data
from skimage import transform as tf
from skimage.feature import CENSURE
from skimage.color import rgb2gray
import matplotlib.pyplot as plt
img1 = rgb2gray(data.astronaut())
tform = tf.AffineTransform(scale=(1.5, 1.5), rotation=0.5,
translation=(150, -200))
img2 = tf.warp(img1, tform)
detector = CENSURE()
fig, ax = plt.subplots(nrows=1, ncols=2)
plt.gray()
detector.detect(img1)
ax[0].imshow(img1)
ax[0].axis('off')
ax[0].scatter(detector.keypoints[:, 1], detector.keypoints[:, 0],
2 ** detector.scales, facecolors='none', edgecolors='r')
detector.detect(img2)
ax[1].imshow(img2)
ax[1].axis('off')
ax[1].scatter(detector.keypoints[:, 1], detector.keypoints[:, 0],
2 ** detector.scales, facecolors='none', edgecolors='r')
plt.show()
| bsd-3-clause |
cactusbin/nyt | matplotlib/examples/statistics/histogram_demo_features.py | 7 | 1039 | """
Demo of the histogram (hist) function with a few features.
In addition to the basic histogram, this demo shows a few optional features:
* Setting the number of data bins
* The ``normed`` flag, which normalizes bin heights so that the integral of
the histogram is 1. The resulting histogram is a probability density.
* Setting the face color of the bars
* Setting the opacity (alpha value).
"""
import numpy as np
import matplotlib.mlab as mlab
import matplotlib.pyplot as plt
# example data
mu = 100 # mean of distribution
sigma = 15 # standard deviation of distribution
x = mu + sigma * np.random.randn(10000)
num_bins = 50
# the histogram of the data
n, bins, patches = plt.hist(x, num_bins, normed=1, facecolor='green', alpha=0.5)
# add a 'best fit' line
y = mlab.normpdf(bins, mu, sigma)
plt.plot(bins, y, 'r--')
plt.xlabel('Smarts')
plt.ylabel('Probability')
plt.title(r'Histogram of IQ: $\mu=100$, $\sigma=15$')
# Tweak spacing to prevent clipping of ylabel
plt.subplots_adjust(left=0.15)
plt.show()
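A quick check of the normalization the ``normed`` flag performs (newer numpy and matplotlib versions expose it as ``density``): bin heights times bin widths sum to 1, so the histogram is a probability density.

```python
# Verify that density-normalized bin heights integrate to one.
import numpy as np

x = np.array([1.0, 2.0, 2.0, 3.0])            # toy data
n, bins = np.histogram(x, bins=2, density=True)
widths = np.diff(bins)
assert abs((n * widths).sum() - 1.0) < 1e-12  # the density integrates to one
```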
| unlicense |
tomlof/scikit-learn | sklearn/tests/test_kernel_ridge.py | 342 | 3027 | import numpy as np
import scipy.sparse as sp
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.kernel_ridge import KernelRidge
from sklearn.metrics.pairwise import pairwise_kernels
from sklearn.utils.testing import ignore_warnings
from sklearn.utils.testing import assert_array_almost_equal
X, y = make_regression(n_features=10)
Xcsr = sp.csr_matrix(X)
Xcsc = sp.csc_matrix(X)
Y = np.array([y, y]).T
def test_kernel_ridge():
pred = Ridge(alpha=1, fit_intercept=False).fit(X, y).predict(X)
pred2 = KernelRidge(kernel="linear", alpha=1).fit(X, y).predict(X)
assert_array_almost_equal(pred, pred2)
def test_kernel_ridge_csr():
pred = Ridge(alpha=1, fit_intercept=False,
solver="cholesky").fit(Xcsr, y).predict(Xcsr)
pred2 = KernelRidge(kernel="linear", alpha=1).fit(Xcsr, y).predict(Xcsr)
assert_array_almost_equal(pred, pred2)
def test_kernel_ridge_csc():
pred = Ridge(alpha=1, fit_intercept=False,
solver="cholesky").fit(Xcsc, y).predict(Xcsc)
pred2 = KernelRidge(kernel="linear", alpha=1).fit(Xcsc, y).predict(Xcsc)
assert_array_almost_equal(pred, pred2)
def test_kernel_ridge_singular_kernel():
# alpha=0 causes a LinAlgError in computing the dual coefficients,
# which causes a fallback to a lstsq solver. This is tested here.
pred = Ridge(alpha=0, fit_intercept=False).fit(X, y).predict(X)
kr = KernelRidge(kernel="linear", alpha=0)
ignore_warnings(kr.fit)(X, y)
pred2 = kr.predict(X)
assert_array_almost_equal(pred, pred2)
def test_kernel_ridge_precomputed():
for kernel in ["linear", "rbf", "poly", "cosine"]:
K = pairwise_kernels(X, X, metric=kernel)
pred = KernelRidge(kernel=kernel).fit(X, y).predict(X)
pred2 = KernelRidge(kernel="precomputed").fit(K, y).predict(K)
assert_array_almost_equal(pred, pred2)
def test_kernel_ridge_precomputed_kernel_unchanged():
K = np.dot(X, X.T)
K2 = K.copy()
KernelRidge(kernel="precomputed").fit(K, y)
assert_array_almost_equal(K, K2)
def test_kernel_ridge_sample_weights():
K = np.dot(X, X.T) # precomputed kernel
sw = np.random.RandomState(0).rand(X.shape[0])
pred = Ridge(alpha=1,
fit_intercept=False).fit(X, y, sample_weight=sw).predict(X)
pred2 = KernelRidge(kernel="linear",
alpha=1).fit(X, y, sample_weight=sw).predict(X)
pred3 = KernelRidge(kernel="precomputed",
alpha=1).fit(K, y, sample_weight=sw).predict(K)
assert_array_almost_equal(pred, pred2)
assert_array_almost_equal(pred, pred3)
def test_kernel_ridge_multi_output():
pred = Ridge(alpha=1, fit_intercept=False).fit(X, Y).predict(X)
pred2 = KernelRidge(kernel="linear", alpha=1).fit(X, Y).predict(X)
assert_array_almost_equal(pred, pred2)
pred3 = KernelRidge(kernel="linear", alpha=1).fit(X, y).predict(X)
pred3 = np.array([pred3, pred3]).T
assert_array_almost_equal(pred2, pred3)
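The equivalence these tests rely on — kernel ridge with a linear kernel matches primal ridge without an intercept — can be checked directly with plain NumPy. A minimal sketch with arbitrary synthetic dimensions (20 samples, 5 features), not part of the test file:

```python
import numpy as np

rng = np.random.RandomState(0)
X = rng.randn(20, 5)
y = rng.randn(20)
lam = 1.0

# primal ridge without intercept: w = (X^T X + lam*I)^{-1} X^T y
w = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ y)

# dual (kernel) form with the linear kernel K = X X^T:
# alpha = (K + lam*I)^{-1} y, and the primal weights are w = X^T alpha
alpha = np.linalg.solve(X @ X.T + lam * np.eye(20), y)

pred_primal = X @ w
pred_dual = (X @ X.T) @ alpha
```

The identity `X^T (X X^T + lam*I)^{-1} = (X^T X + lam*I)^{-1} X^T` makes the two prediction vectors agree exactly, which is what `test_kernel_ridge` asserts via scikit-learn.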
| bsd-3-clause |
StratsOn/zipline | zipline/examples/dual_moving_average.py | 5 | 1974 | #!/usr/bin/env python
#
# Copyright 2014 Quantopian, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Dual Moving Average Crossover algorithm.
This algorithm buys apple once its short moving average crosses
its long moving average (indicating upwards momentum) and sells
its shares once the averages cross again (indicating downwards
momentum).
"""
from zipline.api import order_target, record, symbol, history, add_history
def initialize(context):
# Register 2 histories that track daily prices,
# one with a 100 day window and one with a 300 day window
add_history(100, '1d', 'price')
add_history(300, '1d', 'price')
context.i = 0
def handle_data(context, data):
# Skip first 300 days to get full windows
context.i += 1
if context.i < 300:
return
# Compute averages
# history() has to be called with the same params
# from above and returns a pandas dataframe.
short_mavg = history(100, '1d', 'price').mean()
long_mavg = history(300, '1d', 'price').mean()
sym = symbol('AAPL')
# Trading logic
if short_mavg[sym] > long_mavg[sym]:
# order_target orders as many shares as needed to
# achieve the desired number of shares.
order_target(sym, 100)
elif short_mavg[sym] < long_mavg[sym]:
order_target(sym, 0)
# Save values for later inspection
record(AAPL=data[sym].price,
short_mavg=short_mavg[sym],
long_mavg=long_mavg[sym])
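The crossover rule itself is independent of zipline's `history`/`order_target` machinery; restated with plain pandas on a synthetic price path (the window lengths 5/20 and the 100-share target here are illustrative, not the algorithm's 100/300-day parameters):

```python
import numpy as np
import pandas as pd

# synthetic price path: 60 days rising, 60 days falling
prices = pd.Series(np.r_[np.linspace(10, 20, 60), np.linspace(20, 5, 60)])
short_mavg = prices.rolling(5).mean()
long_mavg = prices.rolling(20).mean()
# hold 100 shares while the short average sits above the long one, else flat
target = np.where(short_mavg > long_mavg, 100, 0)
```

During the uptrend the short average leads the long one (long position); once the trend reverses, the averages cross back and the target drops to zero.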
| apache-2.0 |
BiaDarkia/scikit-learn | sklearn/utils/tests/test_shortest_path.py | 303 | 2841 | from collections import defaultdict
import numpy as np
from numpy.testing import assert_array_almost_equal
from sklearn.utils.graph import (graph_shortest_path,
single_source_shortest_path_length)
def floyd_warshall_slow(graph, directed=False):
N = graph.shape[0]
# set zero entries (missing edges) to infinity
graph[np.where(graph == 0)] = np.inf
# set diagonal to zero
graph.flat[::N + 1] = 0
if not directed:
graph = np.minimum(graph, graph.T)
for k in range(N):
for i in range(N):
for j in range(N):
graph[i, j] = min(graph[i, j], graph[i, k] + graph[k, j])
graph[np.where(np.isinf(graph))] = 0
return graph
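The reference implementation can be sanity-checked on a graph small enough to solve by hand. A standalone sketch (the function body is restated here so the example is self-contained):

```python
import numpy as np

def floyd_warshall_ref(graph):
    # zero entries mean "no edge"; replace them with inf, keep diagonal at 0
    g = graph.astype(float).copy()
    N = g.shape[0]
    g[g == 0] = np.inf
    g.flat[::N + 1] = 0
    for k in range(N):
        for i in range(N):
            for j in range(N):
                g[i, j] = min(g[i, j], g[i, k] + g[k, j])
    g[np.isinf(g)] = 0
    return g

# 3-node path graph: 0 --1-- 1 --2-- 2, so dist(0, 2) = 1 + 2 = 3
dist = floyd_warshall_ref(np.array([[0, 1, 0],
                                    [1, 0, 2],
                                    [0, 2, 0]]))
```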
def generate_graph(N=20):
#sparse grid of distances
rng = np.random.RandomState(0)
dist_matrix = rng.random_sample((N, N))
#make symmetric: distances are not direction-dependent
dist_matrix = dist_matrix + dist_matrix.T
#make graph sparse
i = (rng.randint(N, size=N * N // 2), rng.randint(N, size=N * N // 2))
dist_matrix[i] = 0
#set diagonal to zero
dist_matrix.flat[::N + 1] = 0
return dist_matrix
def test_floyd_warshall():
dist_matrix = generate_graph(20)
for directed in (True, False):
graph_FW = graph_shortest_path(dist_matrix, directed, 'FW')
graph_py = floyd_warshall_slow(dist_matrix.copy(), directed)
assert_array_almost_equal(graph_FW, graph_py)
def test_dijkstra():
dist_matrix = generate_graph(20)
for directed in (True, False):
graph_D = graph_shortest_path(dist_matrix, directed, 'D')
graph_py = floyd_warshall_slow(dist_matrix.copy(), directed)
assert_array_almost_equal(graph_D, graph_py)
def test_shortest_path():
dist_matrix = generate_graph(20)
# We compare path length and not costs (-> set distances to 0 or 1)
dist_matrix[dist_matrix != 0] = 1
for directed in (True, False):
if not directed:
dist_matrix = np.minimum(dist_matrix, dist_matrix.T)
graph_py = floyd_warshall_slow(dist_matrix.copy(), directed)
for i in range(dist_matrix.shape[0]):
# Non-reachable nodes have distance 0 in graph_py
dist_dict = defaultdict(int)
dist_dict.update(single_source_shortest_path_length(dist_matrix,
i))
for j in range(graph_py[i].shape[0]):
assert_array_almost_equal(dist_dict[j], graph_py[i, j])
def test_dijkstra_bug_fix():
X = np.array([[0., 0., 4.],
[1., 0., 2.],
[0., 5., 0.]])
dist_FW = graph_shortest_path(X, directed=False, method='FW')
dist_D = graph_shortest_path(X, directed=False, method='D')
assert_array_almost_equal(dist_D, dist_FW)
| bsd-3-clause |
poryfly/scikit-learn | sklearn/utils/tests/test_class_weight.py | 140 | 11909 | import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import make_blobs
from sklearn.utils.class_weight import compute_class_weight
from sklearn.utils.class_weight import compute_sample_weight
from sklearn.utils.testing import assert_array_almost_equal
from sklearn.utils.testing import assert_almost_equal
from sklearn.utils.testing import assert_raises
from sklearn.utils.testing import assert_true
from sklearn.utils.testing import assert_equal
from sklearn.utils.testing import assert_warns
def test_compute_class_weight():
# Test (and demo) compute_class_weight.
y = np.asarray([2, 2, 2, 3, 3, 4])
classes = np.unique(y)
cw = assert_warns(DeprecationWarning,
compute_class_weight, "auto", classes, y)
assert_almost_equal(cw.sum(), classes.shape)
assert_true(cw[0] < cw[1] < cw[2])
cw = compute_class_weight("balanced", classes, y)
# total effect of samples is preserved
class_counts = np.bincount(y)[2:]
assert_almost_equal(np.dot(cw, class_counts), y.shape[0])
assert_true(cw[0] < cw[1] < cw[2])  # rarer classes get larger weights
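The "balanced" heuristic these assertions exercise reduces to `n_samples / (n_classes * class_counts)`, which is easy to verify by hand on the same `y` used above — a quick standalone check, not part of the test module:

```python
import numpy as np

y = np.asarray([2, 2, 2, 3, 3, 4])
classes, counts = np.unique(y, return_counts=True)
# "balanced" weighting: inversely proportional to class frequency
weights = len(y) / (len(classes) * counts.astype(float))
```

For counts `[3, 2, 1]` this gives `[2/3, 1, 2]`, and the weighted class counts sum back to `n_samples`, matching the "total effect of samples is preserved" assertion above.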
def test_compute_class_weight_not_present():
# Raise error when y does not contain all class labels
classes = np.arange(4)
y = np.asarray([0, 0, 0, 1, 1, 2])
assert_raises(ValueError, compute_class_weight, "auto", classes, y)
assert_raises(ValueError, compute_class_weight, "balanced", classes, y)
def test_compute_class_weight_invariance():
# Test that results with class_weight="balanced" are invariant wrt
# class imbalance if the number of samples is identical.
# The test uses a balanced two class dataset with 100 datapoints.
# It creates three versions, one where class 1 is duplicated
# resulting in 150 points of class 1 and 50 of class 0,
# one where there are 50 points in class 1 and 150 in class 0,
# and one where there are 100 points of each class (this one is balanced
# again).
# With balancing class weights, all three should give the same model.
X, y = make_blobs(centers=2, random_state=0)
# create dataset where class 1 is duplicated twice
X_1 = np.vstack([X] + [X[y == 1]] * 2)
y_1 = np.hstack([y] + [y[y == 1]] * 2)
# create dataset where class 0 is duplicated twice
X_0 = np.vstack([X] + [X[y == 0]] * 2)
y_0 = np.hstack([y] + [y[y == 0]] * 2)
# duplicate everything
X_ = np.vstack([X] * 2)
y_ = np.hstack([y] * 2)
# results should be identical
logreg1 = LogisticRegression(class_weight="balanced").fit(X_1, y_1)
logreg0 = LogisticRegression(class_weight="balanced").fit(X_0, y_0)
logreg = LogisticRegression(class_weight="balanced").fit(X_, y_)
assert_array_almost_equal(logreg1.coef_, logreg0.coef_)
assert_array_almost_equal(logreg.coef_, logreg0.coef_)
def test_compute_class_weight_auto_negative():
# Test compute_class_weight when labels are negative
# Test with balanced class labels.
classes = np.array([-2, -1, 0])
y = np.asarray([-1, -1, 0, 0, -2, -2])
cw = assert_warns(DeprecationWarning, compute_class_weight, "auto",
classes, y)
assert_almost_equal(cw.sum(), classes.shape)
assert_equal(len(cw), len(classes))
assert_array_almost_equal(cw, np.array([1., 1., 1.]))
cw = compute_class_weight("balanced", classes, y)
assert_equal(len(cw), len(classes))
assert_array_almost_equal(cw, np.array([1., 1., 1.]))
# Test with unbalanced class labels.
y = np.asarray([-1, 0, 0, -2, -2, -2])
cw = assert_warns(DeprecationWarning, compute_class_weight, "auto",
classes, y)
assert_almost_equal(cw.sum(), classes.shape)
assert_equal(len(cw), len(classes))
assert_array_almost_equal(cw, np.array([0.545, 1.636, 0.818]), decimal=3)
cw = compute_class_weight("balanced", classes, y)
assert_equal(len(cw), len(classes))
class_counts = np.bincount(y + 2)
assert_almost_equal(np.dot(cw, class_counts), y.shape[0])
assert_array_almost_equal(cw, [2. / 3, 2., 1.])
def test_compute_class_weight_auto_unordered():
# Test compute_class_weight when classes are unordered
classes = np.array([1, 0, 3])
y = np.asarray([1, 0, 0, 3, 3, 3])
cw = assert_warns(DeprecationWarning, compute_class_weight, "auto",
classes, y)
assert_almost_equal(cw.sum(), classes.shape)
assert_equal(len(cw), len(classes))
assert_array_almost_equal(cw, np.array([1.636, 0.818, 0.545]), decimal=3)
cw = compute_class_weight("balanced", classes, y)
class_counts = np.bincount(y)[classes]
assert_almost_equal(np.dot(cw, class_counts), y.shape[0])
assert_array_almost_equal(cw, [2., 1., 2. / 3])
def test_compute_sample_weight():
# Test (and demo) compute_sample_weight.
# Test with balanced classes
y = np.asarray([1, 1, 1, 2, 2, 2])
sample_weight = assert_warns(DeprecationWarning,
compute_sample_weight, "auto", y)
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1.])
sample_weight = compute_sample_weight("balanced", y)
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1.])
# Test with user-defined weights
sample_weight = compute_sample_weight({1: 2, 2: 1}, y)
assert_array_almost_equal(sample_weight, [2., 2., 2., 1., 1., 1.])
# Test with column vector of balanced classes
y = np.asarray([[1], [1], [1], [2], [2], [2]])
sample_weight = assert_warns(DeprecationWarning,
compute_sample_weight, "auto", y)
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1.])
sample_weight = compute_sample_weight("balanced", y)
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1.])
# Test with unbalanced classes
y = np.asarray([1, 1, 1, 2, 2, 2, 3])
sample_weight = assert_warns(DeprecationWarning,
compute_sample_weight, "auto", y)
expected_auto = np.asarray([.6, .6, .6, .6, .6, .6, 1.8])
assert_array_almost_equal(sample_weight, expected_auto)
sample_weight = compute_sample_weight("balanced", y)
expected_balanced = np.array([0.7777, 0.7777, 0.7777, 0.7777, 0.7777, 0.7777, 2.3333])
assert_array_almost_equal(sample_weight, expected_balanced, decimal=4)
# Test with `None` weights
sample_weight = compute_sample_weight(None, y)
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1., 1.])
# Test with multi-output of balanced classes
y = np.asarray([[1, 0], [1, 0], [1, 0], [2, 1], [2, 1], [2, 1]])
sample_weight = assert_warns(DeprecationWarning,
compute_sample_weight, "auto", y)
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1.])
sample_weight = compute_sample_weight("balanced", y)
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1.])
# Test with multi-output with user-defined weights
y = np.asarray([[1, 0], [1, 0], [1, 0], [2, 1], [2, 1], [2, 1]])
sample_weight = compute_sample_weight([{1: 2, 2: 1}, {0: 1, 1: 2}], y)
assert_array_almost_equal(sample_weight, [2., 2., 2., 2., 2., 2.])
# Test with multi-output of unbalanced classes
y = np.asarray([[1, 0], [1, 0], [1, 0], [2, 1], [2, 1], [2, 1], [3, -1]])
sample_weight = assert_warns(DeprecationWarning,
compute_sample_weight, "auto", y)
assert_array_almost_equal(sample_weight, expected_auto ** 2)
sample_weight = compute_sample_weight("balanced", y)
assert_array_almost_equal(sample_weight, expected_balanced ** 2, decimal=3)
def test_compute_sample_weight_with_subsample():
# Test compute_sample_weight with subsamples specified.
# Test with balanced classes and all samples present
y = np.asarray([1, 1, 1, 2, 2, 2])
sample_weight = assert_warns(DeprecationWarning,
compute_sample_weight, "auto", y)
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1.])
sample_weight = compute_sample_weight("balanced", y, range(6))
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1.])
# Test with column vector of balanced classes and all samples present
y = np.asarray([[1], [1], [1], [2], [2], [2]])
sample_weight = assert_warns(DeprecationWarning,
compute_sample_weight, "auto", y)
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1.])
sample_weight = compute_sample_weight("balanced", y, range(6))
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1.])
# Test with a subsample
y = np.asarray([1, 1, 1, 2, 2, 2])
sample_weight = assert_warns(DeprecationWarning,
compute_sample_weight, "auto", y, range(4))
assert_array_almost_equal(sample_weight, [.5, .5, .5, 1.5, 1.5, 1.5])
sample_weight = compute_sample_weight("balanced", y, range(4))
assert_array_almost_equal(sample_weight, [2. / 3, 2. / 3,
2. / 3, 2., 2., 2.])
# Test with a bootstrap subsample
y = np.asarray([1, 1, 1, 2, 2, 2])
sample_weight = assert_warns(DeprecationWarning, compute_sample_weight,
"auto", y, [0, 1, 1, 2, 2, 3])
expected_auto = np.asarray([1 / 3., 1 / 3., 1 / 3., 5 / 3., 5 / 3., 5 / 3.])
assert_array_almost_equal(sample_weight, expected_auto)
sample_weight = compute_sample_weight("balanced", y, [0, 1, 1, 2, 2, 3])
expected_balanced = np.asarray([0.6, 0.6, 0.6, 3., 3., 3.])
assert_array_almost_equal(sample_weight, expected_balanced)
# Test with a bootstrap subsample for multi-output
y = np.asarray([[1, 0], [1, 0], [1, 0], [2, 1], [2, 1], [2, 1]])
sample_weight = assert_warns(DeprecationWarning, compute_sample_weight,
"auto", y, [0, 1, 1, 2, 2, 3])
assert_array_almost_equal(sample_weight, expected_auto ** 2)
sample_weight = compute_sample_weight("balanced", y, [0, 1, 1, 2, 2, 3])
assert_array_almost_equal(sample_weight, expected_balanced ** 2)
# Test with a missing class
y = np.asarray([1, 1, 1, 2, 2, 2, 3])
sample_weight = assert_warns(DeprecationWarning, compute_sample_weight,
"auto", y, range(6))
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1., 0.])
sample_weight = compute_sample_weight("balanced", y, range(6))
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1., 0.])
# Test with a missing class for multi-output
y = np.asarray([[1, 0], [1, 0], [1, 0], [2, 1], [2, 1], [2, 1], [2, 2]])
sample_weight = assert_warns(DeprecationWarning, compute_sample_weight,
"auto", y, range(6))
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1., 0.])
sample_weight = compute_sample_weight("balanced", y, range(6))
assert_array_almost_equal(sample_weight, [1., 1., 1., 1., 1., 1., 0.])
def test_compute_sample_weight_errors():
# Test that compute_sample_weight raises the expected errors.
# Invalid preset string
y = np.asarray([1, 1, 1, 2, 2, 2])
y_ = np.asarray([[1, 0], [1, 0], [1, 0], [2, 1], [2, 1], [2, 1]])
assert_raises(ValueError, compute_sample_weight, "ni", y)
assert_raises(ValueError, compute_sample_weight, "ni", y, range(4))
assert_raises(ValueError, compute_sample_weight, "ni", y_)
assert_raises(ValueError, compute_sample_weight, "ni", y_, range(4))
# Not "auto" for subsample
assert_raises(ValueError,
compute_sample_weight, {1: 2, 2: 1}, y, range(4))
# Not a list or preset for multi-output
assert_raises(ValueError, compute_sample_weight, {1: 2, 2: 1}, y_)
# Incorrect length list for multi-output
assert_raises(ValueError, compute_sample_weight, [{1: 2, 2: 1}], y_)
| bsd-3-clause |
UDST/activitysim | activitysim/abm/models/util/test/test_vectorize_tour_scheduling.py | 2 | 2354 | # ActivitySim
# See full license in LICENSE.txt.
import os
import pytest
import pandas as pd
import numpy as np
import pandas.util.testing as pdt
from activitysim.core import inject
from ..vectorize_tour_scheduling import get_previous_tour_by_tourid, \
vectorize_tour_scheduling
def test_vts():
inject.add_injectable("settings", {})
# note: need 0 duration tour on one end of day to guarantee at least one available tour
alts = pd.DataFrame({
"start": [1, 1, 2, 3],
"end": [1, 4, 5, 6]
})
alts['duration'] = alts.end - alts.start
inject.add_injectable("tdd_alts", alts)
current_tour_person_ids = pd.Series(['b', 'c'],
index=['d', 'e'])
previous_tour_by_personid = pd.Series([2, 2, 1],
index=['a', 'b', 'c'])
prev_tour_attrs = get_previous_tour_by_tourid(current_tour_person_ids,
previous_tour_by_personid,
alts)
pdt.assert_series_equal(
prev_tour_attrs.start_previous,
pd.Series([2, 1], index=['d', 'e'], name='start_previous'))
pdt.assert_series_equal(
prev_tour_attrs.end_previous,
pd.Series([5, 4], index=['d', 'e'], name='end_previous'))
tours = pd.DataFrame({
"person_id": [1, 1, 2, 3, 3],
"tour_num": [1, 2, 1, 1, 2],
"tour_type": ['x', 'x', 'x', 'x', 'x']
})
persons = pd.DataFrame({
"income": [20, 30, 25]
}, index=[1, 2, 3])
inject.add_table('persons', persons)
spec = pd.DataFrame({"Coefficient": [1.2]},
index=["income"])
spec.index.name = "Expression"
segment_col = None # no segmentation of model_spec
inject.add_injectable("check_for_variability", True)
tdd_choices, timetable = vectorize_tour_scheduling(
tours, persons, alts, spec, segment_col,
model_settings={},
chunk_size=0, trace_label='test_vts')
# FIXME - dead reckoning regression
# there's no real logic here - this is just what came out of the monte carlo
# note that the result comes out ordered by the nth trips and not ordered
# by the trip index. shrug?
expected = [2, 2, 2, 0, 0]
assert (tdd_choices.tdd.values == expected).all()
| bsd-3-clause |
quheng/scikit-learn | doc/sphinxext/numpy_ext/docscrape_sphinx.py | 408 | 8061 | import re
import inspect
import textwrap
import pydoc
from .docscrape import NumpyDocString
from .docscrape import FunctionDoc
from .docscrape import ClassDoc
class SphinxDocString(NumpyDocString):
def __init__(self, docstring, config=None):
config = {} if config is None else config
self.use_plots = config.get('use_plots', False)
NumpyDocString.__init__(self, docstring, config=config)
# string conversion routines
def _str_header(self, name, symbol='`'):
return ['.. rubric:: ' + name, '']
def _str_field_list(self, name):
return [':' + name + ':']
def _str_indent(self, doc, indent=4):
out = []
for line in doc:
out += [' ' * indent + line]
return out
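`_str_indent` simply pads each line with spaces; the same behavior, restated as a standalone list comprehension (hypothetical helper name, just for illustration):

```python
def str_indent(doc, indent=4):
    # prepend `indent` spaces to every line in a list of strings
    return [" " * indent + line for line in doc]

out = str_indent(["foo", "bar"], indent=2)
```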
def _str_signature(self):
# note: the early return below intentionally disables signature output,
# leaving the remainder of this method unreachable
return ['']
if self['Signature']:
return ['``%s``' % self['Signature']] + ['']
else:
return ['']
def _str_summary(self):
return self['Summary'] + ['']
def _str_extended_summary(self):
return self['Extended Summary'] + ['']
def _str_param_list(self, name):
out = []
if self[name]:
out += self._str_field_list(name)
out += ['']
for param, param_type, desc in self[name]:
out += self._str_indent(['**%s** : %s' % (param.strip(),
param_type)])
out += ['']
out += self._str_indent(desc, 8)
out += ['']
return out
@property
def _obj(self):
if hasattr(self, '_cls'):
return self._cls
elif hasattr(self, '_f'):
return self._f
return None
def _str_member_list(self, name):
"""
Generate a member listing, autosummary:: table where possible,
and a table where not.
"""
out = []
if self[name]:
out += ['.. rubric:: %s' % name, '']
prefix = getattr(self, '_name', '')
if prefix:
prefix = '~%s.' % prefix
autosum = []
others = []
for param, param_type, desc in self[name]:
param = param.strip()
if not self._obj or hasattr(self._obj, param):
autosum += [" %s%s" % (prefix, param)]
else:
others.append((param, param_type, desc))
if autosum:
# GAEL: Toctree commented out below because it creates
# hundreds of sphinx warnings
# out += ['.. autosummary::', ' :toctree:', '']
out += ['.. autosummary::', '']
out += autosum
if others:
maxlen_0 = max([len(x[0]) for x in others])
maxlen_1 = max([len(x[1]) for x in others])
hdr = "=" * maxlen_0 + " " + "=" * maxlen_1 + " " + "=" * 10
fmt = '%%%ds %%%ds ' % (maxlen_0, maxlen_1)
n_indent = maxlen_0 + maxlen_1 + 4
out += [hdr]
for param, param_type, desc in others:
out += [fmt % (param.strip(), param_type)]
out += self._str_indent(desc, n_indent)
out += [hdr]
out += ['']
return out
def _str_section(self, name):
out = []
if self[name]:
out += self._str_header(name)
out += ['']
content = textwrap.dedent("\n".join(self[name])).split("\n")
out += content
out += ['']
return out
def _str_see_also(self, func_role):
out = []
if self['See Also']:
see_also = super(SphinxDocString, self)._str_see_also(func_role)
out = ['.. seealso::', '']
out += self._str_indent(see_also[2:])
return out
def _str_warnings(self):
out = []
if self['Warnings']:
out = ['.. warning::', '']
out += self._str_indent(self['Warnings'])
return out
def _str_index(self):
idx = self['index']
out = []
if len(idx) == 0:
return out
out += ['.. index:: %s' % idx.get('default', '')]
for section, references in idx.items():  # dict.iteritems() is Python 2 only
if section == 'default':
continue
elif section == 'refguide':
out += [' single: %s' % (', '.join(references))]
else:
out += [' %s: %s' % (section, ','.join(references))]
return out
def _str_references(self):
out = []
if self['References']:
out += self._str_header('References')
if isinstance(self['References'], str):
self['References'] = [self['References']]
out.extend(self['References'])
out += ['']
# Latex collects all references to a separate bibliography,
# so we need to insert links to it
import sphinx # local import to avoid test dependency
if sphinx.__version__ >= "0.6":
out += ['.. only:: latex', '']
else:
out += ['.. latexonly::', '']
items = []
for line in self['References']:
m = re.match(r'.. \[([a-z0-9._-]+)\]', line, re.I)
if m:
items.append(m.group(1))
out += [' ' + ", ".join(["[%s]_" % item for item in items]), '']
return out
def _str_examples(self):
examples_str = "\n".join(self['Examples'])
if (self.use_plots and 'import matplotlib' in examples_str
and 'plot::' not in examples_str):
out = []
out += self._str_header('Examples')
out += ['.. plot::', '']
out += self._str_indent(self['Examples'])
out += ['']
return out
else:
return self._str_section('Examples')
def __str__(self, indent=0, func_role="obj"):
out = []
out += self._str_signature()
out += self._str_index() + ['']
out += self._str_summary()
out += self._str_extended_summary()
for param_list in ('Parameters', 'Returns', 'Raises', 'Attributes'):
out += self._str_param_list(param_list)
out += self._str_warnings()
out += self._str_see_also(func_role)
out += self._str_section('Notes')
out += self._str_references()
out += self._str_examples()
for param_list in ('Methods',):
out += self._str_member_list(param_list)
out = self._str_indent(out, indent)
return '\n'.join(out)
class SphinxFunctionDoc(SphinxDocString, FunctionDoc):
def __init__(self, obj, doc=None, config={}):
self.use_plots = config.get('use_plots', False)
FunctionDoc.__init__(self, obj, doc=doc, config=config)
class SphinxClassDoc(SphinxDocString, ClassDoc):
def __init__(self, obj, doc=None, func_doc=None, config={}):
self.use_plots = config.get('use_plots', False)
ClassDoc.__init__(self, obj, doc=doc, func_doc=None, config=config)
class SphinxObjDoc(SphinxDocString):
def __init__(self, obj, doc=None, config=None):
self._f = obj
SphinxDocString.__init__(self, doc, config=config)
def get_doc_object(obj, what=None, doc=None, config={}):
if what is None:
if inspect.isclass(obj):
what = 'class'
elif inspect.ismodule(obj):
what = 'module'
elif callable(obj):
what = 'function'
else:
what = 'object'
if what == 'class':
return SphinxClassDoc(obj, func_doc=SphinxFunctionDoc, doc=doc,
config=config)
elif what in ('function', 'method'):
return SphinxFunctionDoc(obj, doc=doc, config=config)
else:
if doc is None:
doc = pydoc.getdoc(obj)
return SphinxObjDoc(obj, doc, config=config)
| bsd-3-clause |
gautam1858/tensorflow | tensorflow/contrib/learn/python/learn/grid_search_test.py | 137 | 2035 | # Copyright 2016 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Grid search tests."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import os
import random
from tensorflow.contrib.learn.python import learn
from tensorflow.python.platform import test
HAS_SKLEARN = os.environ.get('TENSORFLOW_SKLEARN', False)
if HAS_SKLEARN:
try:
# pylint: disable=g-import-not-at-top
from sklearn import datasets
from sklearn.grid_search import GridSearchCV
from sklearn.metrics import accuracy_score
except ImportError:
HAS_SKLEARN = False
class GridSearchTest(test.TestCase):
"""Grid search tests."""
def testIrisDNN(self):
if HAS_SKLEARN:
random.seed(42)
iris = datasets.load_iris()
feature_columns = learn.infer_real_valued_columns_from_input(iris.data)
classifier = learn.DNNClassifier(
feature_columns=feature_columns,
hidden_units=[10, 20, 10],
n_classes=3)
grid_search = GridSearchCV(
classifier, {'hidden_units': [[5, 5], [10, 10]]},
scoring='accuracy',
fit_params={'steps': [50]})
grid_search.fit(iris.data, iris.target)
score = accuracy_score(iris.target, grid_search.predict(iris.data))
self.assertGreater(score, 0.5, 'Failed with score = {0}'.format(score))
if __name__ == '__main__':
test.main()
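Setting aside the (long-removed) `sklearn.grid_search` module used above, the grid-search pattern itself is just a loop over a parameter grid with cross-validated scoring. A dependency-free sketch on synthetic quadratic data — the fold scheme and scoring here are illustrative assumptions, not what `GridSearchCV` does internally:

```python
import numpy as np

rng = np.random.RandomState(42)
x = np.linspace(-1, 1, 60)
y = 1.0 + 2.0 * x + 0.5 * x ** 2 + 0.05 * rng.randn(60)

def cv_score(degree, n_folds=3):
    # mean negative squared error over interleaved folds (higher is better)
    idx = np.arange(len(x))
    errs = []
    for f in range(n_folds):
        test = idx[f::n_folds]
        train = np.setdiff1d(idx, test)
        coeffs = np.polyfit(x[train], y[train], degree)
        pred = np.polyval(coeffs, x[test])
        errs.append(np.mean((pred - y[test]) ** 2))
    return -np.mean(errs)

grid = [0, 1, 2, 5]
best_degree = max(grid, key=cv_score)
```

The data are quadratic with small noise, so the cross-validated score should clearly prefer degree 2 over the underfitting constants and lines.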
| apache-2.0 |
xpspectre/multiple-myeloma | prep_baseline_clinical_data.py | 1 | 3680 | # Run this after prep_clinical_data.py
import os
import pandas as pd
import numpy as np
# from fancyimpute import KNN, MICE  # optional dependency; uncomment if do_mi below is enabled
data_dir = 'data/processed'
# Load study data - to keep just the CoMMpass patients
study_data = pd.read_csv(os.path.join(data_dir, 'patient_study.csv'))
study_data.set_index('PUBLIC_ID', inplace=True)
# Load demographic data
demo_data = pd.read_csv(os.path.join(data_dir, 'patient_data.csv'))
demo_data.set_index('PUBLIC_ID', inplace=True)
# Load visit data
visit_data = pd.read_csv(os.path.join(data_dir, 'clinical_data.csv'))
# Select just the baseline/screening data
visit_data = visit_data.loc[visit_data['VISIT'] <= 0]
# Combine rows for each patient
# Averaging the rows takes care of missing data/NaNs properly
# unique_ids = data['PUBLIC_ID'].unique()
# print(unique_ids)
visit_data = visit_data.groupby('PUBLIC_ID').mean()
visit_data.drop(['VISIT', 'VISITDY'], axis=1, inplace=True)
# Combine demographic and visit data
data = demo_data
data = data.join(visit_data)
# Only keep CoMMpass patients
data = data[study_data['STUDY_ID'] == 1]
# Drop cols that represent change over last visit
data.drop(['AT_INCREASEOF25F', 'AT_SERUMMCOMPONE', 'AT_URINEMCOMPONE', 'AT_ONLYINPATIENT', 'AT_ONLYINPATIENT2', 'AT_DEVELOPMENTOF'], axis=1, inplace=True)
# Keep only the cols that have >= threshold % entries present
# Set this to determine how much we have to consider missing data
# A smallish amount of missing data should be pretty straightforward imputation
# More missing data is harder
# Which cols can be used for regression of missing data? (if our method requires that)
keep_thres = 0.5
cols = list(data)
present = []
N, N_cols = data.shape
for col in cols:
n = pd.notnull(data[col]).sum()
present.append(float(n)/N)
present = np.array(present)
drop_cols = np.where(present < keep_thres)[0]
data.drop(data.columns[drop_cols], axis=1, inplace=True)
print('Dropped {n}/{N} cols that had less than {x} frac of values'.format(n=drop_cols.size, N=N_cols, x=keep_thres))
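The per-column loop above is equivalent to a vectorized pandas filter on the fraction of non-null entries. A small sketch with toy data (column names are hypothetical):

```python
import pandas as pd

df = pd.DataFrame({"a": [1.0, 2.0, 3.0, 4.0],
                   "b": [1.0, None, None, None],
                   "c": [1.0, 2.0, None, 4.0]})
keep_thres = 0.5
# keep columns whose fraction of present values meets the threshold
kept = df.loc[:, df.notnull().mean() >= keep_thres]
```

`df.dropna(axis=1, thresh=int(keep_thres * len(df)))` expresses roughly the same filter via a minimum count of non-NA values rather than a fraction.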
# Load endpoints and join/split with data
endp_data = pd.read_csv(os.path.join(data_dir, 'patient_endp.csv'))
endp_data.set_index('PUBLIC_ID', inplace=True)
endp_cols = list(endp_data)
data_ = data
data_ = data_.join(endp_data)
data = data_.drop(endp_cols, axis=1)
endp_data = data_[endp_cols]
# Save combined baseline patient data
data.to_csv(os.path.join(data_dir, 'baseline_clinical_data.csv'))
endp_data.to_csv(os.path.join(data_dir, 'baseline_clinical_endp.csv'))
# Impute missing data
# If all the cols are allowed to be treated as numeric vals, then this is OK as is
# Otherwise, if some cols still need to be categorical/indicator, then threshold and convert
# Not sure if these funs below are supposed to return multiple datasets?
# May want to recombine categorical cols into 1 col, then multinomial or softmax logistic regression on them in MI,
# then resplit
do_mi = False
if do_mi:
cols = list(data)
inds = data.index.values
X = data.as_matrix()
X_filled_knn = KNN(k=3).complete(X)
data_filled_knn = pd.DataFrame(data=X_filled_knn, columns=cols)
data_filled_knn.insert(0, 'PUBLIC_ID', inds)
data_filled_knn.set_index('PUBLIC_ID', inplace=True)
X_filled_mice = MICE().complete(X)
data_filled_mice = pd.DataFrame(data=X_filled_mice, columns=cols)
data_filled_mice.insert(0, 'PUBLIC_ID', inds)
data_filled_mice.set_index('PUBLIC_ID', inplace=True)
# Save imputed data ready for standard analysis
data_filled_knn.to_csv(os.path.join(data_dir, 'baseline_clinical_data_imputed_knn.csv'))
data_filled_mice.to_csv(os.path.join(data_dir, 'baseline_clinical_data_imputed_mice.csv'))
| mit |
MJuddBooth/pandas | pandas/tests/indexes/multi/test_names.py | 2 | 3942 | # -*- coding: utf-8 -*-
import pytest
import pandas as pd
from pandas import MultiIndex
import pandas.util.testing as tm
def check_level_names(index, names):
assert [level.name for level in index.levels] == list(names)
def test_slice_keep_name():
x = MultiIndex.from_tuples([('a', 'b'), (1, 2), ('c', 'd')],
names=['x', 'y'])
assert x[1:].names == x.names
def test_index_name_retained():
# GH9857
result = pd.DataFrame({'x': [1, 2, 6],
'y': [2, 2, 8],
'z': [-5, 0, 5]})
result = result.set_index('z')
result.loc[10] = [9, 10]
df_expected = pd.DataFrame({'x': [1, 2, 6, 9],
'y': [2, 2, 8, 10],
'z': [-5, 0, 5, 10]})
df_expected = df_expected.set_index('z')
tm.assert_frame_equal(result, df_expected)
def test_changing_names(idx):
# names should be applied to levels
level_names = [level.name for level in idx.levels]
check_level_names(idx, idx.names)
view = idx.view()
copy = idx.copy()
shallow_copy = idx._shallow_copy()
# changing names should change level names on object
new_names = [name + "a" for name in idx.names]
idx.names = new_names
check_level_names(idx, new_names)
# but not on copies
check_level_names(view, level_names)
check_level_names(copy, level_names)
check_level_names(shallow_copy, level_names)
# and copies shouldn't change original
shallow_copy.names = [name + "c" for name in shallow_copy.names]
check_level_names(idx, new_names)
def test_take_preserve_name(idx):
taken = idx.take([3, 0, 1])
assert taken.names == idx.names
def test_copy_names():
# Check that adding a "names" parameter to the copy is honored
# GH14302
multi_idx = pd.Index([(1, 2), (3, 4)], names=['MyName1', 'MyName2'])
multi_idx1 = multi_idx.copy()
assert multi_idx.equals(multi_idx1)
assert multi_idx.names == ['MyName1', 'MyName2']
assert multi_idx1.names == ['MyName1', 'MyName2']
multi_idx2 = multi_idx.copy(names=['NewName1', 'NewName2'])
assert multi_idx.equals(multi_idx2)
assert multi_idx.names == ['MyName1', 'MyName2']
assert multi_idx2.names == ['NewName1', 'NewName2']
multi_idx3 = multi_idx.copy(name=['NewName1', 'NewName2'])
assert multi_idx.equals(multi_idx3)
assert multi_idx.names == ['MyName1', 'MyName2']
assert multi_idx3.names == ['NewName1', 'NewName2']
def test_names(idx, index_names):
# names are assigned in setup
names = index_names
level_names = [level.name for level in idx.levels]
assert names == level_names
# setting bad names on existing
index = idx
with pytest.raises(ValueError, match="^Length of names"):
setattr(index, "names", list(index.names) + ["third"])
with pytest.raises(ValueError, match="^Length of names"):
setattr(index, "names", [])
# initializing with bad names (should always be equivalent)
major_axis, minor_axis = idx.levels
major_codes, minor_codes = idx.codes
with pytest.raises(ValueError, match="^Length of names"):
MultiIndex(levels=[major_axis, minor_axis],
codes=[major_codes, minor_codes],
names=['first'])
with pytest.raises(ValueError, match="^Length of names"):
MultiIndex(levels=[major_axis, minor_axis],
codes=[major_codes, minor_codes],
names=['first', 'second', 'third'])
# names are assigned
index.names = ["a", "b"]
ind_names = list(index.names)
level_names = [level.name for level in index.levels]
assert ind_names == level_names
def test_duplicate_level_names_access_raises(idx):
# GH19029
idx.names = ['foo', 'foo']
with pytest.raises(ValueError, match='name foo occurs multiple times'):
idx._get_level_number('foo')
| bsd-3-clause |
ElDeveloper/scikit-learn | benchmarks/bench_plot_approximate_neighbors.py | 244 | 6011 | """
Benchmark for approximate nearest neighbor search using a
locality-sensitive hashing (LSH) forest.
There are two types of benchmarks.
First, the accuracy of LSHForest queries is measured for various
hyper-parameters and index sizes.
Second, the speed up of LSHForest queries compared to the brute-force
method of exact nearest neighbors is measured for the
aforementioned settings. In general, the speed up increases as
the index size grows.
"""
from __future__ import division
import numpy as np
from tempfile import gettempdir
from time import time
from sklearn.neighbors import NearestNeighbors
from sklearn.neighbors.approximate import LSHForest
from sklearn.datasets import make_blobs
from sklearn.externals.joblib import Memory
m = Memory(cachedir=gettempdir())
@m.cache()
def make_data(n_samples, n_features, n_queries, random_state=0):
"""Create index and query data."""
print('Generating random blob-ish data')
X, _ = make_blobs(n_samples=n_samples + n_queries,
n_features=n_features, centers=100,
shuffle=True, random_state=random_state)
    # Keep the last samples as held-out query vectors: note that since we
    # used shuffle=True, index and query vectors are sampled from the same
    # distribution (a mixture of 100 Gaussians in this case)
return X[:n_samples], X[n_samples:]
def calc_exact_neighbors(X, queries, n_queries, n_neighbors):
"""Measures average times for exact neighbor queries."""
    print('Building NearestNeighbors for %d samples in %d dimensions' %
          (X.shape[0], X.shape[1]))
nbrs = NearestNeighbors(algorithm='brute', metric='cosine').fit(X)
average_time = 0
t0 = time()
neighbors = nbrs.kneighbors(queries, n_neighbors=n_neighbors,
return_distance=False)
average_time = (time() - t0) / n_queries
return neighbors, average_time
def calc_accuracy(X, queries, n_queries, n_neighbors, exact_neighbors,
average_time_exact, **lshf_params):
"""Calculates accuracy and the speed up of LSHForest."""
print('Building LSHForest for %d samples in %d dimensions' %
(X.shape[0], X.shape[1]))
lshf = LSHForest(**lshf_params)
t0 = time()
lshf.fit(X)
lshf_build_time = time() - t0
print('Done in %0.3fs' % lshf_build_time)
accuracy = 0
t0 = time()
approx_neighbors = lshf.kneighbors(queries, n_neighbors=n_neighbors,
return_distance=False)
average_time_approx = (time() - t0) / n_queries
for i in range(len(queries)):
accuracy += np.in1d(approx_neighbors[i], exact_neighbors[i]).mean()
accuracy /= n_queries
speed_up = average_time_exact / average_time_approx
print('Average time for lshf neighbor queries: %0.3fs' %
average_time_approx)
    print('Average time for exact neighbor queries: %0.3fs' %
          average_time_exact)
    print('Average accuracy: %0.2f' % accuracy)
    print('Speed up: %0.1fx' % speed_up)
return speed_up, accuracy
if __name__ == '__main__':
import matplotlib.pyplot as plt
# Initialize index sizes
n_samples = [int(1e3), int(1e4), int(1e5), int(1e6)]
n_features = int(1e2)
n_queries = 100
n_neighbors = 10
X_index, X_query = make_data(np.max(n_samples), n_features, n_queries,
random_state=0)
params_list = [{'n_estimators': 3, 'n_candidates': 50},
{'n_estimators': 5, 'n_candidates': 70},
{'n_estimators': 10, 'n_candidates': 100}]
accuracies = np.zeros((len(n_samples), len(params_list)), dtype=float)
speed_ups = np.zeros((len(n_samples), len(params_list)), dtype=float)
for i, sample_size in enumerate(n_samples):
        print('==========================================================')
        print('Sample size: %i' % sample_size)
        print('------------------------')
exact_neighbors, average_time_exact = calc_exact_neighbors(
X_index[:sample_size], X_query, n_queries, n_neighbors)
for j, params in enumerate(params_list):
            print('LSHF parameters: n_estimators = %i, n_candidates = %i' %
                  (params['n_estimators'], params['n_candidates']))
speed_ups[i, j], accuracies[i, j] = calc_accuracy(
X_index[:sample_size], X_query, n_queries, n_neighbors,
exact_neighbors, average_time_exact, random_state=0, **params)
        print('')
    print('==========================================================')
# Set labels for LSHForest parameters
colors = ['c', 'm', 'y']
legend_rects = [plt.Rectangle((0, 0), 0.1, 0.1, fc=color)
for color in colors]
legend_labels = ['n_estimators={n_estimators}, '
'n_candidates={n_candidates}'.format(**p)
for p in params_list]
# Plot precision
plt.figure()
plt.legend(legend_rects, legend_labels,
loc='upper left')
for i in range(len(params_list)):
plt.scatter(n_samples, accuracies[:, i], c=colors[i])
plt.plot(n_samples, accuracies[:, i], c=colors[i])
plt.ylim([0, 1.3])
plt.xlim(np.min(n_samples), np.max(n_samples))
plt.semilogx()
plt.ylabel("Precision@10")
plt.xlabel("Index size")
plt.grid(which='both')
plt.title("Precision of first 10 neighbors with index size")
# Plot speed up
plt.figure()
plt.legend(legend_rects, legend_labels,
loc='upper left')
for i in range(len(params_list)):
plt.scatter(n_samples, speed_ups[:, i], c=colors[i])
plt.plot(n_samples, speed_ups[:, i], c=colors[i])
plt.ylim(0, np.max(speed_ups))
plt.xlim(np.min(n_samples), np.max(n_samples))
plt.semilogx()
plt.ylabel("Speed up")
plt.xlabel("Index size")
plt.grid(which='both')
plt.title("Relationship between Speed up and index size")
plt.show()
| bsd-3-clause |
fspaolo/scikit-learn | examples/ensemble/plot_gradient_boosting_quantile.py | 14 | 2087 | """
=====================================================
Prediction Intervals for Gradient Boosting Regression
=====================================================
This example shows how quantile regression can be used
to create prediction intervals.
"""
import numpy as np
import pylab as pl
from sklearn.ensemble import GradientBoostingRegressor
np.random.seed(1)
def f(x):
"""The function to predict."""
return x * np.sin(x)
#----------------------------------------------------------------------
# First the noiseless case
X = np.atleast_2d(np.random.uniform(0, 10.0, size=100)).T
X = X.astype(np.float32)
# Observations
y = f(X).ravel()
dy = 1.5 + 1.0 * np.random.random(y.shape)
noise = np.random.normal(0, dy)
y += noise
y = y.astype(np.float32)
# Mesh the input space for evaluations of the real function, the prediction and
# its MSE
xx = np.atleast_2d(np.linspace(0, 10, 1000)).T
xx = xx.astype(np.float32)
alpha = 0.95
clf = GradientBoostingRegressor(loss='quantile', alpha=alpha,
n_estimators=250, max_depth=3,
learning_rate=.1, min_samples_leaf=9,
min_samples_split=9)
clf.fit(X, y)
# Make the prediction on the meshed x-axis
y_upper = clf.predict(xx)
clf.set_params(alpha=1.0 - alpha)
clf.fit(X, y)
# Make the prediction on the meshed x-axis
y_lower = clf.predict(xx)
clf.set_params(loss='ls')
clf.fit(X, y)
# Make the prediction on the meshed x-axis
y_pred = clf.predict(xx)
# Plot the function, the prediction and the 95% confidence interval based on
# the MSE
fig = pl.figure()
pl.plot(xx, f(xx), 'g:', label=r'$f(x) = x\,\sin(x)$')
pl.plot(X, y, 'b.', markersize=10, label=u'Observations')
pl.plot(xx, y_pred, 'r-', label=u'Prediction')
pl.plot(xx, y_upper, 'k-')
pl.plot(xx, y_lower, 'k-')
pl.fill(np.concatenate([xx, xx[::-1]]),
np.concatenate([y_upper, y_lower[::-1]]),
alpha=.5, fc='b', ec='None', label='95% prediction interval')
pl.xlabel('$x$')
pl.ylabel('$f(x)$')
pl.ylim(-10, 20)
pl.legend(loc='upper left')
pl.show()
| bsd-3-clause |
bnaul/scikit-learn | examples/linear_model/plot_sparse_logistic_regression_20newsgroups.py | 18 | 4240 | """
=====================================================
Multiclass sparse logistic regression on 20newsgroups
=====================================================
Comparison of multinomial logistic L1 vs one-versus-rest L1 logistic regression
to classify documents from the 20newsgroups dataset. Multinomial logistic
regression yields more accurate results and is faster to train on the larger
scale dataset.
Here we use the l1 penalty, which trims the weights of uninformative
features to zero. This is good if the goal is to extract the strongly
discriminative vocabulary of each class. If the goal is to get the best
predictive accuracy, it is better to use the non sparsity-inducing l2 penalty
instead.
A more traditional (and possibly better) way to predict on a sparse subset of
input features would be to use univariate feature selection followed by a
traditional (l2-penalised) logistic regression model.
"""
import timeit
import warnings
import matplotlib.pyplot as plt
import numpy as np
from sklearn.datasets import fetch_20newsgroups_vectorized
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.exceptions import ConvergenceWarning
print(__doc__)
# Author: Arthur Mensch
warnings.filterwarnings("ignore", category=ConvergenceWarning,
module="sklearn")
t0 = timeit.default_timer()
# We use SAGA solver
solver = 'saga'
# Turn down for faster run time
n_samples = 10000
X, y = fetch_20newsgroups_vectorized(subset='all', return_X_y=True)
X = X[:n_samples]
y = y[:n_samples]
X_train, X_test, y_train, y_test = train_test_split(X, y,
random_state=42,
stratify=y,
test_size=0.1)
train_samples, n_features = X_train.shape
n_classes = np.unique(y).shape[0]
print('Dataset 20newsgroup, train_samples=%i, n_features=%i, n_classes=%i'
% (train_samples, n_features, n_classes))
models = {'ovr': {'name': 'One versus Rest', 'iters': [1, 2, 4]},
'multinomial': {'name': 'Multinomial', 'iters': [1, 3, 7]}}
for model in models:
    # Add initial chance-level values for plotting purposes
accuracies = [1 / n_classes]
times = [0]
densities = [1]
model_params = models[model]
# Small number of epochs for fast runtime
for this_max_iter in model_params['iters']:
print('[model=%s, solver=%s] Number of epochs: %s' %
(model_params['name'], solver, this_max_iter))
lr = LogisticRegression(solver=solver,
multi_class=model,
penalty='l1',
max_iter=this_max_iter,
random_state=42,
)
t1 = timeit.default_timer()
lr.fit(X_train, y_train)
train_time = timeit.default_timer() - t1
y_pred = lr.predict(X_test)
accuracy = np.sum(y_pred == y_test) / y_test.shape[0]
density = np.mean(lr.coef_ != 0, axis=1) * 100
accuracies.append(accuracy)
densities.append(density)
times.append(train_time)
models[model]['times'] = times
models[model]['densities'] = densities
models[model]['accuracies'] = accuracies
print('Test accuracy for model %s: %.4f' % (model, accuracies[-1]))
print('%% non-zero coefficients for model %s, '
'per class:\n %s' % (model, densities[-1]))
print('Run time (%i epochs) for model %s:'
'%.2f' % (model_params['iters'][-1], model, times[-1]))
fig = plt.figure()
ax = fig.add_subplot(111)
for model in models:
name = models[model]['name']
times = models[model]['times']
accuracies = models[model]['accuracies']
ax.plot(times, accuracies, marker='o',
label='Model: %s' % name)
ax.set_xlabel('Train time (s)')
ax.set_ylabel('Test accuracy')
ax.legend()
fig.suptitle('Multinomial vs One-vs-Rest Logistic L1\n'
'Dataset %s' % '20newsgroups')
fig.tight_layout()
fig.subplots_adjust(top=0.85)
run_time = timeit.default_timer() - t0
print('Example run in %.3f s' % run_time)
plt.show()
| bsd-3-clause |
btabibian/scikit-learn | sklearn/manifold/tests/test_spectral_embedding.py | 7 | 11096 | import numpy as np
from numpy.testing import assert_array_almost_equal
from numpy.testing import assert_array_equal
from scipy import sparse
from scipy.linalg import eigh
from sklearn.manifold.spectral_embedding_ import SpectralEmbedding
from sklearn.manifold.spectral_embedding_ import _graph_is_connected
from sklearn.manifold.spectral_embedding_ import _graph_connected_component
from sklearn.manifold import spectral_embedding
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.metrics import normalized_mutual_info_score
from sklearn.cluster import KMeans
from sklearn.datasets.samples_generator import make_blobs
from sklearn.utils.extmath import _deterministic_vector_sign_flip
from sklearn.utils.testing import assert_true, assert_equal, assert_raises
from sklearn.utils.testing import SkipTest
# non centered, sparse centers to check the
centers = np.array([
[0.0, 5.0, 0.0, 0.0, 0.0],
[0.0, 0.0, 4.0, 0.0, 0.0],
[1.0, 0.0, 0.0, 5.0, 1.0],
])
n_samples = 1000
n_clusters, n_features = centers.shape
S, true_labels = make_blobs(n_samples=n_samples, centers=centers,
cluster_std=1., random_state=42)
def _check_with_col_sign_flipping(A, B, tol=0.0):
""" Check array A and B are equal with possible sign flipping on
each columns"""
sign = True
for column_idx in range(A.shape[1]):
sign = sign and ((((A[:, column_idx] -
B[:, column_idx]) ** 2).mean() <= tol ** 2) or
(((A[:, column_idx] +
B[:, column_idx]) ** 2).mean() <= tol ** 2))
if not sign:
return False
return True
def test_sparse_graph_connected_component():
rng = np.random.RandomState(42)
n_samples = 300
boundaries = [0, 42, 121, 200, n_samples]
p = rng.permutation(n_samples)
connections = []
for start, stop in zip(boundaries[:-1], boundaries[1:]):
group = p[start:stop]
# Connect all elements within the group at least once via an
# arbitrary path that spans the group.
for i in range(len(group) - 1):
connections.append((group[i], group[i + 1]))
# Add some more random connections within the group
min_idx, max_idx = 0, len(group) - 1
n_random_connections = 1000
source = rng.randint(min_idx, max_idx, size=n_random_connections)
target = rng.randint(min_idx, max_idx, size=n_random_connections)
connections.extend(zip(group[source], group[target]))
# Build a symmetric affinity matrix
row_idx, column_idx = tuple(np.array(connections).T)
data = rng.uniform(.1, 42, size=len(connections))
affinity = sparse.coo_matrix((data, (row_idx, column_idx)))
affinity = 0.5 * (affinity + affinity.T)
for start, stop in zip(boundaries[:-1], boundaries[1:]):
component_1 = _graph_connected_component(affinity, p[start])
component_size = stop - start
assert_equal(component_1.sum(), component_size)
# We should retrieve the same component mask by starting by both ends
# of the group
component_2 = _graph_connected_component(affinity, p[stop - 1])
assert_equal(component_2.sum(), component_size)
assert_array_equal(component_1, component_2)
def test_spectral_embedding_two_components(seed=36):
# Test spectral embedding with two components
random_state = np.random.RandomState(seed)
n_sample = 100
affinity = np.zeros(shape=[n_sample * 2, n_sample * 2])
# first component
affinity[0:n_sample,
0:n_sample] = np.abs(random_state.randn(n_sample, n_sample)) + 2
# second component
affinity[n_sample::,
n_sample::] = np.abs(random_state.randn(n_sample, n_sample)) + 2
# Test of internal _graph_connected_component before connection
component = _graph_connected_component(affinity, 0)
assert_true(component[:n_sample].all())
assert_true(not component[n_sample:].any())
component = _graph_connected_component(affinity, -1)
assert_true(not component[:n_sample].any())
assert_true(component[n_sample:].all())
# connection
affinity[0, n_sample + 1] = 1
affinity[n_sample + 1, 0] = 1
affinity.flat[::2 * n_sample + 1] = 0
affinity = 0.5 * (affinity + affinity.T)
true_label = np.zeros(shape=2 * n_sample)
true_label[0:n_sample] = 1
se_precomp = SpectralEmbedding(n_components=1, affinity="precomputed",
random_state=np.random.RandomState(seed))
embedded_coordinate = se_precomp.fit_transform(affinity)
# Some numpy versions are touchy with types
embedded_coordinate = \
se_precomp.fit_transform(affinity.astype(np.float32))
# thresholding on the first components using 0.
label_ = np.array(embedded_coordinate.ravel() < 0, dtype="float")
assert_equal(normalized_mutual_info_score(true_label, label_), 1.0)
def test_spectral_embedding_precomputed_affinity(seed=36):
# Test spectral embedding with precomputed kernel
gamma = 1.0
se_precomp = SpectralEmbedding(n_components=2, affinity="precomputed",
random_state=np.random.RandomState(seed))
se_rbf = SpectralEmbedding(n_components=2, affinity="rbf",
gamma=gamma,
random_state=np.random.RandomState(seed))
embed_precomp = se_precomp.fit_transform(rbf_kernel(S, gamma=gamma))
embed_rbf = se_rbf.fit_transform(S)
assert_array_almost_equal(
se_precomp.affinity_matrix_, se_rbf.affinity_matrix_)
assert_true(_check_with_col_sign_flipping(embed_precomp, embed_rbf, 0.05))
def test_spectral_embedding_callable_affinity(seed=36):
# Test spectral embedding with callable affinity
gamma = 0.9
kern = rbf_kernel(S, gamma=gamma)
se_callable = SpectralEmbedding(n_components=2,
affinity=(
lambda x: rbf_kernel(x, gamma=gamma)),
gamma=gamma,
random_state=np.random.RandomState(seed))
se_rbf = SpectralEmbedding(n_components=2, affinity="rbf",
gamma=gamma,
random_state=np.random.RandomState(seed))
embed_rbf = se_rbf.fit_transform(S)
embed_callable = se_callable.fit_transform(S)
assert_array_almost_equal(
se_callable.affinity_matrix_, se_rbf.affinity_matrix_)
assert_array_almost_equal(kern, se_rbf.affinity_matrix_)
assert_true(
_check_with_col_sign_flipping(embed_rbf, embed_callable, 0.05))
def test_spectral_embedding_amg_solver(seed=36):
# Test spectral embedding with amg solver
try:
from pyamg import smoothed_aggregation_solver
except ImportError:
raise SkipTest("pyamg not available.")
se_amg = SpectralEmbedding(n_components=2, affinity="nearest_neighbors",
eigen_solver="amg", n_neighbors=5,
random_state=np.random.RandomState(seed))
se_arpack = SpectralEmbedding(n_components=2, affinity="nearest_neighbors",
eigen_solver="arpack", n_neighbors=5,
random_state=np.random.RandomState(seed))
embed_amg = se_amg.fit_transform(S)
embed_arpack = se_arpack.fit_transform(S)
assert_true(_check_with_col_sign_flipping(embed_amg, embed_arpack, 0.05))
def test_pipeline_spectral_clustering(seed=36):
# Test using pipeline to do spectral clustering
random_state = np.random.RandomState(seed)
se_rbf = SpectralEmbedding(n_components=n_clusters,
affinity="rbf",
random_state=random_state)
se_knn = SpectralEmbedding(n_components=n_clusters,
affinity="nearest_neighbors",
n_neighbors=5,
random_state=random_state)
for se in [se_rbf, se_knn]:
km = KMeans(n_clusters=n_clusters, random_state=random_state)
km.fit(se.fit_transform(S))
assert_array_almost_equal(
normalized_mutual_info_score(
km.labels_,
true_labels), 1.0, 2)
def test_spectral_embedding_unknown_eigensolver(seed=36):
# Test that SpectralClustering fails with an unknown eigensolver
se = SpectralEmbedding(n_components=1, affinity="precomputed",
random_state=np.random.RandomState(seed),
eigen_solver="<unknown>")
assert_raises(ValueError, se.fit, S)
def test_spectral_embedding_unknown_affinity(seed=36):
# Test that SpectralClustering fails with an unknown affinity type
se = SpectralEmbedding(n_components=1, affinity="<unknown>",
random_state=np.random.RandomState(seed))
assert_raises(ValueError, se.fit, S)
def test_connectivity(seed=36):
# Test that graph connectivity test works as expected
graph = np.array([[1, 0, 0, 0, 0],
[0, 1, 1, 0, 0],
[0, 1, 1, 1, 0],
[0, 0, 1, 1, 1],
[0, 0, 0, 1, 1]])
assert_equal(_graph_is_connected(graph), False)
assert_equal(_graph_is_connected(sparse.csr_matrix(graph)), False)
assert_equal(_graph_is_connected(sparse.csc_matrix(graph)), False)
graph = np.array([[1, 1, 0, 0, 0],
[1, 1, 1, 0, 0],
[0, 1, 1, 1, 0],
[0, 0, 1, 1, 1],
[0, 0, 0, 1, 1]])
assert_equal(_graph_is_connected(graph), True)
assert_equal(_graph_is_connected(sparse.csr_matrix(graph)), True)
assert_equal(_graph_is_connected(sparse.csc_matrix(graph)), True)
def test_spectral_embedding_deterministic():
# Test that Spectral Embedding is deterministic
random_state = np.random.RandomState(36)
data = random_state.randn(10, 30)
sims = rbf_kernel(data)
embedding_1 = spectral_embedding(sims)
embedding_2 = spectral_embedding(sims)
assert_array_almost_equal(embedding_1, embedding_2)
def test_spectral_embedding_unnormalized():
# Test that spectral_embedding is also processing unnormalized laplacian
# correctly
random_state = np.random.RandomState(36)
data = random_state.randn(10, 30)
sims = rbf_kernel(data)
n_components = 8
embedding_1 = spectral_embedding(sims,
norm_laplacian=False,
n_components=n_components,
drop_first=False)
# Verify using manual computation with dense eigh
laplacian, dd = sparse.csgraph.laplacian(sims, normed=False,
return_diag=True)
_, diffusion_map = eigh(laplacian)
embedding_2 = diffusion_map.T[:n_components] * dd
embedding_2 = _deterministic_vector_sign_flip(embedding_2).T
assert_array_almost_equal(embedding_1, embedding_2)
| bsd-3-clause |
petosegan/scikit-learn | sklearn/tree/tree.py | 113 | 34767 | """
This module gathers tree-based methods, including decision, regression and
randomized trees. Single and multi-output problems are both handled.
"""
# Authors: Gilles Louppe <g.louppe@gmail.com>
# Peter Prettenhofer <peter.prettenhofer@gmail.com>
# Brian Holt <bdholt1@gmail.com>
# Noel Dawe <noel@dawe.me>
# Satrajit Gosh <satrajit.ghosh@gmail.com>
# Joly Arnaud <arnaud.v.joly@gmail.com>
# Fares Hedayati <fares.hedayati@gmail.com>
#
# Licence: BSD 3 clause
from __future__ import division
import numbers
from abc import ABCMeta, abstractmethod
import numpy as np
from scipy.sparse import issparse
from ..base import BaseEstimator, ClassifierMixin, RegressorMixin
from ..externals import six
from ..feature_selection.from_model import _LearntSelectorMixin
from ..utils import check_array, check_random_state, compute_sample_weight
from ..utils.validation import NotFittedError
from ._tree import Criterion
from ._tree import Splitter
from ._tree import DepthFirstTreeBuilder, BestFirstTreeBuilder
from ._tree import Tree
from . import _tree
__all__ = ["DecisionTreeClassifier",
"DecisionTreeRegressor",
"ExtraTreeClassifier",
"ExtraTreeRegressor"]
# =============================================================================
# Types and constants
# =============================================================================
DTYPE = _tree.DTYPE
DOUBLE = _tree.DOUBLE
CRITERIA_CLF = {"gini": _tree.Gini, "entropy": _tree.Entropy}
CRITERIA_REG = {"mse": _tree.MSE, "friedman_mse": _tree.FriedmanMSE}
DENSE_SPLITTERS = {"best": _tree.BestSplitter,
"presort-best": _tree.PresortBestSplitter,
"random": _tree.RandomSplitter}
SPARSE_SPLITTERS = {"best": _tree.BestSparseSplitter,
"random": _tree.RandomSparseSplitter}
# =============================================================================
# Base decision tree
# =============================================================================
class BaseDecisionTree(six.with_metaclass(ABCMeta, BaseEstimator,
_LearntSelectorMixin)):
"""Base class for decision trees.
Warning: This class should not be used directly.
Use derived classes instead.
"""
@abstractmethod
def __init__(self,
criterion,
splitter,
max_depth,
min_samples_split,
min_samples_leaf,
min_weight_fraction_leaf,
max_features,
max_leaf_nodes,
random_state,
class_weight=None):
self.criterion = criterion
self.splitter = splitter
self.max_depth = max_depth
self.min_samples_split = min_samples_split
self.min_samples_leaf = min_samples_leaf
self.min_weight_fraction_leaf = min_weight_fraction_leaf
self.max_features = max_features
self.random_state = random_state
self.max_leaf_nodes = max_leaf_nodes
self.class_weight = class_weight
self.n_features_ = None
self.n_outputs_ = None
self.classes_ = None
self.n_classes_ = None
self.tree_ = None
self.max_features_ = None
def fit(self, X, y, sample_weight=None, check_input=True):
"""Build a decision tree from the training set (X, y).
Parameters
----------
X : array-like or sparse matrix, shape = [n_samples, n_features]
The training input samples. Internally, it will be converted to
``dtype=np.float32`` and if a sparse matrix is provided
to a sparse ``csc_matrix``.
y : array-like, shape = [n_samples] or [n_samples, n_outputs]
The target values (class labels in classification, real numbers in
regression). In the regression case, use ``dtype=np.float64`` and
``order='C'`` for maximum efficiency.
sample_weight : array-like, shape = [n_samples] or None
Sample weights. If None, then samples are equally weighted. Splits
that would create child nodes with net zero or negative weight are
ignored while searching for a split in each node. In the case of
classification, splits are also ignored if they would result in any
single class carrying a negative weight in either child node.
check_input : boolean, (default=True)
Allow to bypass several input checking.
Don't use this parameter unless you know what you do.
Returns
-------
self : object
Returns self.
"""
random_state = check_random_state(self.random_state)
if check_input:
X = check_array(X, dtype=DTYPE, accept_sparse="csc")
if issparse(X):
X.sort_indices()
if X.indices.dtype != np.intc or X.indptr.dtype != np.intc:
raise ValueError("No support for np.int64 index based "
"sparse matrices")
# Determine output settings
n_samples, self.n_features_ = X.shape
is_classification = isinstance(self, ClassifierMixin)
y = np.atleast_1d(y)
expanded_class_weight = None
if y.ndim == 1:
            # reshape is necessary to preserve the data contiguity,
            # which [:, np.newaxis] does not.
y = np.reshape(y, (-1, 1))
self.n_outputs_ = y.shape[1]
if is_classification:
y = np.copy(y)
self.classes_ = []
self.n_classes_ = []
if self.class_weight is not None:
y_original = np.copy(y)
y_store_unique_indices = np.zeros(y.shape, dtype=np.int)
for k in range(self.n_outputs_):
classes_k, y_store_unique_indices[:, k] = np.unique(y[:, k], return_inverse=True)
self.classes_.append(classes_k)
self.n_classes_.append(classes_k.shape[0])
y = y_store_unique_indices
if self.class_weight is not None:
expanded_class_weight = compute_sample_weight(
self.class_weight, y_original)
else:
self.classes_ = [None] * self.n_outputs_
self.n_classes_ = [1] * self.n_outputs_
self.n_classes_ = np.array(self.n_classes_, dtype=np.intp)
if getattr(y, "dtype", None) != DOUBLE or not y.flags.contiguous:
y = np.ascontiguousarray(y, dtype=DOUBLE)
# Check parameters
max_depth = ((2 ** 31) - 1 if self.max_depth is None
else self.max_depth)
max_leaf_nodes = (-1 if self.max_leaf_nodes is None
else self.max_leaf_nodes)
if isinstance(self.max_features, six.string_types):
if self.max_features == "auto":
if is_classification:
max_features = max(1, int(np.sqrt(self.n_features_)))
else:
max_features = self.n_features_
elif self.max_features == "sqrt":
max_features = max(1, int(np.sqrt(self.n_features_)))
elif self.max_features == "log2":
max_features = max(1, int(np.log2(self.n_features_)))
else:
raise ValueError(
'Invalid value for max_features. Allowed string '
'values are "auto", "sqrt" or "log2".')
elif self.max_features is None:
max_features = self.n_features_
elif isinstance(self.max_features, (numbers.Integral, np.integer)):
max_features = self.max_features
else: # float
if self.max_features > 0.0:
max_features = max(1, int(self.max_features * self.n_features_))
else:
max_features = 0
self.max_features_ = max_features
if len(y) != n_samples:
raise ValueError("Number of labels=%d does not match "
"number of samples=%d" % (len(y), n_samples))
if self.min_samples_split <= 0:
raise ValueError("min_samples_split must be greater than zero.")
if self.min_samples_leaf <= 0:
raise ValueError("min_samples_leaf must be greater than zero.")
        if not 0 <= self.min_weight_fraction_leaf <= 0.5:
            raise ValueError("min_weight_fraction_leaf must be in [0, 0.5]")
        if max_depth <= 0:
            raise ValueError("max_depth must be greater than zero.")
if not (0 < max_features <= self.n_features_):
raise ValueError("max_features must be in (0, n_features]")
        if not isinstance(max_leaf_nodes, (numbers.Integral, np.integer)):
            raise ValueError("max_leaf_nodes must be an integral number but "
                             "was %r" % max_leaf_nodes)
if -1 < max_leaf_nodes < 2:
raise ValueError(("max_leaf_nodes {0} must be either smaller than "
"0 or larger than 1").format(max_leaf_nodes))
if sample_weight is not None:
if (getattr(sample_weight, "dtype", None) != DOUBLE or
not sample_weight.flags.contiguous):
sample_weight = np.ascontiguousarray(
sample_weight, dtype=DOUBLE)
if len(sample_weight.shape) > 1:
raise ValueError("Sample weights array has more "
"than one dimension: %d" %
len(sample_weight.shape))
if len(sample_weight) != n_samples:
raise ValueError("Number of weights=%d does not match "
"number of samples=%d" %
(len(sample_weight), n_samples))
if expanded_class_weight is not None:
if sample_weight is not None:
sample_weight = sample_weight * expanded_class_weight
else:
sample_weight = expanded_class_weight
# Set min_weight_leaf from min_weight_fraction_leaf
if self.min_weight_fraction_leaf != 0. and sample_weight is not None:
min_weight_leaf = (self.min_weight_fraction_leaf *
np.sum(sample_weight))
else:
min_weight_leaf = 0.
# Set min_samples_split sensibly
min_samples_split = max(self.min_samples_split,
2 * self.min_samples_leaf)
# Build tree
criterion = self.criterion
if not isinstance(criterion, Criterion):
if is_classification:
criterion = CRITERIA_CLF[self.criterion](self.n_outputs_,
self.n_classes_)
else:
criterion = CRITERIA_REG[self.criterion](self.n_outputs_)
SPLITTERS = SPARSE_SPLITTERS if issparse(X) else DENSE_SPLITTERS
splitter = self.splitter
if not isinstance(self.splitter, Splitter):
splitter = SPLITTERS[self.splitter](criterion,
self.max_features_,
self.min_samples_leaf,
min_weight_leaf,
random_state)
self.tree_ = Tree(self.n_features_, self.n_classes_, self.n_outputs_)
# Use BestFirst if max_leaf_nodes given; use DepthFirst otherwise
if max_leaf_nodes < 0:
builder = DepthFirstTreeBuilder(splitter, min_samples_split,
self.min_samples_leaf,
min_weight_leaf,
max_depth)
else:
builder = BestFirstTreeBuilder(splitter, min_samples_split,
self.min_samples_leaf,
min_weight_leaf,
max_depth,
max_leaf_nodes)
builder.build(self.tree_, X, y, sample_weight)
if self.n_outputs_ == 1:
self.n_classes_ = self.n_classes_[0]
self.classes_ = self.classes_[0]
return self
def _validate_X_predict(self, X, check_input):
"""Validate X whenever one tries to predict, apply, predict_proba"""
if self.tree_ is None:
raise NotFittedError("Estimator not fitted, "
"call `fit` before exploiting the model.")
if check_input:
X = check_array(X, dtype=DTYPE, accept_sparse="csr")
if issparse(X) and (X.indices.dtype != np.intc or
X.indptr.dtype != np.intc):
raise ValueError("No support for np.int64 index based "
"sparse matrices")
n_features = X.shape[1]
        if self.n_features_ != n_features:
            raise ValueError("Number of features of the model must "
                             "match the input. Model n_features is %s and "
                             "input n_features is %s"
                             % (self.n_features_, n_features))
return X
def predict(self, X, check_input=True):
"""Predict class or regression value for X.
For a classification model, the predicted class for each sample in X is
returned. For a regression model, the predicted value based on X is
returned.
Parameters
----------
X : array-like or sparse matrix of shape = [n_samples, n_features]
The input samples. Internally, it will be converted to
``dtype=np.float32`` and if a sparse matrix is provided
to a sparse ``csr_matrix``.
check_input : boolean, (default=True)
Allows bypassing several input checks.
Don't use this parameter unless you know what you are doing.
Returns
-------
y : array of shape = [n_samples] or [n_samples, n_outputs]
The predicted classes, or the predict values.
"""
X = self._validate_X_predict(X, check_input)
proba = self.tree_.predict(X)
n_samples = X.shape[0]
# Classification
if isinstance(self, ClassifierMixin):
if self.n_outputs_ == 1:
return self.classes_.take(np.argmax(proba, axis=1), axis=0)
else:
predictions = np.zeros((n_samples, self.n_outputs_))
for k in range(self.n_outputs_):
predictions[:, k] = self.classes_[k].take(
np.argmax(proba[:, k], axis=1),
axis=0)
return predictions
# Regression
else:
if self.n_outputs_ == 1:
return proba[:, 0]
else:
return proba[:, :, 0]
def apply(self, X, check_input=True):
"""
Returns the index of the leaf that each sample is predicted as.
Parameters
----------
X : array_like or sparse matrix, shape = [n_samples, n_features]
The input samples. Internally, it will be converted to
``dtype=np.float32`` and if a sparse matrix is provided
to a sparse ``csr_matrix``.
check_input : boolean, (default=True)
Allows bypassing several input checks.
Don't use this parameter unless you know what you are doing.
Returns
-------
X_leaves : array_like, shape = [n_samples,]
For each datapoint x in X, return the index of the leaf x
ends up in. Leaves are numbered within
``[0; self.tree_.node_count)``, possibly with gaps in the
numbering.
"""
X = self._validate_X_predict(X, check_input)
return self.tree_.apply(X)
@property
def feature_importances_(self):
"""Return the feature importances.
The importance of a feature is computed as the (normalized) total
reduction of the criterion brought by that feature.
It is also known as the Gini importance.
Returns
-------
feature_importances_ : array, shape = [n_features]
"""
if self.tree_ is None:
raise NotFittedError("Estimator not fitted, call `fit` before"
" `feature_importances_`.")
return self.tree_.compute_feature_importances()
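The docstring above defines a feature's importance as its normalized total criterion reduction. The normalization step alone can be sketched with plain NumPy; the per-feature reduction values here are hypothetical inputs, not numbers read from a fitted ``Tree`` object:

```python
import numpy as np

def normalized_importances(raw_reductions):
    """Normalize per-feature impurity reductions so they sum to one.

    raw_reductions: total (weighted) criterion decrease attributed to each
    feature over all splits that used it -- hypothetical values for
    illustration only.
    """
    raw = np.asarray(raw_reductions, dtype=float)
    total = raw.sum()
    if total == 0.0:
        return raw  # degenerate tree with no splits: leave all zeros
    return raw / total

# Feature 0 accounted for 0.6 of the impurity decrease, features 1 and 2 for 0.2 each.
importances = normalized_importances([0.6, 0.2, 0.2])
```

The normalized values are what ``feature_importances_`` exposes: relative shares of the total impurity reduction, comparable across features.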
# =============================================================================
# Public estimators
# =============================================================================
class DecisionTreeClassifier(BaseDecisionTree, ClassifierMixin):
"""A decision tree classifier.
Read more in the :ref:`User Guide <tree>`.
Parameters
----------
criterion : string, optional (default="gini")
The function to measure the quality of a split. Supported criteria are
"gini" for the Gini impurity and "entropy" for the information gain.
splitter : string, optional (default="best")
The strategy used to choose the split at each node. Supported
strategies are "best" to choose the best split and "random" to choose
the best random split.
max_features : int, float, string or None, optional (default=None)
The number of features to consider when looking for the best split:
- If int, then consider `max_features` features at each split.
- If float, then `max_features` is a percentage and
`int(max_features * n_features)` features are considered at each
split.
- If "auto", then `max_features=sqrt(n_features)`.
- If "sqrt", then `max_features=sqrt(n_features)`.
- If "log2", then `max_features=log2(n_features)`.
- If None, then `max_features=n_features`.
Note: the search for a split does not stop until at least one
valid partition of the node samples is found, even if it requires to
effectively inspect more than ``max_features`` features.
max_depth : int or None, optional (default=None)
The maximum depth of the tree. If None, then nodes are expanded until
all leaves are pure or until all leaves contain less than
min_samples_split samples.
Ignored if ``max_leaf_nodes`` is not None.
min_samples_split : int, optional (default=2)
The minimum number of samples required to split an internal node.
min_samples_leaf : int, optional (default=1)
The minimum number of samples required to be at a leaf node.
min_weight_fraction_leaf : float, optional (default=0.)
The minimum weighted fraction of the input samples required to be at a
leaf node.
max_leaf_nodes : int or None, optional (default=None)
Grow a tree with ``max_leaf_nodes`` in best-first fashion.
Best nodes are defined as relative reduction in impurity.
If None then unlimited number of leaf nodes.
If not None then ``max_depth`` will be ignored.
class_weight : dict, list of dicts, "balanced" or None, optional
(default=None)
Weights associated with classes in the form ``{class_label: weight}``.
If not given, all classes are supposed to have weight one. For
multi-output problems, a list of dicts can be provided in the same
order as the columns of y.
The "balanced" mode uses the values of y to automatically adjust
weights inversely proportional to class frequencies in the input data
as ``n_samples / (n_classes * np.bincount(y))``
For multi-output, the weights of each column of y will be multiplied.
Note that these weights will be multiplied with sample_weight (passed
through the fit method) if sample_weight is specified.
random_state : int, RandomState instance or None, optional (default=None)
If int, random_state is the seed used by the random number generator;
If RandomState instance, random_state is the random number generator;
If None, the random number generator is the RandomState instance used
by `np.random`.
Attributes
----------
classes_ : array of shape = [n_classes] or a list of such arrays
The classes labels (single output problem),
or a list of arrays of class labels (multi-output problem).
feature_importances_ : array of shape = [n_features]
The feature importances. The higher, the more important the
feature. The importance of a feature is computed as the (normalized)
total reduction of the criterion brought by that feature. It is also
known as the Gini importance [4]_.
max_features_ : int,
The inferred value of max_features.
n_classes_ : int or list
The number of classes (for single output problems),
or a list containing the number of classes for each
output (for multi-output problems).
n_features_ : int
The number of features when ``fit`` is performed.
n_outputs_ : int
The number of outputs when ``fit`` is performed.
tree_ : Tree object
The underlying Tree object.
See also
--------
DecisionTreeRegressor
References
----------
.. [1] http://en.wikipedia.org/wiki/Decision_tree_learning
.. [2] L. Breiman, J. Friedman, R. Olshen, and C. Stone, "Classification
and Regression Trees", Wadsworth, Belmont, CA, 1984.
.. [3] T. Hastie, R. Tibshirani and J. Friedman. "Elements of Statistical
Learning", Springer, 2009.
.. [4] L. Breiman, and A. Cutler, "Random Forests",
http://www.stat.berkeley.edu/~breiman/RandomForests/cc_home.htm
Examples
--------
>>> from sklearn.datasets import load_iris
>>> from sklearn.cross_validation import cross_val_score
>>> from sklearn.tree import DecisionTreeClassifier
>>> clf = DecisionTreeClassifier(random_state=0)
>>> iris = load_iris()
>>> cross_val_score(clf, iris.data, iris.target, cv=10)
... # doctest: +SKIP
...
array([ 1. , 0.93..., 0.86..., 0.93..., 0.93...,
0.93..., 0.93..., 1. , 0.93..., 1. ])
"""
def __init__(self,
criterion="gini",
splitter="best",
max_depth=None,
min_samples_split=2,
min_samples_leaf=1,
min_weight_fraction_leaf=0.,
max_features=None,
random_state=None,
max_leaf_nodes=None,
class_weight=None):
super(DecisionTreeClassifier, self).__init__(
criterion=criterion,
splitter=splitter,
max_depth=max_depth,
min_samples_split=min_samples_split,
min_samples_leaf=min_samples_leaf,
min_weight_fraction_leaf=min_weight_fraction_leaf,
max_features=max_features,
max_leaf_nodes=max_leaf_nodes,
class_weight=class_weight,
random_state=random_state)
def predict_proba(self, X, check_input=True):
"""Predict class probabilities of the input samples X.
The predicted class probability is the fraction of samples of the same
class in a leaf.
check_input : boolean, (default=True)
Allow to bypass several input checking.
Don't use this parameter unless you know what you do.
Parameters
----------
X : array-like or sparse matrix of shape = [n_samples, n_features]
The input samples. Internally, it will be converted to
``dtype=np.float32`` and if a sparse matrix is provided
to a sparse ``csr_matrix``.
Returns
-------
p : array of shape = [n_samples, n_classes], or a list of n_outputs
such arrays if n_outputs > 1.
The class probabilities of the input samples. The order of the
classes corresponds to that in the attribute `classes_`.
"""
X = self._validate_X_predict(X, check_input)
proba = self.tree_.predict(X)
if self.n_outputs_ == 1:
proba = proba[:, :self.n_classes_]
normalizer = proba.sum(axis=1)[:, np.newaxis]
normalizer[normalizer == 0.0] = 1.0
proba /= normalizer
return proba
else:
all_proba = []
for k in range(self.n_outputs_):
proba_k = proba[:, k, :self.n_classes_[k]]
normalizer = proba_k.sum(axis=1)[:, np.newaxis]
normalizer[normalizer == 0.0] = 1.0
proba_k /= normalizer
all_proba.append(proba_k)
return all_proba
def predict_log_proba(self, X):
"""Predict class log-probabilities of the input samples X.
Parameters
----------
X : array-like or sparse matrix of shape = [n_samples, n_features]
The input samples. Internally, it will be converted to
``dtype=np.float32`` and if a sparse matrix is provided
to a sparse ``csr_matrix``.
Returns
-------
p : array of shape = [n_samples, n_classes], or a list of n_outputs
such arrays if n_outputs > 1.
The class log-probabilities of the input samples. The order of the
classes corresponds to that in the attribute `classes_`.
"""
proba = self.predict_proba(X)
if self.n_outputs_ == 1:
return np.log(proba)
else:
for k in range(self.n_outputs_):
proba[k] = np.log(proba[k])
return proba
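``predict_proba`` above turns per-leaf class counts into fractions, guarding against empty normalizers exactly as in the single-output branch. The same normalization in isolation, on made-up leaf counts rather than a fitted tree:

```python
import numpy as np

def leaf_class_fractions(class_counts):
    """Turn per-leaf class counts into probabilities, guarding zero rows.

    class_counts: (n_leaves, n_classes) training-sample counts per leaf --
    hypothetical numbers for illustration.
    """
    proba = np.asarray(class_counts, dtype=float)
    normalizer = proba.sum(axis=1)[:, np.newaxis]
    normalizer[normalizer == 0.0] = 1.0  # same guard as in predict_proba
    return proba / normalizer

# Leaf 0 saw 3 samples of class 0 and 1 of class 1; leaf 1 is empty; leaf 2 is split evenly.
fractions = leaf_class_fractions([[3, 1], [0, 0], [2, 2]])
```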
class DecisionTreeRegressor(BaseDecisionTree, RegressorMixin):
"""A decision tree regressor.
Read more in the :ref:`User Guide <tree>`.
Parameters
----------
criterion : string, optional (default="mse")
The function to measure the quality of a split. The only supported
criterion is "mse" for the mean squared error, which is equal to
variance reduction as feature selection criterion.
splitter : string, optional (default="best")
The strategy used to choose the split at each node. Supported
strategies are "best" to choose the best split and "random" to choose
the best random split.
max_features : int, float, string or None, optional (default=None)
The number of features to consider when looking for the best split:
- If int, then consider `max_features` features at each split.
- If float, then `max_features` is a percentage and
`int(max_features * n_features)` features are considered at each
split.
- If "auto", then `max_features=n_features`.
- If "sqrt", then `max_features=sqrt(n_features)`.
- If "log2", then `max_features=log2(n_features)`.
- If None, then `max_features=n_features`.
Note: the search for a split does not stop until at least one
valid partition of the node samples is found, even if it requires to
effectively inspect more than ``max_features`` features.
max_depth : int or None, optional (default=None)
The maximum depth of the tree. If None, then nodes are expanded until
all leaves are pure or until all leaves contain less than
min_samples_split samples.
Ignored if ``max_leaf_nodes`` is not None.
min_samples_split : int, optional (default=2)
The minimum number of samples required to split an internal node.
min_samples_leaf : int, optional (default=1)
The minimum number of samples required to be at a leaf node.
min_weight_fraction_leaf : float, optional (default=0.)
The minimum weighted fraction of the input samples required to be at a
leaf node.
max_leaf_nodes : int or None, optional (default=None)
Grow a tree with ``max_leaf_nodes`` in best-first fashion.
Best nodes are defined as relative reduction in impurity.
If None then unlimited number of leaf nodes.
If not None then ``max_depth`` will be ignored.
random_state : int, RandomState instance or None, optional (default=None)
If int, random_state is the seed used by the random number generator;
If RandomState instance, random_state is the random number generator;
If None, the random number generator is the RandomState instance used
by `np.random`.
Attributes
----------
feature_importances_ : array of shape = [n_features]
The feature importances.
The higher, the more important the feature.
The importance of a feature is computed as the
(normalized) total reduction of the criterion brought
by that feature. It is also known as the Gini importance [4]_.
max_features_ : int,
The inferred value of max_features.
n_features_ : int
The number of features when ``fit`` is performed.
n_outputs_ : int
The number of outputs when ``fit`` is performed.
tree_ : Tree object
The underlying Tree object.
See also
--------
DecisionTreeClassifier
References
----------
.. [1] http://en.wikipedia.org/wiki/Decision_tree_learning
.. [2] L. Breiman, J. Friedman, R. Olshen, and C. Stone, "Classification
and Regression Trees", Wadsworth, Belmont, CA, 1984.
.. [3] T. Hastie, R. Tibshirani and J. Friedman. "Elements of Statistical
Learning", Springer, 2009.
.. [4] L. Breiman, and A. Cutler, "Random Forests",
http://www.stat.berkeley.edu/~breiman/RandomForests/cc_home.htm
Examples
--------
>>> from sklearn.datasets import load_boston
>>> from sklearn.cross_validation import cross_val_score
>>> from sklearn.tree import DecisionTreeRegressor
>>> boston = load_boston()
>>> regressor = DecisionTreeRegressor(random_state=0)
>>> cross_val_score(regressor, boston.data, boston.target, cv=10)
... # doctest: +SKIP
...
array([ 0.61..., 0.57..., -0.34..., 0.41..., 0.75...,
0.07..., 0.29..., 0.33..., -1.42..., -1.77...])
"""
def __init__(self,
criterion="mse",
splitter="best",
max_depth=None,
min_samples_split=2,
min_samples_leaf=1,
min_weight_fraction_leaf=0.,
max_features=None,
random_state=None,
max_leaf_nodes=None):
super(DecisionTreeRegressor, self).__init__(
criterion=criterion,
splitter=splitter,
max_depth=max_depth,
min_samples_split=min_samples_split,
min_samples_leaf=min_samples_leaf,
min_weight_fraction_leaf=min_weight_fraction_leaf,
max_features=max_features,
max_leaf_nodes=max_leaf_nodes,
random_state=random_state)
class ExtraTreeClassifier(DecisionTreeClassifier):
"""An extremely randomized tree classifier.
Extra-trees differ from classic decision trees in the way they are built.
When looking for the best split to separate the samples of a node into two
groups, random splits are drawn for each of the `max_features` randomly
selected features and the best split among those is chosen. When
`max_features` is set to 1, this amounts to building a totally random
decision tree.
Warning: Extra-trees should only be used within ensemble methods.
Read more in the :ref:`User Guide <tree>`.
See also
--------
ExtraTreeRegressor, ExtraTreesClassifier, ExtraTreesRegressor
References
----------
.. [1] P. Geurts, D. Ernst., and L. Wehenkel, "Extremely randomized trees",
Machine Learning, 63(1), 3-42, 2006.
"""
def __init__(self,
criterion="gini",
splitter="random",
max_depth=None,
min_samples_split=2,
min_samples_leaf=1,
min_weight_fraction_leaf=0.,
max_features="auto",
random_state=None,
max_leaf_nodes=None,
class_weight=None):
super(ExtraTreeClassifier, self).__init__(
criterion=criterion,
splitter=splitter,
max_depth=max_depth,
min_samples_split=min_samples_split,
min_samples_leaf=min_samples_leaf,
min_weight_fraction_leaf=min_weight_fraction_leaf,
max_features=max_features,
max_leaf_nodes=max_leaf_nodes,
class_weight=class_weight,
random_state=random_state)
class ExtraTreeRegressor(DecisionTreeRegressor):
"""An extremely randomized tree regressor.
Extra-trees differ from classic decision trees in the way they are built.
When looking for the best split to separate the samples of a node into two
groups, random splits are drawn for each of the `max_features` randomly
selected features and the best split among those is chosen. When
`max_features` is set to 1, this amounts to building a totally random
decision tree.
Warning: Extra-trees should only be used within ensemble methods.
Read more in the :ref:`User Guide <tree>`.
See also
--------
ExtraTreeClassifier, ExtraTreesClassifier, ExtraTreesRegressor
References
----------
.. [1] P. Geurts, D. Ernst., and L. Wehenkel, "Extremely randomized trees",
Machine Learning, 63(1), 3-42, 2006.
"""
def __init__(self,
criterion="mse",
splitter="random",
max_depth=None,
min_samples_split=2,
min_samples_leaf=1,
min_weight_fraction_leaf=0.,
max_features="auto",
random_state=None,
max_leaf_nodes=None):
super(ExtraTreeRegressor, self).__init__(
criterion=criterion,
splitter=splitter,
max_depth=max_depth,
min_samples_split=min_samples_split,
min_samples_leaf=min_samples_leaf,
min_weight_fraction_leaf=min_weight_fraction_leaf,
max_features=max_features,
max_leaf_nodes=max_leaf_nodes,
random_state=random_state)
| bsd-3-clause |
advatar/caffe | examples/web_demo/app.py | 10 | 7400 | import os
import time
import cPickle
import datetime
import logging
import flask
import werkzeug
import optparse
import tornado.wsgi
import tornado.httpserver
import numpy as np
import pandas as pd
from PIL import Image as PILImage
import cStringIO as StringIO
import urllib
import caffe
import exifutil
REPO_DIRNAME = os.path.abspath(os.path.dirname(__file__) + '/../..')
UPLOAD_FOLDER = '/tmp/caffe_demos_uploads'
ALLOWED_IMAGE_EXTENSIONS = set(['png', 'bmp', 'jpg', 'jpe', 'jpeg', 'gif'])
# Obtain the flask app object
app = flask.Flask(__name__)
@app.route('/')
def index():
return flask.render_template('index.html', has_result=False)
@app.route('/classify_url', methods=['GET'])
def classify_url():
imageurl = flask.request.args.get('imageurl', '')
try:
string_buffer = StringIO.StringIO(
urllib.urlopen(imageurl).read())
image = caffe.io.load_image(string_buffer)
except Exception as err:
# For any exception we encounter in reading the image, we will just
# not continue.
logging.info('URL Image open error: %s', err)
return flask.render_template(
'index.html', has_result=True,
result=(False, 'Cannot open image from URL.')
)
logging.info('Image: %s', imageurl)
result = app.clf.classify_image(image)
return flask.render_template(
'index.html', has_result=True, result=result, imagesrc=imageurl)
@app.route('/classify_upload', methods=['POST'])
def classify_upload():
try:
# We will save the file to disk for possible data collection.
imagefile = flask.request.files['imagefile']
filename_ = str(datetime.datetime.now()).replace(' ', '_') + \
werkzeug.secure_filename(imagefile.filename)
filename = os.path.join(UPLOAD_FOLDER, filename_)
imagefile.save(filename)
logging.info('Saving to %s.', filename)
image = exifutil.open_oriented_im(filename)
except Exception as err:
logging.info('Uploaded image open error: %s', err)
return flask.render_template(
'index.html', has_result=True,
result=(False, 'Cannot open uploaded image.')
)
result = app.clf.classify_image(image)
return flask.render_template(
'index.html', has_result=True, result=result,
imagesrc=embed_image_html(image)
)
def embed_image_html(image):
"""Creates an image embedded in HTML base64 format."""
image_pil = PILImage.fromarray((255 * image).astype('uint8'))
image_pil = image_pil.resize((256, 256))
string_buf = StringIO.StringIO()
image_pil.save(string_buf, format='png')
data = string_buf.getvalue().encode('base64').replace('\n', '')
return 'data:image/png;base64,' + data
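``embed_image_html`` above relies on the Python 2 idiom ``getvalue().encode('base64')``. On Python 3 the same data-URI construction would go through the ``base64`` module; a sketch on raw PNG bytes (a real caller would pass PIL's PNG output from a ``BytesIO`` buffer):

```python
import base64

def to_png_data_uri(png_bytes):
    """Embed already-encoded PNG bytes as an HTML-ready data URI."""
    data = base64.b64encode(png_bytes).decode('ascii')  # no embedded newlines
    return 'data:image/png;base64,' + data

# Any bytes work for illustration; these are just the PNG magic bytes.
uri = to_png_data_uri(b'\x89PNG\r\n\x1a\n')
```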
def allowed_file(filename):
return (
'.' in filename and
filename.rsplit('.', 1)[1] in ALLOWED_IMAGE_EXTENSIONS
)
class ImagenetClassifier(object):
default_args = {
'model_def_file': (
'{}/examples/imagenet/imagenet_deploy.prototxt'.format(REPO_DIRNAME)),
'pretrained_model_file': (
'{}/examples/imagenet/caffe_reference_imagenet_model'.format(REPO_DIRNAME)),
'mean_file': (
'{}/python/caffe/imagenet/ilsvrc_2012_mean.npy'.format(REPO_DIRNAME)),
'class_labels_file': (
'{}/data/ilsvrc12/synset_words.txt'.format(REPO_DIRNAME)),
'bet_file': (
'{}/data/ilsvrc12/imagenet.bet.pickle'.format(REPO_DIRNAME)),
}
for key, val in default_args.iteritems():
if not os.path.exists(val):
raise Exception(
"File for {} is missing. Should be at: {}".format(key, val))
default_args['image_dim'] = 227
default_args['gpu_mode'] = True
def __init__(self, model_def_file, pretrained_model_file, mean_file,
class_labels_file, bet_file, image_dim, gpu_mode=False):
logging.info('Loading net and associated files...')
self.net = caffe.Classifier(
model_def_file, pretrained_model_file, input_scale=255,
image_dims=(image_dim, image_dim), gpu=gpu_mode,
mean_file=mean_file, channel_swap=(2, 1, 0)
)
with open(class_labels_file) as f:
labels_df = pd.DataFrame([
{
'synset_id': l.strip().split(' ')[0],
'name': ' '.join(l.strip().split(' ')[1:]).split(',')[0]
}
for l in f.readlines()
])
self.labels = labels_df.sort('synset_id')['name'].values
self.bet = cPickle.load(open(bet_file))
# A bias to prefer children nodes in single-chain paths
# I am setting the value to 0.1 as a quick, simple model.
# We could use better psychological models here...
self.bet['infogain'] -= np.array(self.bet['preferences']) * 0.1
def classify_image(self, image):
try:
starttime = time.time()
scores = self.net.predict([image], oversample=True).flatten()
endtime = time.time()
indices = (-scores).argsort()[:5]
predictions = self.labels[indices]
# In addition to the prediction text, we will also produce
# the length for the progress bar visualization.
meta = [
(p, '%.5f' % scores[i])
for i, p in zip(indices, predictions)
]
logging.info('result: %s', str(meta))
# Compute expected information gain
expected_infogain = np.dot(
self.bet['probmat'], scores[self.bet['idmapping']])
expected_infogain *= self.bet['infogain']
# sort the scores
infogain_sort = expected_infogain.argsort()[::-1]
bet_result = [(self.bet['words'][v], '%.5f' % expected_infogain[v])
for v in infogain_sort[:5]]
logging.info('bet result: %s', str(bet_result))
return (True, meta, bet_result, '%.3f' % (endtime - starttime))
except Exception as err:
logging.info('Classification error: %s', err)
return (False, 'Something went wrong when classifying the '
'image. Maybe try another one?')
def start_tornado(app, port=5000):
http_server = tornado.httpserver.HTTPServer(
tornado.wsgi.WSGIContainer(app))
http_server.listen(port)
print("Tornado server starting on port {}".format(port))
tornado.ioloop.IOLoop.instance().start()
def start_from_terminal(app):
"""
Parse command line options and start the server.
"""
parser = optparse.OptionParser()
parser.add_option(
'-d', '--debug',
help="enable debug mode",
action="store_true", default=False)
parser.add_option(
'-p', '--port',
help="which port to serve content on",
type='int', default=5000)
opts, args = parser.parse_args()
# Initialize classifier
app.clf = ImagenetClassifier(**ImagenetClassifier.default_args)
if opts.debug:
app.run(debug=True, host='0.0.0.0', port=opts.port)
else:
start_tornado(app, opts.port)
if __name__ == '__main__':
logging.getLogger().setLevel(logging.INFO)
if not os.path.exists(UPLOAD_FOLDER):
os.makedirs(UPLOAD_FOLDER)
start_from_terminal(app)
| bsd-2-clause |
michigraber/scikit-learn | examples/applications/plot_outlier_detection_housing.py | 243 | 5577 | """
====================================
Outlier detection on a real data set
====================================
This example illustrates the need for robust covariance estimation
on a real data set. It is useful both for outlier detection and for
a better understanding of the data structure.
We selected two sets of two variables from the Boston housing data set
as an illustration of what kind of analysis can be done with several
outlier detection tools. For the purpose of visualization, we are working
with two-dimensional examples, but one should be aware that things are
not so trivial in high dimensions, as will be pointed out.
In both examples below, the main result is that the empirical covariance
estimate, as a non-robust one, is highly influenced by the heterogeneous
structure of the observations. Although the robust covariance estimate is
able to focus on the main mode of the data distribution, it sticks to the
assumption that the data should be Gaussian distributed, yielding some biased
estimation of the data structure, but yet accurate to some extent.
The One-Class SVM algorithm does not assume any parametric form for the
data distribution and can therefore model its complex shape much better.
First example
-------------
The first example illustrates how robust covariance estimation can help
concentrating on a relevant cluster when another one exists. Here, many
observations are confounded into one and break down the empirical covariance
estimation.
Of course, some screening tools would have pointed out the presence of two
clusters (Support Vector Machines, Gaussian Mixture Models, univariate
outlier detection, ...). But had it been a high-dimensional example, none
of these could be applied that easily.
Second example
--------------
The second example shows the ability of the Minimum Covariance Determinant
robust estimator of covariance to concentrate on the main mode of the data
distribution: the location seems to be well estimated, although the covariance
is hard to estimate due to the banana-shaped distribution. Anyway, we can
get rid of some outlying observations.
The One-Class SVM is able to capture the real data structure, but the
difficulty is to adjust its kernel bandwidth parameter so as to obtain
a good compromise between the shape of the data scatter matrix and the
risk of over-fitting the data.
"""
print(__doc__)
# Author: Virgile Fritsch <virgile.fritsch@inria.fr>
# License: BSD 3 clause
import numpy as np
from sklearn.covariance import EllipticEnvelope
from sklearn.svm import OneClassSVM
import matplotlib.pyplot as plt
import matplotlib.font_manager
from sklearn.datasets import load_boston
# Get data
X1 = load_boston()['data'][:, [8, 10]] # two clusters
X2 = load_boston()['data'][:, [5, 12]] # "banana"-shaped
# Define "classifiers" to be used
classifiers = {
"Empirical Covariance": EllipticEnvelope(support_fraction=1.,
contamination=0.261),
"Robust Covariance (Minimum Covariance Determinant)":
EllipticEnvelope(contamination=0.261),
"OCSVM": OneClassSVM(nu=0.261, gamma=0.05)}
colors = ['m', 'g', 'b']
legend1 = {}
legend2 = {}
# Learn a frontier for outlier detection with several classifiers
xx1, yy1 = np.meshgrid(np.linspace(-8, 28, 500), np.linspace(3, 40, 500))
xx2, yy2 = np.meshgrid(np.linspace(3, 10, 500), np.linspace(-5, 45, 500))
for i, (clf_name, clf) in enumerate(classifiers.items()):
plt.figure(1)
clf.fit(X1)
Z1 = clf.decision_function(np.c_[xx1.ravel(), yy1.ravel()])
Z1 = Z1.reshape(xx1.shape)
legend1[clf_name] = plt.contour(
xx1, yy1, Z1, levels=[0], linewidths=2, colors=colors[i])
plt.figure(2)
clf.fit(X2)
Z2 = clf.decision_function(np.c_[xx2.ravel(), yy2.ravel()])
Z2 = Z2.reshape(xx2.shape)
legend2[clf_name] = plt.contour(
xx2, yy2, Z2, levels=[0], linewidths=2, colors=colors[i])
legend1_values_list = list(legend1.values())
legend1_keys_list = list(legend1.keys())
# Plot the results (= shape of the data points cloud)
plt.figure(1) # two clusters
plt.title("Outlier detection on a real data set (boston housing)")
plt.scatter(X1[:, 0], X1[:, 1], color='black')
bbox_args = dict(boxstyle="round", fc="0.8")
arrow_args = dict(arrowstyle="->")
plt.annotate("several confounded points", xy=(24, 19),
xycoords="data", textcoords="data",
xytext=(13, 10), bbox=bbox_args, arrowprops=arrow_args)
plt.xlim((xx1.min(), xx1.max()))
plt.ylim((yy1.min(), yy1.max()))
plt.legend((legend1_values_list[0].collections[0],
legend1_values_list[1].collections[0],
legend1_values_list[2].collections[0]),
(legend1_keys_list[0], legend1_keys_list[1], legend1_keys_list[2]),
loc="upper center",
prop=matplotlib.font_manager.FontProperties(size=12))
plt.ylabel("accessibility to radial highways")
plt.xlabel("pupil-teacher ratio by town")
legend2_values_list = list(legend2.values())
legend2_keys_list = list(legend2.keys())
plt.figure(2) # "banana" shape
plt.title("Outlier detection on a real data set (boston housing)")
plt.scatter(X2[:, 0], X2[:, 1], color='black')
plt.xlim((xx2.min(), xx2.max()))
plt.ylim((yy2.min(), yy2.max()))
plt.legend((legend2_values_list[0].collections[0],
legend2_values_list[1].collections[0],
legend2_values_list[2].collections[0]),
(legend2_keys_list[0], legend2_keys_list[1], legend2_keys_list[2]),
loc="upper center",
prop=matplotlib.font_manager.FontProperties(size=12))
plt.ylabel("% lower status of the population")
plt.xlabel("average number of rooms per dwelling")
plt.show()
| bsd-3-clause |
dpshelio/scikit-image | doc/examples/plot_radon_transform.py | 17 | 8432 | """
===============
Radon transform
===============
In computed tomography, the tomography reconstruction problem is to obtain
a tomographic slice image from a set of projections [1]_. A projection is
formed by drawing a set of parallel rays through the 2D object of interest,
assigning the integral of the object's contrast along each ray to a single
pixel in the projection. A single projection of a 2D object is one-dimensional.
To enable computed tomography reconstruction of the object, several projections
must be acquired, each of them corresponding to a different angle between the
rays with respect to the object. A collection of projections at several angles
is called a sinogram, which is a linear transform of the original image.
The inverse Radon transform is used in computed tomography to reconstruct
a 2D image from the measured projections (the sinogram). A practical, exact
implementation of the inverse Radon transform does not exist, but there are
several good approximate algorithms available.
As the inverse Radon transform reconstructs the object from a set of
projections, the (forward) Radon transform can be used to simulate a
tomography experiment.
This script performs the Radon transform to simulate a tomography experiment
and reconstructs the input image based on the resulting sinogram formed by
the simulation. Two methods for performing the inverse Radon transform
and reconstructing the original image are compared: The Filtered Back
Projection (FBP) and the Simultaneous Algebraic Reconstruction
Technique (SART).
For further information on tomographic reconstruction, see
- AC Kak, M Slaney, "Principles of Computerized Tomographic Imaging",
http://www.slaney.org/pct/pct-toc.html
- http://en.wikipedia.org/wiki/Radon_transform
The forward transform
=====================
As our original image, we will use the Shepp-Logan phantom. When calculating
the Radon transform, we need to decide how many projection angles we wish
to use. As a rule of thumb, the number of projections should be about the
same as the number of pixels there are across the object (to see why this
is so, consider how many unknown pixel values must be determined in the
reconstruction process and compare this to the number of measurements
provided by the projections), and we follow that rule here. Below is the
original image and its Radon transform, often known as its _sinogram_:
"""
from __future__ import print_function, division
import numpy as np
import matplotlib.pyplot as plt
from skimage.io import imread
from skimage import data_dir
from skimage.transform import radon, rescale
image = imread(data_dir + "/phantom.png", as_grey=True)
image = rescale(image, scale=0.4)
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 4.5))
ax1.set_title("Original")
ax1.imshow(image, cmap=plt.cm.Greys_r)
theta = np.linspace(0., 180., max(image.shape), endpoint=False)
sinogram = radon(image, theta=theta, circle=True)
ax2.set_title("Radon transform\n(Sinogram)")
ax2.set_xlabel("Projection angle (deg)")
ax2.set_ylabel("Projection position (pixels)")
ax2.imshow(sinogram, cmap=plt.cm.Greys_r,
extent=(0, 180, 0, sinogram.shape[0]), aspect='auto')
fig.subplots_adjust(hspace=0.4, wspace=0.5)
plt.show()
"""
.. image:: PLOT2RST.current_figure
Reconstruction with the Filtered Back Projection (FBP)
======================================================
The mathematical foundation of the filtered back projection is the Fourier
slice theorem [2]_. It uses the Fourier transform of the projection and
interpolation in Fourier space to obtain the 2D Fourier transform of the image,
which is then inverted to form the reconstructed image. The filtered back
projection is among the fastest methods of performing the inverse Radon
transform. The only tunable parameter for the FBP is the filter, which is
applied to the Fourier transformed projections. It may be used to suppress
high frequency noise in the reconstruction. ``skimage`` provides a few
different options for the filter.
"""
from skimage.transform import iradon
reconstruction_fbp = iradon(sinogram, theta=theta, circle=True)
error = reconstruction_fbp - image
print('FBP rms reconstruction error: %.3g' % np.sqrt(np.mean(error**2)))
imkwargs = dict(vmin=-0.2, vmax=0.2)
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 4.5))
ax1.set_title("Reconstruction\nFiltered back projection")
ax1.imshow(reconstruction_fbp, cmap=plt.cm.Greys_r)
ax2.set_title("Reconstruction error\nFiltered back projection")
ax2.imshow(reconstruction_fbp - image, cmap=plt.cm.Greys_r, **imkwargs)
plt.show()
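The RMS error printed above recurs for SART below; pulled out as a self-contained helper (a sketch, not part of the original script):

```python
import numpy as np

def rms_error(reconstruction, reference):
    # Root-mean-square difference between a reconstruction and the
    # reference image, matching the quantity printed for FBP and SART.
    diff = np.asarray(reconstruction, dtype=float) - np.asarray(reference, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))

print('%.4f' % rms_error([3.0, 4.0], [0.0, 0.0]))  # -> 3.5355
```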
"""
.. image:: PLOT2RST.current_figure
Reconstruction with the Simultaneous Algebraic Reconstruction Technique
=======================================================================
Algebraic reconstruction techniques for tomography are based on a
straightforward idea: for a pixelated image the value of a single ray in a
particular projection is simply a sum of all the pixels the ray passes through
on its way through the object. This is a way of expressing the forward Radon
transform. The inverse Radon transform can then be formulated as a (large) set
of linear equations. As each ray passes through a small fraction of the pixels
in the image, this set of equations is sparse, allowing iterative solvers for
sparse linear systems to tackle the system of equations. One iterative method
has been particularly popular, namely Kaczmarz' method [3]_, which has the
property that the solution will approach a least-squares solution of the
equation set.
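A minimal dense-matrix sketch of Kaczmarz' method may help fix ideas; the tomography system itself is sparse and far larger, so this is illustrative only:

```python
import numpy as np

def kaczmarz(A, b, sweeps=50):
    # Cyclically project the iterate onto each row's hyperplane
    # a_i . x = b_i; for a consistent system this converges to a solution.
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for a_i, b_i in zip(A, b):
            x = x + ((b_i - a_i @ x) / (a_i @ a_i)) * a_i
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
print(kaczmarz(A, b))  # converges to the exact solution [0.8, 1.4]
```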
The combination of the formulation of the reconstruction problem as a set
of linear equations and an iterative solver makes algebraic techniques
relatively flexible, hence some forms of prior knowledge can be incorporated
with relative ease.
``skimage`` provides one of the more popular variations of the algebraic
reconstruction techniques: the Simultaneous Algebraic Reconstruction Technique
(SART) [1]_ [4]_. It uses Kaczmarz' method [3]_ as the iterative solver. A good
reconstruction is normally obtained in a single iteration, making the method
computationally effective. Running one or more extra iterations will normally
improve the reconstruction of sharp, high frequency features and reduce the
mean squared error at the expense of increased high frequency noise (the user
will need to decide on what number of iterations is best suited to the problem
at hand). The implementation in ``skimage`` allows prior information of the
form of a lower and upper threshold on the reconstructed values to be supplied
to the reconstruction.
"""
from skimage.transform import iradon_sart
reconstruction_sart = iradon_sart(sinogram, theta=theta)
error = reconstruction_sart - image
print('SART (1 iteration) rms reconstruction error: %.3g'
% np.sqrt(np.mean(error**2)))
fig, ax = plt.subplots(2, 2, figsize=(8, 8.5))
ax1, ax2, ax3, ax4 = ax.ravel()
ax1.set_title("Reconstruction\nSART")
ax1.imshow(reconstruction_sart, cmap=plt.cm.Greys_r)
ax2.set_title("Reconstruction error\nSART")
ax2.imshow(reconstruction_sart - image, cmap=plt.cm.Greys_r, **imkwargs)
# Run a second iteration of SART by supplying the reconstruction
# from the first iteration as an initial estimate
reconstruction_sart2 = iradon_sart(sinogram, theta=theta,
image=reconstruction_sart)
error = reconstruction_sart2 - image
print('SART (2 iterations) rms reconstruction error: %.3g'
% np.sqrt(np.mean(error**2)))
ax3.set_title("Reconstruction\nSART, 2 iterations")
ax3.imshow(reconstruction_sart2, cmap=plt.cm.Greys_r)
ax4.set_title("Reconstruction error\nSART, 2 iterations")
ax4.imshow(reconstruction_sart2 - image, cmap=plt.cm.Greys_r, **imkwargs)
plt.show()
"""
.. image:: PLOT2RST.current_figure
.. [1] AC Kak, M Slaney, "Principles of Computerized Tomographic Imaging",
IEEE Press 1988. http://www.slaney.org/pct/pct-toc.html
.. [2] Wikipedia, Radon transform,
http://en.wikipedia.org/wiki/Radon_transform#Relationship_with_the_Fourier_transform
.. [3] S Kaczmarz, "Angenaeherte Aufloesung von Systemen linearer
Gleichungen", Bulletin International de l'Academie Polonaise des
Sciences et des Lettres 35 pp 355--357 (1937)
.. [4] AH Andersen, AC Kak, "Simultaneous algebraic reconstruction technique
(SART): a superior implementation of the ART algorithm", Ultrasonic
Imaging 6 pp 81--94 (1984)
"""
| bsd-3-clause |
ceholden/yatsm | yatsm/regression/pickles/serialize.py | 3 | 1859 | """ Setup script to pickle various statistical estimators for distribution
Available pickles to build:
* glmnet_Lasso20.pkl
* sklearn_Lasso20.pkl
"""
from __future__ import print_function
import json
import logging
import os
import traceback
# Don't alias to ``np``: https://github.com/numba/numba/issues/1559
import numpy
import sklearn.linear_model
from sklearn.externals import joblib as jl
import six
logger = logging.getLogger()
# GLMNET pickles
try:
import glmnet
_glmnet_pickles = {
'glmnet_Lasso20.pkl': glmnet.Lasso(lambdas=20),
'glmnet_LassoCV_n50.pkl': glmnet.LassoCV(
lambdas=numpy.logspace(1e-4, 35, 50)),
}
except Exception:
logger.error('Could not produce pickles from package "glmnet". '
'Check if it is installed')
print(traceback.format_exc())
_glmnet_pickles = {}
# scikit-learn pickles
_sklearn_pickles = {
'OLS.pkl': sklearn.linear_model.LinearRegression(),
'sklearn_Lasso20.pkl': sklearn.linear_model.Lasso(alpha=20.0),
'sklearn_LassoCV_n50.pkl': sklearn.linear_model.LassoCV(
alphas=numpy.logspace(1e-4, 35, 50)),
}
# YATSM pickles
from ..robust_fit import RLM # flake8: noqa
_yatsm_pickles = {
'rlm_maxiter10.pkl': RLM(maxiter=10)
}
pickles = [_glmnet_pickles, _sklearn_pickles, _yatsm_pickles]
here = os.path.dirname(__file__)
pickles_json = os.path.join(here, 'pickles.json')
def make_pickles():
logger.info('Serializing estimators to pickles...')
packaged = {}
for pickle in pickles:
for fname, obj in six.iteritems(pickle):
jl.dump(obj, os.path.join(here, fname), compress=5)
packaged[os.path.splitext(fname)[0]] = obj.__class__.__name__
with open(pickles_json, 'w') as f:
json.dump(packaged, f, indent=4)
logger.info('Wrote pickles.json to %s' % pickles_json)
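The filename-to-class-name manifest written to ``pickles.json`` is just a dictionary comprehension; isolated here as a sketch (``DummyEstimator`` is a stand-in for illustration):

```python
import os

def package_manifest(pickle_map):
    # Basename (without extension) -> estimator class name, mirroring
    # the `packaged` dict that make_pickles writes to pickles.json.
    return {os.path.splitext(fname)[0]: obj.__class__.__name__
            for fname, obj in pickle_map.items()}

class DummyEstimator(object):
    pass

print(package_manifest({'OLS.pkl': DummyEstimator()}))  # -> {'OLS': 'DummyEstimator'}
```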
| mit |
Karl-Marka/data-mining | scleroderma-prediction/Feature_selector_ANOVA-F.py | 1 | 1981 | from sklearn.feature_selection import SelectKBest
from sklearn.feature_selection import f_classif
import pandas as pd
def oligosList():
oligosPath = input('Path to the file containing the list of oligos to use: ')
oligos = open(oligosPath)
oligos = oligos.readlines()
oligosList = []
for oligo in oligos:
item = oligo.strip()
oligosList.append(item)
return oligosList
def closeFunc():
print('''Type 'quit' and press enter to exit program''')
answer = input(': ')
if answer == 'quit':
quit()
else:
closeFunc()
def outputFile(output, file, oligos):
#output = pd.DataFrame(data = output, index = None, columns = oligos)
output = output.T
filename = str(file) + '.f_classif.txt'
output.to_csv(filename, sep = "\t")
print ('Predictor values saved to file ' + str(filename))
closeFunc()
def main(file, oligosList):
data = pd.read_csv(file, sep='\t', header = None)
data = pd.DataFrame(data)
data = data.dropna(axis=1,how='all')
classifier = data.tail(1)
del classifier[0]
classifier = classifier.unstack()
dataCropped = pd.DataFrame([])
for i in oligosList:
for j in data[0]:
if j == i:
dt = data.loc[data[0] == i]
dataCropped = pd.concat([dataCropped, dt])
data = pd.DataFrame.transpose(dataCropped)
#Removing oligo names from dataset
data = data.drop(data.index[0])
#For selecting a number of best features:
#result = SelectKBest(f_classif, k="all").fit_transform(data, classifier)
result = f_classif(data, classifier)
Fval = result[0]
Pval = result[1]
result = pd.DataFrame(data = [Fval, Pval], index = ['F-score', 'p-value'], columns = oligosList)
#print(result)
outputFile(result, file, oligosList)
closeFunc()
if __name__ == "__main__":
oligoslist = oligosList()
file = input('Input data file path: ')
main(file, oligoslist)
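For reference, ``f_classif`` scores each feature with a one-way ANOVA F statistic. A pure-NumPy sketch of that statistic (the scikit-learn call above also returns p-values, which this omits):

```python
import numpy as np

def anova_f(feature, labels):
    # One-way ANOVA F: between-group mean square over within-group
    # mean square, computed for a single feature column.
    feature = np.asarray(feature, dtype=float)
    labels = np.asarray(labels)
    groups = [feature[labels == c] for c in np.unique(labels)]
    grand = feature.mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(feature) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

print(anova_f([1, 2, 3, 2, 3, 4], [0, 0, 0, 1, 1, 1]))  # -> 1.5
```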
| gpl-3.0 |
rsivapr/scikit-learn | examples/grid_search_text_feature_extraction.py | 5 | 4157 | """
==========================================================
Sample pipeline for text feature extraction and evaluation
==========================================================
The dataset used in this example is the 20 newsgroups dataset which will be
automatically downloaded and then cached and reused for the document
classification example.
You can adjust the number of categories by giving their names to the dataset
loader, or set them to None to get all 20 of them.
Here is a sample output of a run on a quad-core machine::
Loading 20 newsgroups dataset for categories:
['alt.atheism', 'talk.religion.misc']
1427 documents
2 categories
Performing grid search...
pipeline: ['vect', 'tfidf', 'clf']
parameters:
{'clf__alpha': (1.0000000000000001e-05, 9.9999999999999995e-07),
'clf__n_iter': (10, 50, 80),
'clf__penalty': ('l2', 'elasticnet'),
'tfidf__use_idf': (True, False),
'vect__max_n': (1, 2),
'vect__max_df': (0.5, 0.75, 1.0),
'vect__max_features': (None, 5000, 10000, 50000)}
done in 1737.030s
Best score: 0.940
Best parameters set:
clf__alpha: 9.9999999999999995e-07
clf__n_iter: 50
clf__penalty: 'elasticnet'
tfidf__use_idf: True
vect__max_n: 2
vect__max_df: 0.75
vect__max_features: 50000
"""
# Author: Olivier Grisel <olivier.grisel@ensta.org>
# Peter Prettenhofer <peter.prettenhofer@gmail.com>
# Mathieu Blondel <mathieu@mblondel.org>
# License: BSD 3 clause
from __future__ import print_function
from pprint import pprint
from time import time
import logging
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_extraction.text import TfidfTransformer
from sklearn.linear_model import SGDClassifier
from sklearn.grid_search import GridSearchCV
from sklearn.pipeline import Pipeline
print(__doc__)
# Display progress logs on stdout
logging.basicConfig(level=logging.INFO,
format='%(asctime)s %(levelname)s %(message)s')
###############################################################################
# Load some categories from the training set
categories = [
'alt.atheism',
'talk.religion.misc',
]
# Uncomment the following to do the analysis on all the categories
#categories = None
print("Loading 20 newsgroups dataset for categories:")
print(categories)
data = fetch_20newsgroups(subset='train', categories=categories)
print("%d documents" % len(data.filenames))
print("%d categories" % len(data.target_names))
print()
###############################################################################
# define a pipeline combining a text feature extractor with a simple
# classifier
pipeline = Pipeline([
('vect', CountVectorizer()),
('tfidf', TfidfTransformer()),
('clf', SGDClassifier()),
])
# uncommenting more parameters will give better exploring power but will
# increase processing time in a combinatorial way
parameters = {
'vect__max_df': (0.5, 0.75, 1.0),
#'vect__max_features': (None, 5000, 10000, 50000),
'vect__ngram_range': ((1, 1), (1, 2)), # unigrams or bigrams
#'tfidf__use_idf': (True, False),
#'tfidf__norm': ('l1', 'l2'),
'clf__alpha': (0.00001, 0.000001),
'clf__penalty': ('l2', 'elasticnet'),
#'clf__n_iter': (10, 50, 80),
}
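The combinatorial growth warned about in the comment is easy to quantify: the number of candidates is the product of the option counts. A small sketch (``grid_size`` is a hypothetical helper, not part of scikit-learn):

```python
def grid_size(param_grid):
    # Number of candidates GridSearchCV will fit per CV fold:
    # the product of the per-parameter option counts.
    size = 1
    for values in param_grid.values():
        size *= len(values)
    return size

active = {
    'vect__max_df': (0.5, 0.75, 1.0),
    'vect__ngram_range': ((1, 1), (1, 2)),
    'clf__alpha': (0.00001, 0.000001),
    'clf__penalty': ('l2', 'elasticnet'),
}
print(grid_size(active))  # -> 24
```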
if __name__ == "__main__":
# multiprocessing requires the fork to happen in a __main__ protected
# block
# find the best parameters for both the feature extraction and the
# classifier
grid_search = GridSearchCV(pipeline, parameters, n_jobs=-1, verbose=1)
print("Performing grid search...")
print("pipeline:", [name for name, _ in pipeline.steps])
print("parameters:")
pprint(parameters)
t0 = time()
grid_search.fit(data.data, data.target)
print("done in %0.3fs" % (time() - t0))
print()
print("Best score: %0.3f" % grid_search.best_score_)
print("Best parameters set:")
best_parameters = grid_search.best_estimator_.get_params()
for param_name in sorted(parameters.keys()):
print("\t%s: %r" % (param_name, best_parameters[param_name]))
| bsd-3-clause |
chris-nemeth/pseudo-extended-mcmc-code | Section_4.3-Sparse_logistic_regression_with_horseshoe_priors/main.py | 1 | 4141 | #This script tests out the horseshoe prior for variable selection based on Piironen and Vehtari (2017).
import scipy.io as spio
import numpy as np
import pystan
from scipy.stats import cauchy, norm
from matplotlib import pyplot as plt
import csv
#Load the data
mat = spio.loadmat('colon.mat', squeeze_me=True) #or 'prostate.mat' or 'leukemia.mat'
Y = mat['Y'] #responses
Y = (Y+1)/2
X = mat['X'] #covariates
X = X/2
num_obs, dim = X.shape
#Split into test and training 80/20
train = set(np.random.choice(num_obs,np.floor(0.8*num_obs).astype(int),replace=False))
idd = set(range(num_obs))
test = list(idd - train)
train = list(train)
Xtrain = X[train]
Xtest = X[test]
Ytrain = Y[train]
Ytest = Y[test]
num_obs, dim = Xtrain.shape
p0 = 3.0 #colon
#p0 = 200.0 #prostate
#p0 = 55.0 #leukemia
#Functions
def inv_logit(f):
return 1.0/(1.0+np.exp(-f))
def log_like(Y,x,beta0,beta):
f = inv_logit(beta0+np.matmul(x,beta))
logl = np.nansum(Y*np.log(f) + (1.0-Y)*np.log(1.0-f))
return logl
def log_priors(beta0,beta,tau,lambdas,tau0,sigma,scale_icept):
return np.sum(cauchy.logpdf(tau,0,tau0) + cauchy.logpdf(lambdas,0,sigma) + norm.logpdf(beta0,0,scale_icept) + norm.logpdf(beta,tau*lambdas))
def log_post(Y,x,beta0,beta,tau,lambdas,tau0,sigma,scale_icept):
return log_like(Y,x,beta0,beta) + log_priors(beta0,beta,tau,lambdas,tau0,sigma,scale_icept)
def log_pred(Y,x,beta0,beta):
f = inv_logit(beta0+np.matmul(x,beta))
logp = Y*np.log(f) + (1.0-Y)*np.log(1.0-f)
return np.nansum(logp)
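Note that ``inv_logit`` as written overflows ``exp`` for large negative ``f``. A numerically stable variant, shown as a sketch (the original script keeps the naive form):

```python
import numpy as np

def stable_inv_logit(f):
    # Same value as 1/(1 + exp(-f)), but never exponentiates a large
    # positive number: split on the sign of f.
    f = np.asarray(f, dtype=float)
    out = np.empty_like(f)
    pos = f >= 0
    out[pos] = 1.0 / (1.0 + np.exp(-f[pos]))
    ef = np.exp(f[~pos])
    out[~pos] = ef / (1.0 + ef)
    return out

print(stable_inv_logit(np.array([-1000.0, 0.0, 1000.0])))  # values 0.0, 0.5, 1.0
```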
#Compile the Stan code
hmcModel = pystan.StanModel(file="hmc_regularised_horseshoe_classification.stan")
peModel = pystan.StanModel(file="pseudo-extended_regularised_horseshoe_classification.stan")
peModelFixedBeta = pystan.StanModel(file="pseudo-extended_regularised_horseshoe_classification_fixedBeta.stan")
#-----------------------------------------------------------------------
#Run standard Stan model
iterations = 1000
#regularised
sigma = 2
data = {'n': num_obs,
'd': dim,
'y': Ytrain.astype(int),
'x': Xtrain,
'scale_icept': 10.0,
'scale_global': (p0/((dim-p0))*(sigma/np.sqrt(num_obs))),
'nu_global': 1.0,
'nu_local': 1.0,
'slab_scale': 2.0,
'slab_df': 3}
fit1 = hmcModel.sampling(data=data, iter=iterations, chains=1)
output1 = fit1.extract()
beta0HMC = output1['beta0']
betaHMC = output1['beta']
#---------------------------------------------------------------------
#Run standard pseudo-extended model
num_particles = 2
#regularised
data = {'n': num_obs,
'd': dim,
'y': Ytrain.astype(int),
'x': Xtrain,
'scale_icept': 10.0,
'scale_global': (p0/((dim-p0))*(sigma/np.sqrt(num_obs))),
'nu_global': 1.0,
'nu_local': 1.0,
'slab_scale': 2.0,
'slab_df': 3,
'N': num_particles}
fit2 = peModel.sampling(data=data, iter=iterations, chains=1)
output2 = fit2.extract()
beta0PE = output2['beta0']
betaPE = output2['beta']
idx = output2['index']
beta0PE = np.array([beta0PE[i,idx[i].astype(int)] for i in range(iterations//2)])
betaPE = np.array([betaPE[i,idx[i].astype(int),:] for i in range(iterations//2)])
#------------------------------------------------------------------
#Run pseudo-extended model with fixed beta
num_particles = 2
gamma = 0.25 #what is referred to in the paper as \beta
#regularised
data = {'n': num_obs,
'd': dim,
'y': Ytrain.astype(int),
'x': Xtrain,
'scale_icept': 10.0,
'scale_global': (p0/((dim-p0))*(sigma/np.sqrt(num_obs))),
'nu_global': 1.0,
'nu_local': 1.0,
'slab_scale': 2.0,
'slab_df': 3,
'N': num_particles,
'gamma': gamma}
fit3 = peModelFixedBeta.sampling(data=data, iter=iterations, chains=1)
output3 = fit3.extract()
beta0PEfixedBeta = output3['beta0']
betaPEfixedBeta = output3['beta']
idx = output3['index']
beta0PEfixedBeta = np.array([beta0PEfixedBeta[i,idx[i].astype(int)] for i in range(iterations//2)])
betaPEfixedBeta = np.array([betaPEfixedBeta[i,idx[i].astype(int),:] for i in range(iterations//2)])
| gpl-3.0 |
meduz/scikit-learn | examples/linear_model/plot_sgd_separating_hyperplane.py | 84 | 1221 | """
=========================================
SGD: Maximum margin separating hyperplane
=========================================
Plot the maximum margin separating hyperplane within a two-class
separable dataset using a linear Support Vector Machines classifier
trained using SGD.
"""
print(__doc__)
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import SGDClassifier
from sklearn.datasets.samples_generator import make_blobs
# we create 50 separable points
X, Y = make_blobs(n_samples=50, centers=2, random_state=0, cluster_std=0.60)
# fit the model
clf = SGDClassifier(loss="hinge", alpha=0.01, n_iter=200, fit_intercept=True)
clf.fit(X, Y)
# plot the line, the points, and the nearest vectors to the plane
xx = np.linspace(-1, 5, 10)
yy = np.linspace(-1, 5, 10)
X1, X2 = np.meshgrid(xx, yy)
Z = np.empty(X1.shape)
for (i, j), val in np.ndenumerate(X1):
x1 = val
x2 = X2[i, j]
p = clf.decision_function([[x1, x2]])
Z[i, j] = p[0]
levels = [-1.0, 0.0, 1.0]
linestyles = ['dashed', 'solid', 'dashed']
colors = 'k'
plt.contour(X1, X2, Z, levels, colors=colors, linestyles=linestyles)
plt.scatter(X[:, 0], X[:, 1], c=Y, cmap=plt.cm.Paired)
plt.axis('tight')
plt.show()
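The nested loop above evaluates ``w . x + b`` point by point; the same linear decision function in vectorised NumPy (a sketch for intuition, not ``SGDClassifier`` itself):

```python
import numpy as np

def linear_decision(w, b, X):
    # w . x + b for each row of X: positive on one side of the
    # separating hyperplane, negative on the other.
    return np.asarray(X, dtype=float) @ np.asarray(w, dtype=float) + b

print(linear_decision([1.0, -1.0], 0.5, [[2.0, 1.0], [0.0, 0.0]]))  # -> [1.5 0.5]
```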
| bsd-3-clause |
galad-loth/LearnDescriptor | patchmatch/test_kpt_match.py | 1 | 2223 | # -*- coding: utf-8 -*-
"""
Created on Tue Oct 07 10:10:15 2018
@author: galad-loth
"""
import numpy as npy
from matplotlib import pyplot as plt
import cv2
from cnn_desc import get_cnn_desc
img1=cv2.imread(r"D:\_Datasets\VGGAffine\ubc\img1.ppm",cv2.IMREAD_COLOR)
img2=cv2.imread(r"D:\_Datasets\VGGAffine\ubc\img2.ppm",cv2.IMREAD_COLOR)
gray1=cv2.cvtColor(img1,cv2.COLOR_BGR2GRAY)
gray2=cv2.cvtColor(img2,cv2.COLOR_BGR2GRAY)
gap_width=20
black_gap=npy.zeros((img1.shape[0],gap_width),dtype=npy.uint8)
border_width=64
kpt_mask=npy.zeros(gray1.shape, dtype=npy.uint8)
kpt_mask[border_width:-border_width,border_width:-border_width]=1
sift = cv2.xfeatures2d.SIFT_create(nfeatures=200)
kpt1 = sift.detect(gray1,kpt_mask)
#kpt1, desc1 = sift.compute(gray1,kpt1)
kpt1, desc1=get_cnn_desc(gray1, kpt1)
kpt2 = sift.detect(gray2,kpt_mask)
#kpt2, desc2 = sift.compute(gray2,kpt2)
kpt2, desc2=get_cnn_desc(gray2, kpt2)
matcher = cv2.BFMatcher(cv2.NORM_L2SQR)
match_pairs = matcher.knnMatch(desc1,desc2,k=2)
good_matches=[]
for bm1,bm2 in match_pairs:
if bm1.distance < 0.7*bm2.distance:
good_matches.append(bm1)
good_matches=sorted(good_matches, key = lambda x:x.distance)
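The 0.7 threshold above is Lowe's ratio test. Isolated as a helper, with a namedtuple standing in for ``cv2.DMatch`` (illustration only):

```python
from collections import namedtuple

Match = namedtuple('Match', 'distance')  # stand-in for cv2.DMatch

def ratio_test(knn_pairs, ratio=0.7):
    # Keep the best match only when it is clearly better than the
    # runner-up, i.e. best.distance < ratio * second_best.distance.
    return [m1 for m1, m2 in knn_pairs if m1.distance < ratio * m2.distance]

pairs = [(Match(1.0), Match(2.0)), (Match(1.5), Match(2.0))]
print(len(ratio_test(pairs)))  # -> 1
```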
if len(good_matches)>10:
pts_from = npy.float32([kpt1[bm.queryIdx].pt for bm in good_matches]).reshape(-1,1,2)
pts_to = npy.float32([kpt2[bm.trainIdx].pt for bm in good_matches]).reshape(-1,1,2)
mat_H, match_mask = cv2.findHomography(pts_from, pts_to, cv2.RANSAC,5.0)
imgcnb=npy.concatenate((gray1,black_gap,gray2),axis=1)
plt.figure(1,figsize=(15,6))
plt.imshow(imgcnb,cmap="gray")
idx=0
for bm in good_matches[:30]:
if 1==match_mask[idx]:
kpt_from=kpt1[bm.queryIdx]
kpt_to=kpt2[bm.trainIdx]
plt.plot(kpt_from.pt[0],kpt_from.pt[1],"rs",
markerfacecolor="none",markeredgecolor="r",markeredgewidth=2)
plt.plot(kpt_to.pt[0]+img1.shape[1]+gap_width,kpt_to.pt[1],"bo",
markerfacecolor="none",markeredgecolor="b",markeredgewidth=2)
plt.plot([kpt_from.pt[0],kpt_to.pt[0]+img1.shape[1]+gap_width],
[kpt_from.pt[1],kpt_to.pt[1]],"c-",linewidth=2)
idx+=1
plt.axis("off")
| apache-2.0 |
mmottahedi/neuralnilm_prototype | scripts/e488.py | 2 | 6822 | from __future__ import print_function, division
import matplotlib
import logging
from sys import stdout
matplotlib.use('Agg') # Must be before importing matplotlib.pyplot or pylab!
from neuralnilm import (Net, RealApplianceSource,
BLSTMLayer, DimshuffleLayer,
BidirectionalRecurrentLayer)
from neuralnilm.source import (standardise, discretize, fdiff, power_and_fdiff,
RandomSegments, RandomSegmentsInMemory,
SameLocation)
from neuralnilm.experiment import run_experiment, init_experiment
from neuralnilm.net import TrainingError
from neuralnilm.layers import (MixtureDensityLayer, DeConv1DLayer,
SharedWeightsDenseLayer)
from neuralnilm.objectives import (scaled_cost, mdn_nll,
scaled_cost_ignore_inactive, ignore_inactive,
scaled_cost3)
from neuralnilm.plot import MDNPlotter, CentralOutputPlotter, Plotter
from neuralnilm.updates import clipped_nesterov_momentum
from neuralnilm.disaggregate import disaggregate
from lasagne.nonlinearities import sigmoid, rectify, tanh, identity
from lasagne.objectives import mse, binary_crossentropy
from lasagne.init import Uniform, Normal, Identity
from lasagne.layers import (LSTMLayer, DenseLayer, Conv1DLayer,
ReshapeLayer, FeaturePoolLayer, RecurrentLayer)
from lasagne.layers.batch_norm import BatchNormLayer
from lasagne.updates import nesterov_momentum, momentum
from functools import partial
import os
import __main__
from copy import deepcopy
from math import sqrt
import numpy as np
import theano.tensor as T
import gc
"""
447: first attempt at disaggregation
"""
NAME = os.path.splitext(os.path.split(__main__.__file__)[1])[0]
#PATH = "/homes/dk3810/workspace/python/neuralnilm/figures"
PATH = "/data/dk3810/figures"
SAVE_PLOT_INTERVAL = 1000
N_SEQ_PER_BATCH = 64
source_dict = dict(
filename='/data/dk3810/ukdale.h5',
window=("2013-03-18", None),
train_buildings=[1, 2, 3, 4, 5],
validation_buildings=[1, 2, 3, 4, 5],
n_seq_per_batch=N_SEQ_PER_BATCH,
standardise_input=True,
standardise_targets=True,
independently_center_inputs=True,
subsample_target=8,
ignore_incomplete=True
# offset_probability=0.5,
# ignore_offset_activations=True
)
net_dict = dict(
save_plot_interval=SAVE_PLOT_INTERVAL,
# loss_function=partial(ignore_inactive, loss_func=mdn_nll, seq_length=SEQ_LENGTH),
# loss_function=lambda x, t: mdn_nll(x, t).mean(),
# loss_function=lambda x, t: (mse(x, t) * MASK).mean(),
loss_function=lambda x, t: mse(x, t).mean(),
# loss_function=lambda x, t: binary_crossentropy(x, t).mean(),
# loss_function=partial(scaled_cost, loss_func=mse),
# loss_function=ignore_inactive,
# loss_function=partial(scaled_cost3, ignore_inactive=False),
# updates_func=momentum,
updates_func=clipped_nesterov_momentum,
updates_kwargs={'clip_range': (0, 10)},
learning_rate=1e-2,
learning_rate_changes_by_iteration={
1000: 1e-3,
5000: 1e-4
},
do_save_activations=True,
auto_reshape=False,
# plotter=CentralOutputPlotter
plotter=Plotter(n_seq_to_plot=32)
)
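``learning_rate_changes_by_iteration`` describes a piecewise-constant schedule; a hedged sketch of the lookup it implies (the actual neuralnilm trainer applies this internally):

```python
def learning_rate_at(iteration, base_lr, changes):
    # Piecewise-constant schedule: use the most recent change point at
    # or before `iteration`, otherwise the base rate.
    lr = base_lr
    for start in sorted(changes):
        if iteration >= start:
            lr = changes[start]
    return lr

changes = {1000: 1e-3, 5000: 1e-4}
print(learning_rate_at(0, 1e-2, changes),
      learning_rate_at(2500, 1e-2, changes),
      learning_rate_at(9000, 1e-2, changes))  # -> 0.01 0.001 0.0001
```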
def exp_a(name, target_appliance, seq_length):
global source
source_dict_copy = deepcopy(source_dict)
source_dict_copy.update(dict(
target_appliance=target_appliance,
logger=logging.getLogger(name),
seq_length=seq_length
))
source = RandomSegmentsInMemory(**source_dict_copy)
net_dict_copy = deepcopy(net_dict)
net_dict_copy.update(dict(
experiment_name=name,
source=source
))
NUM_FILTERS = 4
target_seq_length = seq_length // source.subsample_target
net_dict_copy['layers_config'] = [
{
'type': DimshuffleLayer,
'pattern': (0, 2, 1) # (batch, features, time)
},
{
'label': 'conv0',
'type': Conv1DLayer, # convolve over the time axis
'num_filters': NUM_FILTERS,
'filter_length': 4,
'stride': 1,
'nonlinearity': None,
'border_mode': 'valid'
},
{
'type': DimshuffleLayer,
'pattern': (0, 2, 1) # back to (batch, time, features)
},
{
'label': 'dense0',
'type': DenseLayer,
'num_units': (seq_length - 3) * NUM_FILTERS,
'nonlinearity': rectify
},
{
'label': 'dense2',
'type': DenseLayer,
'num_units': 128,
'nonlinearity': rectify
},
{
'type': DenseLayer,
'num_units': (target_seq_length - 3) * NUM_FILTERS,
'nonlinearity': rectify
},
{
'type': ReshapeLayer,
'shape': (N_SEQ_PER_BATCH, target_seq_length - 3, NUM_FILTERS)
},
{
'type': DimshuffleLayer,
'pattern': (0, 2, 1) # (batch, features, time)
},
{
'type': DeConv1DLayer,
'num_output_channels': 1,
'filter_length': 4,
'stride': 1,
'nonlinearity': None,
'border_mode': 'full'
},
{
'type': DimshuffleLayer,
'pattern': (0, 2, 1) # back to (batch, time, features)
}
]
net = Net(**net_dict_copy)
return net
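The ``seq_length - 3`` and ``target_seq_length - 3`` sizes above follow from 'valid'-mode convolution with ``filter_length=4``. The general relation, as a sketch:

```python
def valid_conv_output_length(n, filter_length, stride=1):
    # Output length of a 'valid'-mode 1-D convolution: the filter must
    # fit entirely inside the input at every position.
    return (n - filter_length) // stride + 1

print(valid_conv_output_length(512, 4))  # -> 509, i.e. seq_length - 3
```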
def main():
APPLIANCES = [
('a', ['fridge freezer', 'fridge'], 512),
('b', "'coffee maker'", 512),
('c', "'dish washer'", 2000),
('d', "'hair dryer'", 256),
('e', "'kettle'", 256),
('f', "'oven'", 2000),
('g', "'toaster'", 256),
('h', "'light'", 2000),
('i', ['washer dryer', 'washing machine'], 1500)
]
for experiment, appliance, seq_length in APPLIANCES[:1]:
full_exp_name = NAME + experiment
func_call = init_experiment(PATH, 'a', full_exp_name)
func_call = func_call[:-1] + ", {}, {})".format(appliance, seq_length)
logger = logging.getLogger(full_exp_name)
try:
net = eval(func_call)
run_experiment(net, epochs=None)
except KeyboardInterrupt:
logger.info("KeyboardInterrupt")
break
except Exception as exception:
logger.exception("Exception")
# raise
else:
del net.source
del net
gc.collect()
finally:
logging.shutdown()
if __name__ == "__main__":
main()
"""
Emacs variables
Local Variables:
compile-command: "cp /home/jack/workspace/python/neuralnilm/scripts/e488.py /mnt/sshfs/imperial/workspace/python/neuralnilm/scripts/"
End:
"""
| mit |
astroclark/bhextractor | bin/libbhex_posteriors.py | 1 | 9464 | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright (C) 2015-2016 James Clark <james.clark@ligo.org>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
"""
bhextractor_reconstruct.py
Loads a posterior samples file into a BayesPostProc results object and builds
the waveform. If the injection is provided, matches for each sampled point are
also returned.
"""
from __future__ import division
import os
import sys
import pickle
import matplotlib
matplotlib.use("Agg")
import numpy as np
from matplotlib import pyplot as pl
from optparse import OptionParser
from glue.ligolw import lsctables, table, utils, ligolw, ilwd
table.use_in(ligolw.LIGOLWContentHandler)
lsctables.use_in(ligolw.LIGOLWContentHandler)
import lal
import lalmetaio
import lalinspiral
import pycbc.types
import pycbc.filter
import pycbc.inject
from pycbc.psd import aLIGOZeroDetHighPower
from pylal import Fr
from pylal import bayespputils as bppu
import triangle
from bhex_utils import bhex_wavedata as bwave
from bhex_utils import bhex_pca as bpca
def parser():
# --- Command line input
parser = OptionParser()
parser.add_option("-f", "--sim-inspiral", type=str, default=None)
parser.add_option("-n", "--npcs", type=int, default=1)
parser.add_option("-t", "--event", type=int, default=0)
parser.add_option("-d", "--delta-t", type=float, default=1./512)
parser.add_option("-L", "--datalen", type=float, default=4.0)
(opts,args) = parser.parse_args()
if len(args)==0:
print >> sys.stderr, "Must supply a posterior samples file as a commandline argument"
sys.exit()
if not os.path.isfile(args[0]):
print >> sys.stderr, "posterior samples file requested: %s does not exist"%args[0]
sys.exit()
if opts.sim_inspiral is not None:
if not os.path.isfile(opts.sim_inspiral):
print >> sys.stderr, "sim-inspiral file not found at: %s"%opts.sim_inspiral
sys.exit()
return opts, args
def compute_match(reconstruction, injection, delta_t=1./512):
rec_td = pycbc.types.TimeSeries(reconstruction, delta_t=delta_t)
inj_td = pycbc.types.TimeSeries(injection, delta_t=delta_t)
rec_fd = rec_td.to_frequencyseries()
inj_fd = inj_td.to_frequencyseries()
psd = aLIGOZeroDetHighPower(len(rec_fd), delta_f=inj_fd.delta_f,
low_freq_cutoff=1)
return pycbc.filter.match(rec_fd, inj_fd, psd=psd,
low_frequency_cutoff=10)[0]
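``compute_match`` is a PSD-weighted overlap maximised over time and phase; the unweighted core of the same idea in plain NumPy (a sketch for intuition, not a replacement):

```python
import numpy as np

def simple_overlap(a, b):
    # Normalised inner product <a, b> / (|a| |b|): 1 for identical
    # waveforms up to scale, 0 for orthogonal ones.
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(simple_overlap([1.0, 0.0], [3.0, 0.0]),
      simple_overlap([1.0, 0.0], [0.0, 1.0]))  # -> 1.0 0.0
```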
def copy_sim_inspiral( row ):
"""
Turn a lsctables.SimInspiral into a SWIG wrapped lalburst.SimInspiral
"""
swigrow = lalmetaio.SimInspiralTable()
for simattr in lsctables.SimInspiralTable.validcolumns.keys():
if simattr in ["waveform", "source", "numrel_data", "taper"]:
# unicode -> char* doesn't work
setattr( swigrow, simattr, str(getattr(row, simattr)) )
else:
setattr( swigrow, simattr, getattr(row, simattr) )
# FIXME: This doesn't get copied properly, and so is done manually here.
swigrow.geocent_end_time = lal.LIGOTimeGPS(row.geocent_end_time, row.geocent_end_time_ns)
return swigrow
def get_sims( sim_inspiral_file ):
"""
Return a list of swig-wrapped siminspiral table entries
"""
xmldoc = utils.load_filename(sim_inspiral_file,
contenthandler=ligolw.LIGOLWContentHandler)
sims = []
try:
sim_insp = table.get_table( xmldoc,
lsctables.SimInspiralTable.tableName )
sims.extend( map(copy_sim_inspiral, sim_insp) )
except ValueError:
if opts.verbose:
print >> sys.stderr, "No SimInspiral table found, \
skipping..."
return sims
# --- MAIN
def main():
opts, args = parser()
posfile = args[0]
if opts.sim_inspiral is not None:
# Build the injected waveform
#
# Parse sim_inspiral table
#
sims = get_sims(opts.sim_inspiral)
# Sim_inspiral row for this event:
this_sim = sims[opts.event]
#
# # Retrieve the injected strain data
# Hp, Hc = lalinspiral.NRInjectionFromSimInspiral(this_sim, opts.delta_t)
# hp = pycbc.types.TimeSeries(Hp.data.data[:], delta_t=Hp.delta_t,
# epoch=Hp.epoch)
# hc = pycbc.types.TimeSeries(Hc.data.data[:], delta_t=Hc.delta_t,
# epoch=Hc.epoch)
# Injection set object with details (and methods) for injections
injSet = pycbc.inject.InjectionSet(opts.sim_inspiral)
h1_epoch = this_sim.h_end_time + 1e-9*this_sim.h_end_time_ns \
-0.5*opts.datalen
h1_injection = pycbc.types.TimeSeries(
np.zeros(opts.datalen/opts.delta_t),
delta_t=opts.delta_t, epoch=h1_epoch
)
h1_injection = injSet.apply(h1_injection, 'H1')
#
# Load and parse posterior samples file
#
peparser = bppu.PEOutputParser('common')
resultsObj = peparser.parse(open(posfile, 'r'))
posterior = bppu.Posterior(resultsObj)
return this_sim
sys.exit()
#
# Load PC data
#
try:
pcs_path=os.environ['BHEX_PREFIX']+"/data/PCA_data"
except KeyError:
print >> sys.stderr, "BHEX_PREFIX environment variable appears to be un-set"
print >> sys.stdout, "Reading PCA file"
pcadata = pickle.load(open("%s/%s-PCA.pickle"%(pcs_path,
opts.catalogue),'r'))
# basis_functions = pcadata.pca_plus.components_
print >> sys.stdout, "Read PCA file"
#
# Compute matches
#
amp_betas = pcadata.projection_amp[opts.injection_name]
phase_betas = pcadata.projection_phase[opts.injection_name]
if opts.injection_file is not None:
print 'Loading injection file for match calculations (from LIB)'
injwav = np.loadtxt(opts.injection_file)[:,1]
else:
print 'Loading injection file for match calculations (from PCs)'
reconstructed_amp = bhex.reconstruct_waveform(pcadata.pca_amp,
amp_betas, len(amp_betas), mtotal_ref=opts.mtotal_ref,
mtotal_target=opts.target_mass)
reconstructed_phase = bhex.reconstruct_waveform(pcadata.pca_phase,
phase_betas, len(phase_betas), mtotal_ref=opts.mtotal_ref,
mtotal_target=opts.target_mass)
injwav = reconstructed_amp * np.exp(1j*reconstructed_phase)
print 'Computing matches: '
#
# Reconstruct the waveforms from the samples
#
reconstruction_results = reconstructions(posterior, pcadata, opts.npcs,
injwav)
# Get nominal reconstruction for this number of PCs
reconstructed_amp = bhex.reconstruct_waveform(pcadata.pca_amp,
amp_betas, opts.npcs, mtotal_ref=opts.mtotal_ref,
mtotal_target=opts.target_mass)
reconstructed_phase = bhex.reconstruct_waveform(pcadata.pca_phase,
phase_betas, opts.npcs, mtotal_ref=opts.mtotal_ref,
mtotal_target=opts.target_mass)
nomwav = reconstructed_amp*np.exp(1j*reconstructed_phase)
print compute_match(np.real(nomwav), np.real(injwav), delta_t=1.0/512)
#
# Plots
#
nother=7 # increase this for new parameters
samples = np.zeros(shape=(len(posterior), 2*opts.npcs+nother)) #x2 for amp/phase
samples[:,0] = np.concatenate(posterior['mtotal'].samples)
#samples[:,0] = reconstruction_results['matches']
samples[:,1] = reconstruction_results['matches']
samples[:,2] = np.concatenate(posterior['time'].samples) - opts.trig_time
samples[:,3] = np.concatenate(posterior['hrss'].samples)
samples[:,4] = np.concatenate(posterior['theta_jn'].samples)
samples[:,5] = np.concatenate(posterior['phi_orb'].samples)
samples[:,6] = np.concatenate(posterior['psi'].samples)
#samples[:,7] = np.concatenate(posterior['ra'].samples)
#samples[:,8] = np.concatenate(posterior['dec'].samples)
b=1
for n in xrange(nother,opts.npcs+nother):
samples[:,n] = np.concatenate(posterior['amp_beta'+str(b)].samples)
b+=1
b=1
for n in xrange(opts.npcs+nother,2*opts.npcs+nother):
samples[:,n] = np.concatenate(posterior['phase_beta'+str(b)].samples)
b+=1
labels=['M$_{\\rm total}$', 'Match', 'Time', 'hrss', 'inclination', 'phase',
'$\\Psi$', 'A,$\\beta_1$', '$\\Phi$,$\\beta_1$', 'A,$\\beta_2$',
'$\\Phi$,$\\beta_2$', 'A,$\\beta_3$', '$\\Phi$,$\\beta_3$']
#'$\\Psi$', 'ra', 'dec', 'A,$\\beta_1$', '$\\Phi$,$\\beta_1$', 'A,$\\beta_2$',
trifig = triangle.corner(samples, labels=labels[:2*opts.npcs+nother],
quantiles=[0.25, 0.5, 0.75])
trifig.savefig('%s'%(args[0].replace('.dat','.png')))
return reconstruction_results, posterior
#
# End definitions
#
if __name__ == "__main__":
ans = main()
| gpl-2.0 |
jstac/recursive_utility_code | python/long_run_risk/src/stability_plots.py | 1 | 1055 |
import matplotlib.pyplot as plt
import numpy as np
def stability_plot(R,
x, y,
xlb, ylb,
txt_flag="by",
dot_loc=None,
coords=(-225, 30)):
if txt_flag == "by":
text = "Bansal and Yaron"
else:
text = "Schorfheide, Song and Yaron"
if dot_loc is None:
    raise ValueError("dot_loc must be an (x, y) coordinate pair")
param1_value, param2_value = dot_loc
fig, ax = plt.subplots(figsize=(10, 5.7))
cs1 = ax.contourf(x, y, R.T, alpha=0.5)
ctr1 = ax.contour(x, y, R.T, levels=[1.0])
plt.clabel(ctr1, inline=1, fontsize=13)
plt.colorbar(cs1, ax=ax)
ax.annotate(text,
xy=(param1_value, param2_value),
xycoords="data",
xytext=coords,
textcoords="offset points",
fontsize=12,
arrowprops={"arrowstyle" : "->"})
ax.plot([param1_value], [param2_value], "ko", alpha=0.6)
ax.set_xlabel(xlb, fontsize=16)
ax.set_ylabel(ylb, fontsize=16)
plt.savefig("temp.pdf")
plt.show()
| mit |
cms-btv-pog/rootpy | rootpy/plotting/tests/test_root2matplotlib.py | 3 | 2304 | # Copyright 2012 the rootpy developers
# distributed under the terms of the GNU General Public License
from rootpy.plotting import Hist, Hist2D, HistStack, Graph
from nose.plugins.skip import SkipTest
from nose.tools import with_setup
def setup_func():
try:
import matplotlib
except ImportError:
raise SkipTest("matplotlib is not importable")
matplotlib.use('Agg')
from matplotlib import pyplot
pyplot.ioff()
@with_setup(setup_func)
def test_errorbar():
from rootpy.plotting import root2matplotlib as rplt
h = Hist(100, -5, 5)
h.FillRandom('gaus')
g = Graph(h)
rplt.errorbar(g)
rplt.errorbar(h)
@with_setup(setup_func)
def test_bar():
from rootpy.plotting import root2matplotlib as rplt
h = Hist(100, -5, 5)
h.FillRandom('gaus')
rplt.bar(h)
# stack
h1 = h.Clone()
stack = HistStack([h, h1])
rplt.bar(stack)
rplt.bar([h, h1])
@with_setup(setup_func)
def test_hist():
from rootpy.plotting import root2matplotlib as rplt
h = Hist(100, -5, 5)
h.FillRandom('gaus')
rplt.hist(h)
# stack
h1 = h.Clone()
stack = HistStack([h, h1])
rplt.hist(stack)
rplt.hist([h, h1])
@with_setup(setup_func)
def test_hist2d():
from rootpy.plotting import root2matplotlib as rplt
from matplotlib import pyplot
import numpy as np
if not hasattr(pyplot, 'hist2d'):
raise SkipTest("matplotlib is too old")
a = Hist2D(100, -3, 3, 100, 0, 6)
a.fill_array(np.random.multivariate_normal(
mean=(0, 3),
cov=[[1, .5], [.5, 1]],
size=(1000,)))
rplt.hist2d(a)
@with_setup(setup_func)
def test_imshow():
from rootpy.plotting import root2matplotlib as rplt
import numpy as np
a = Hist2D(100, -3, 3, 100, 0, 6)
a.fill_array(np.random.multivariate_normal(
mean=(0, 3),
cov=[[1, .5], [.5, 1]],
size=(1000,)))
rplt.imshow(a)
@with_setup(setup_func)
def test_contour():
from rootpy.plotting import root2matplotlib as rplt
import numpy as np
a = Hist2D(100, -3, 3, 100, 0, 6)
a.fill_array(np.random.multivariate_normal(
mean=(0, 3),
cov=[[1, .5], [.5, 1]],
size=(1000,)))
rplt.contour(a)
if __name__ == "__main__":
import nose
nose.runmodule()
| gpl-3.0 |
wdurhamh/statsmodels | statsmodels/examples/ex_kernel_regression2.py | 34 | 1511 | # -*- coding: utf-8 -*-
"""
Created on Wed Jan 02 13:43:44 2013
Author: Josef Perktold
"""
from __future__ import print_function
import numpy as np
import numpy.testing as npt
import statsmodels.nonparametric.api as nparam
if __name__ == '__main__':
np.random.seed(500)
nobs = [250, 1000][0]
sig_fac = 1
x = np.random.uniform(-2, 2, size=nobs)
x.sort()
y_true = np.sin(x*5)/x + 2*x
y = y_true + sig_fac * (np.sqrt(np.abs(3+x))) * np.random.normal(size=nobs)
model = nparam.KernelReg(endog=[y],
exog=[x], reg_type='lc',
var_type='c', bw='cv_ls',
defaults=nparam.EstimatorSettings(efficient=True))
sm_bw = model.bw
sm_mean, sm_mfx = model.fit()
model1 = nparam.KernelReg(endog=[y],
exog=[x], reg_type='lc',
var_type='c', bw='cv_ls')
mean1, mfx1 = model1.fit()
model2 = nparam.KernelReg(endog=[y],
exog=[x], reg_type='ll',
var_type='c', bw='cv_ls')
mean2, mfx2 = model2.fit()
print(model.bw)
print(model1.bw)
print(model2.bw)
import matplotlib.pyplot as plt
fig = plt.figure()
ax = fig.add_subplot(1,1,1)
ax.plot(x, y, 'o', alpha=0.5)
ax.plot(x, y_true, lw=2, label='DGP mean')
ax.plot(x, sm_mean, lw=2, label='kernel mean')
ax.plot(x, mean2, lw=2, label='kernel mean')
ax.legend()
plt.show()
| bsd-3-clause |
wanderine/nipype | nipype/algorithms/modelgen.py | 1 | 34772 | # emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: nil -*-
# vi: set ft=python sts=4 ts=4 sw=4 et:
"""
The modelgen module provides classes for specifying designs for individual
subject analysis of task-based fMRI experiments. In particular it also includes
algorithms for generating regressors for sparse and sparse-clustered acquisition
experiments.
These functions include:
* SpecifyModel: allows specification of sparse and non-sparse models
Change directory to provide relative paths for doctests
>>> import os
>>> filepath = os.path.dirname( os.path.realpath( __file__ ) )
>>> datadir = os.path.realpath(os.path.join(filepath, '../testing/data'))
>>> os.chdir(datadir)
"""
from copy import deepcopy
import os
from nibabel import load
import numpy as np
from scipy.special import gammaln
from nipype.interfaces.base import (BaseInterface, TraitedSpec, InputMultiPath,
traits, File, Bunch, BaseInterfaceInputSpec,
isdefined)
from nipype.utils.filemanip import filename_to_list
from .. import config, logging
from nipype.external import six
iflogger = logging.getLogger('interface')
def gcd(a, b):
"""Returns the greatest common divisor of two integers
uses Euclid's algorithm
>>> gcd(4, 5)
1
>>> gcd(4, 8)
4
>>> gcd(22, 55)
11
"""
while b > 0: a, b = b, a % b
return a
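This `gcd` is used further down (in `SpecifySparseModel._gen_regress`) to find the coarsest time step, in milliseconds, that evenly tiles the acquisition time, the silent gap, and the TR. A self-contained sketch of that calculation (the timing values are hypothetical):

```python
def gcd(a, b):
    """Greatest common divisor via Euclid's algorithm."""
    while b > 0:
        a, b = b, a % b
    return a

# Hypothetical sparse-acquisition timings, in milliseconds
TR = 6000.0        # repetition time
TA = 2000.0        # acquisition time
SILENCE = TR - TA  # silent gap between acquisitions

# Coarsest grid step that divides TA, SILENCE and TR exactly
dttemp = gcd(TA, gcd(SILENCE, TR))
assert dttemp == 2000.0
```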
def spm_hrf(RT, P=None, fMRI_T=16):
""" python implementation of spm_hrf
see spm_hrf for implementation details
% RT - scan repeat time
% p - parameters of the response function (two gamma
% functions)
% defaults (seconds)
% p(0) - delay of response (relative to onset) 6
% p(1) - delay of undershoot (relative to onset) 16
% p(2) - dispersion of response 1
% p(3) - dispersion of undershoot 1
% p(4) - ratio of response to undershoot 6
% p(5) - onset (seconds) 0
% p(6) - length of kernel (seconds) 32
%
% hrf - hemodynamic response function
% p - parameters of the response function
the following code using scipy.stats.distributions.gamma
doesn't return the same result as the spm_Gpdf function ::
hrf = gamma.pdf(u, p[0]/p[2], scale=dt/p[2]) -
gamma.pdf(u, p[1]/p[3], scale=dt/p[3])/p[4]
>>> print spm_hrf(2)
[ 0.00000000e+00 8.65660810e-02 3.74888236e-01 3.84923382e-01
2.16117316e-01 7.68695653e-02 1.62017720e-03 -3.06078117e-02
-3.73060781e-02 -3.08373716e-02 -2.05161334e-02 -1.16441637e-02
-5.82063147e-03 -2.61854250e-03 -1.07732374e-03 -4.10443522e-04
-1.46257507e-04]
"""
p = np.array([6, 16, 1, 1, 6, 0, 32], dtype=float)
if P is not None:
p[0:len(P)] = P
_spm_Gpdf = lambda x, h, l: np.exp(h * np.log(l) + (h - 1) * np.log(x) - (l * x) - gammaln(h))
# modelled hemodynamic response function - {mixture of Gammas}
dt = RT / float(fMRI_T)
u = np.arange(0, int(p[6] / dt + 1)) - p[5] / dt
hrf = _spm_Gpdf(u, p[0] / p[2], dt / p[2]) - _spm_Gpdf(u, p[1] / p[3],
dt / p[3]) / p[4]
idx = np.arange(0, int((p[6] / RT) + 1)) * fMRI_T
hrf = hrf[idx]
hrf = hrf / np.sum(hrf)
return hrf
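As a quick sanity check on the double-gamma kernel above, a self-contained sketch mirroring the default parameters: the HRF is zero at stimulus onset, peaks a few seconds post-stimulus, and is normalized to unit sum.

```python
import numpy as np
from scipy.special import gammaln

def spm_hrf_sketch(RT, fMRI_T=16):
    # Default SPM parameters: response/undershoot delays 6 and 16 s,
    # dispersions 1 and 1, response:undershoot ratio 6, onset 0,
    # kernel length 32 s (mirrors spm_hrf above)
    p = np.array([6, 16, 1, 1, 6, 0, 32], dtype=float)
    _gpdf = lambda x, h, l: np.exp(h * np.log(l) + (h - 1) * np.log(x)
                                   - (l * x) - gammaln(h))
    dt = RT / float(fMRI_T)
    u = np.arange(0, int(p[6] / dt + 1)) - p[5] / dt
    with np.errstate(divide='ignore', invalid='ignore'):
        hrf = _gpdf(u, p[0] / p[2], dt / p[2]) - \
            _gpdf(u, p[1] / p[3], dt / p[3]) / p[4]
    hrf[np.isnan(hrf)] = 0.0
    # Downsample to the TR grid and normalize to unit sum
    hrf = hrf[np.arange(0, int(p[6] / RT + 1)) * fMRI_T]
    return hrf / np.sum(hrf)

hrf = spm_hrf_sketch(2.0)
assert hrf[0] == 0.0                 # zero at stimulus onset
assert abs(hrf.sum() - 1.0) < 1e-9   # normalized to unit sum
assert int(np.argmax(hrf)) == 3      # peak ~6 s after onset (3 samples * 2 s)
```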
def orth(x_in, y_in):
"""Orthoganlize y_in with respect to x_in
>>> orth_expected = np.array([1.7142857142857144, 0.42857142857142883, \
-0.85714285714285676])
>>> err = np.abs(np.array(orth([1, 2, 3],[4, 5, 6]) - orth_expected))
>>> all(err < np.finfo(float).eps)
True
"""
x = np.array(x_in)[:, None]
y = np.array(y_in)[:, None]
y = y - np.dot(x, np.dot(np.linalg.inv(np.dot(x.T, x)), np.dot(x.T, y)))
if np.linalg.norm(y, 1) > np.exp(-32):
y = y[:, 0].tolist()
else:
y = y_in
return y
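A quick illustration of what `orth` buys you: the returned vector keeps only the part of `y` that is linearly independent of `x`, so its projection onto `x` vanishes. A self-contained sketch using the same least-squares formula as above:

```python
import numpy as np

x_in, y_in = [1, 2, 3], [4, 5, 6]
x = np.array(x_in, dtype=float)[:, None]
y = np.array(y_in, dtype=float)[:, None]

# Remove the component of y that lies along x: y - x (x'x)^{-1} x'y
y_orth = y - x.dot(np.linalg.inv(x.T.dot(x))).dot(x.T.dot(y))

# The residual is orthogonal to x ...
assert abs(float(x.T.dot(y_orth))) < 1e-10
# ... and matches the doctest values above
expected = np.array([1.7142857142857144, 0.42857142857142883,
                     -0.85714285714285676])
assert np.allclose(y_orth.ravel(), expected)
```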
def scale_timings(timelist, input_units, output_units, time_repetition):
"""Scales timings given input and output units (scans/secs)
Parameters
----------
timelist: list of times to scale
input_units: 'secs' or 'scans'
output_units: Ibid.
time_repetition: float in seconds
"""
if input_units == output_units:
_scalefactor = 1.
if (input_units == 'scans') and (output_units == 'secs'):
_scalefactor = time_repetition
if (input_units == 'secs') and (output_units == 'scans'):
_scalefactor = 1./time_repetition
timelist = [np.max([0., _scalefactor * t]) for t in timelist]
return timelist
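For example, with a TR of 2.5 s, onsets given in scans map to seconds by multiplying by the TR, the reverse direction divides, and negative results are clipped to 0 as above. A self-contained sketch of the same logic:

```python
import numpy as np

def scale_timings_sketch(timelist, input_units, output_units, time_repetition):
    # Mirror of scale_timings above: scans <-> secs via the TR
    _scalefactor = 1.0
    if (input_units == 'scans') and (output_units == 'secs'):
        _scalefactor = time_repetition
    if (input_units == 'secs') and (output_units == 'scans'):
        _scalefactor = 1.0 / time_repetition
    # Scale, clipping any negative timing to 0
    return [np.max([0.0, _scalefactor * t]) for t in timelist]

assert scale_timings_sketch([0, 4, 10], 'scans', 'secs', 2.5) == [0.0, 10.0, 25.0]
assert scale_timings_sketch([5.0, -1.0], 'secs', 'scans', 2.5) == [2.0, 0.0]
```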
def gen_info(run_event_files):
"""Generate subject_info structure from a list of event files
"""
info = []
for i, event_files in enumerate(run_event_files):
runinfo = Bunch(conditions=[], onsets=[], durations=[], amplitudes=[])
for event_file in event_files:
_, name = os.path.split(event_file)
if '.run' in name:
name, _ = name.split('.run%03d' % (i+1))
elif '.txt' in name:
name, _ = name.split('.txt')
runinfo.conditions.append(name)
event_info = np.atleast_2d(np.loadtxt(event_file))
runinfo.onsets.append(event_info[:, 0].tolist())
if event_info.shape[1] > 1:
runinfo.durations.append(event_info[:, 1].tolist())
else:
runinfo.durations.append([0])
if event_info.shape[1] > 2:
runinfo.amplitudes.append(event_info[:, 2].tolist())
else:
delattr(runinfo, 'amplitudes')
info.append(runinfo)
return info
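The condition name here is recovered purely from the event filename, using the `event_name.runXXX` convention described in `SpecifyModel`'s docstring. A self-contained sketch of that name-splitting step (the file path is hypothetical):

```python
import os

# Hypothetical event file following the "event_name.runXXX" convention
event_file = '/data/sub01/Words.run001.txt'
i = 0  # index of the first run

_, name = os.path.split(event_file)
if '.run' in name:
    # Strip the ".run001.txt" suffix, keeping the condition name
    name, _ = name.split('.run%03d' % (i + 1))
elif '.txt' in name:
    name, _ = name.split('.txt')

assert name == 'Words'
```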
class SpecifyModelInputSpec(BaseInterfaceInputSpec):
subject_info = InputMultiPath(Bunch, mandatory=True, xor=['subject_info',
'event_files'],
desc=("Bunch or List(Bunch) subject specific condition information. "
"see :ref:`SpecifyModel` or SpecifyModel.__doc__ for details"))
event_files = InputMultiPath(traits.List(File(exists=True)), mandatory=True,
xor=['subject_info', 'event_files'],
desc=('list of event description files 1, 2 or 3 column format '
'corresponding to onsets, durations and amplitudes'))
realignment_parameters = InputMultiPath(File(exists=True),
desc="Realignment parameters returned by motion correction algorithm",
copyfile=False)
outlier_files = InputMultiPath(File(exists=True),
desc="Files containing scan outlier indices that should be tossed",
copyfile=False)
functional_runs = InputMultiPath(traits.Either(traits.List(File(exists=True)),
File(exists=True)),
mandatory=True,
desc=("Data files for model. List of 4D files or list of list of 3D "
"files per session"), copyfile=False)
input_units = traits.Enum('secs', 'scans', mandatory=True,
desc=("Units of event onsets and durations (secs or scans). Output "
"units are always in secs"))
high_pass_filter_cutoff = traits.Float(mandatory=True,
desc="High-pass filter cutoff in secs")
time_repetition = traits.Float(mandatory=True,
desc=("Time between the start of one volume to the start of "
"the next image volume."))
# Not implemented yet
#polynomial_order = traits.Range(0, low=0,
# desc ="Number of polynomial functions to model high pass filter.")
class SpecifyModelOutputSpec(TraitedSpec):
session_info = traits.Any(desc="session info for level1designs")
class SpecifyModel(BaseInterface):
"""Makes a model specification compatible with spm/fsl designers.
The subject_info field should contain paradigm information in the form of
a Bunch or a list of Bunch. The Bunch should contain the following
information::
[Mandatory]
- conditions : list of names
- onsets : lists of onsets corresponding to each condition
- durations : lists of durations corresponding to each condition. Should be
left to a single 0 if all events are being modelled as impulses.
[Optional]
- regressor_names : list of str
list of names corresponding to each column. Should be None if
automatically assigned.
- regressors : list of lists
values for each regressor - must correspond to the number of
volumes in the functional run
- amplitudes : lists of amplitudes for each event. This will be ignored by
SPM's Level1Design.
The following two (tmod, pmod) will be ignored by any Level1Design class
other than SPM:
- tmod : lists of conditions that should be temporally modulated. Should
default to None if not being used.
- pmod : list of Bunch corresponding to conditions
- name : name of parametric modulator
- param : values of the modulator
- poly : degree of modulation
Alternatively, you can provide information through event files.
The event files have to be in 1, 2 or 3 column format with the columns
corresponding to Onsets, Durations and Amplitudes and they have to have the
name event_name.runXXX... e.g.: Words.run001.txt. The event_name part will
be used to create the condition names.
Examples
--------
>>> from nipype.interfaces.base import Bunch
>>> s = SpecifyModel()
>>> s.inputs.input_units = 'secs'
>>> s.inputs.functional_runs = ['functional2.nii', 'functional3.nii']
>>> s.inputs.time_repetition = 6
>>> s.inputs.high_pass_filter_cutoff = 128.
>>> info = [Bunch(conditions=['cond1'], onsets=[[2, 50, 100, 180]],\
durations=[[1]]), \
Bunch(conditions=['cond1'], onsets=[[30, 40, 100, 150]], \
durations=[[1]])]
>>> s.inputs.subject_info = info
Using pmod:
>>> info = [Bunch(conditions=['cond1', 'cond2'], \
onsets=[[2, 50],[100, 180]], durations=[[0],[0]], \
pmod=[Bunch(name=['amp'], poly=[2], param=[[1, 2]]),\
None]), \
Bunch(conditions=['cond1', 'cond2'], \
onsets=[[20, 120],[80, 160]], durations=[[0],[0]], \
pmod=[Bunch(name=['amp'], poly=[2], param=[[1, 2]]), \
None])]
>>> s.inputs.subject_info = info
"""
input_spec = SpecifyModelInputSpec
output_spec = SpecifyModelOutputSpec
def _generate_standard_design(self, infolist,
functional_runs=None,
realignment_parameters=None,
outliers=None):
""" Generates a standard design matrix paradigm given information about
each run
"""
sessinfo = []
output_units = 'secs'
if 'output_units' in self.inputs.traits():
output_units = self.inputs.output_units
for i, info in enumerate(infolist):
sessinfo.insert(i, dict(cond=[]))
if isdefined(self.inputs.high_pass_filter_cutoff):
sessinfo[i]['hpf'] = \
np.float(self.inputs.high_pass_filter_cutoff)
if hasattr(info, 'conditions') and info.conditions is not None:
for cid, cond in enumerate(info.conditions):
sessinfo[i]['cond'].insert(cid, dict())
sessinfo[i]['cond'][cid]['name'] = info.conditions[cid]
scaled_onset = scale_timings(info.onsets[cid],
self.inputs.input_units,
output_units,
self.inputs.time_repetition)
sessinfo[i]['cond'][cid]['onset'] = scaled_onset
scaled_duration = scale_timings(info.durations[cid],
self.inputs.input_units,
output_units,
self.inputs.time_repetition)
sessinfo[i]['cond'][cid]['duration'] = scaled_duration
if hasattr(info, 'amplitudes') and info.amplitudes:
sessinfo[i]['cond'][cid]['amplitudes'] = \
info.amplitudes[cid]
if hasattr(info, 'tmod') and info.tmod and \
len(info.tmod) > cid:
sessinfo[i]['cond'][cid]['tmod'] = info.tmod[cid]
if hasattr(info, 'pmod') and info.pmod and \
len(info.pmod) > cid:
if info.pmod[cid]:
sessinfo[i]['cond'][cid]['pmod'] = []
for j, name in enumerate(info.pmod[cid].name):
sessinfo[i]['cond'][cid]['pmod'].insert(j, {})
sessinfo[i]['cond'][cid]['pmod'][j]['name'] = \
name
sessinfo[i]['cond'][cid]['pmod'][j]['poly'] = \
info.pmod[cid].poly[j]
sessinfo[i]['cond'][cid]['pmod'][j]['param'] = \
info.pmod[cid].param[j]
sessinfo[i]['regress'] = []
if hasattr(info, 'regressors') and info.regressors is not None:
for j, r in enumerate(info.regressors):
sessinfo[i]['regress'].insert(j, dict(name='', val=[]))
if hasattr(info, 'regressor_names') and \
info.regressor_names is not None:
sessinfo[i]['regress'][j]['name'] = \
info.regressor_names[j]
else:
sessinfo[i]['regress'][j]['name'] = 'UR%d' % (j+1)
sessinfo[i]['regress'][j]['val'] = info.regressors[j]
sessinfo[i]['scans'] = functional_runs[i]
if realignment_parameters is not None:
for i, rp in enumerate(realignment_parameters):
mc = realignment_parameters[i]
for col in range(mc.shape[1]):
colidx = len(sessinfo[i]['regress'])
sessinfo[i]['regress'].insert(colidx, dict(name='', val=[]))
sessinfo[i]['regress'][colidx]['name'] = 'Realign%d' % (col + 1)
sessinfo[i]['regress'][colidx]['val'] = mc[:, col].tolist()
if outliers is not None:
for i, out in enumerate(outliers):
numscans = 0
for f in filename_to_list(sessinfo[i]['scans']):
shape = load(f).get_shape()
if len(shape) == 3 or shape[3] == 1:
iflogger.warning(("You are using 3D instead of 4D "
"files. Are you sure this was "
"intended?"))
numscans += 1
else:
numscans += shape[3]
for j, scanno in enumerate(out):
colidx = len(sessinfo[i]['regress'])
sessinfo[i]['regress'].insert(colidx, dict(name='', val=[]))
sessinfo[i]['regress'][colidx]['name'] = 'Outlier%d'%(j+1)
sessinfo[i]['regress'][colidx]['val'] = \
np.zeros((1, numscans))[0].tolist()
sessinfo[i]['regress'][colidx]['val'][int(scanno)] = 1
return sessinfo
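The outlier handling above turns each flagged scan into its own "scan-nulling" regressor: a column of zeros with a single 1 at the bad volume. A minimal self-contained sketch (the scan count and outlier indices are hypothetical):

```python
import numpy as np

numscans = 10
outlier_indices = [2, 7]  # hypothetical bad volumes flagged upstream

regressors = []
for j, scanno in enumerate(outlier_indices):
    # One regressor per outlier: all zeros except the flagged volume
    col = np.zeros((1, numscans))[0].tolist()
    col[int(scanno)] = 1
    regressors.append({'name': 'Outlier%d' % (j + 1), 'val': col})

assert regressors[0]['val'][2] == 1
assert sum(regressors[0]['val']) == 1
assert regressors[1]['name'] == 'Outlier2'
```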
def _generate_design(self, infolist=None):
"""Generate design specification for a typical fmri paradigm
"""
realignment_parameters = []
if isdefined(self.inputs.realignment_parameters):
for parfile in self.inputs.realignment_parameters:
realignment_parameters.append(np.loadtxt(parfile))
outliers = []
if isdefined(self.inputs.outlier_files):
for filename in self.inputs.outlier_files:
try:
outindices = np.loadtxt(filename, dtype=int)
except IOError:
outliers.append([])
else:
if outindices.size == 1:
outliers.append([outindices.tolist()])
else:
outliers.append(outindices.tolist())
if infolist is None:
if isdefined(self.inputs.subject_info):
infolist = self.inputs.subject_info
else:
infolist = gen_info(self.inputs.event_files)
self._sessinfo = self._generate_standard_design(infolist,
functional_runs=self.inputs.functional_runs,
realignment_parameters=realignment_parameters,
outliers=outliers)
def _run_interface(self, runtime):
"""
"""
self._sessioninfo = None
self._generate_design()
return runtime
def _list_outputs(self):
outputs = self._outputs().get()
if not hasattr(self, '_sessinfo'):
self._generate_design()
outputs['session_info'] = self._sessinfo
return outputs
class SpecifySPMModelInputSpec(SpecifyModelInputSpec):
concatenate_runs = traits.Bool(False, usedefault=True,
desc="Concatenate all runs to look like a single session.")
output_units = traits.Enum('secs', 'scans', usedefault=True,
desc="Units of design event onsets and durations (secs or scans)")
class SpecifySPMModel(SpecifyModel):
"""Adds SPM specific options to SpecifyModel
adds:
- concatenate_runs
- output_units
Examples
--------
>>> from nipype.interfaces.base import Bunch
>>> s = SpecifySPMModel()
>>> s.inputs.input_units = 'secs'
>>> s.inputs.output_units = 'scans'
>>> s.inputs.high_pass_filter_cutoff = 128.
>>> s.inputs.functional_runs = ['functional2.nii', 'functional3.nii']
>>> s.inputs.time_repetition = 6
>>> s.inputs.concatenate_runs = True
>>> info = [Bunch(conditions=['cond1'], onsets=[[2, 50, 100, 180]], \
durations=[[1]]), \
Bunch(conditions=['cond1'], onsets=[[30, 40, 100, 150]], \
durations=[[1]])]
>>> s.inputs.subject_info = info
"""
input_spec = SpecifySPMModelInputSpec
def _concatenate_info(self, infolist):
nscans = []
for i, f in enumerate(self.inputs.functional_runs):
if isinstance(f, list):
numscans = len(f)
elif isinstance(f, six.string_types):
img = load(f)
numscans = img.get_shape()[3]
else:
raise Exception('Functional input not specified correctly')
nscans.insert(i, numscans)
# now combine all fields into 1
# names, onsets, durations, amplitudes, pmod, tmod, regressor_names,
# regressors
infoout = infolist[0]
for i, info in enumerate(infolist[1:]):
#info.[conditions, tmod] remain the same
if info.onsets:
for j, val in enumerate(info.onsets):
if self.inputs.input_units == 'secs':
onsets = np.array(info.onsets[j]) +\
self.inputs.time_repetition * \
sum(nscans[0:(i + 1)])
infoout.onsets[j].extend(onsets.tolist())
else:
onsets = np.array(info.onsets[j]) + \
sum(nscans[0:(i + 1)])
infoout.onsets[j].extend(onsets.tolist())
for j, val in enumerate(info.durations):
if len(val) > 1:
infoout.durations[j].extend(info.durations[j])
if hasattr(info, 'amplitudes') and info.amplitudes:
for j, val in enumerate(info.amplitudes):
infoout.amplitudes[j].extend(info.amplitudes[j])
if hasattr(info, 'pmod') and info.pmod:
for j, val in enumerate(info.pmod):
if val:
for key, data in enumerate(val.param):
infoout.pmod[j].param[key].extend(data)
if hasattr(info, 'regressors') and info.regressors:
#assumes same ordering of regressors across different
#runs and the same names for the regressors
for j, v in enumerate(info.regressors):
infoout.regressors[j].extend(info.regressors[j])
#insert session regressors
if not hasattr(infoout, 'regressors') or not infoout.regressors:
infoout.regressors = []
onelist = np.zeros((1, sum(nscans)))
onelist[0, sum(nscans[0:i]):sum(nscans[0:(i + 1)])] = 1
infoout.regressors.insert(len(infoout.regressors),
onelist.tolist()[0])
return [infoout], nscans
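The heart of `_concatenate_info` is shifting each later run's onsets by the cumulative duration of the preceding runs (TR times the cumulative scan count when onsets are in seconds), plus inserting a 0/1 session regressor per run. A self-contained sketch with hypothetical run lengths:

```python
import numpy as np

time_repetition = 2.0
nscans = [100, 120]               # volumes per run (hypothetical)
run2_onsets = [2.0, 50.0, 100.0]  # onsets within run 2, in secs

# Shift run-2 onsets into concatenated-session time
shifted = (np.array(run2_onsets) +
           time_repetition * sum(nscans[0:1])).tolist()
assert shifted == [202.0, 250.0, 300.0]

# A session regressor marks which volumes belong to run 2
onelist = np.zeros((1, sum(nscans)))
onelist[0, sum(nscans[0:1]):sum(nscans[0:2])] = 1
assert onelist.sum() == 120
```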
def _generate_design(self, infolist=None):
if not isdefined(self.inputs.concatenate_runs) or \
not self.inputs.concatenate_runs:
super(SpecifySPMModel, self)._generate_design(infolist=infolist)
return
if isdefined(self.inputs.subject_info):
infolist = self.inputs.subject_info
else:
infolist = gen_info(self.inputs.event_files)
concatlist, nscans = self._concatenate_info(infolist)
functional_runs = [filename_to_list(self.inputs.functional_runs)]
realignment_parameters = []
if isdefined(self.inputs.realignment_parameters):
realignment_parameters = []
for parfile in self.inputs.realignment_parameters:
mc = np.loadtxt(parfile)
if not realignment_parameters:
realignment_parameters.insert(0, mc)
else:
realignment_parameters[0] = \
np.concatenate((realignment_parameters[0], mc))
outliers = []
if isdefined(self.inputs.outlier_files):
outliers = [[]]
for i, filename in enumerate(self.inputs.outlier_files):
try:
out = np.loadtxt(filename, dtype=int)
except IOError:
out = np.array([])
if out.size > 0:
if out.size == 1:
outliers[0].extend([(np.array(out) +
sum(nscans[0:i])).tolist()])
else:
outliers[0].extend((np.array(out) +
sum(nscans[0:i])).tolist())
self._sessinfo = self._generate_standard_design(concatlist,
functional_runs=functional_runs,
realignment_parameters=realignment_parameters,
outliers=outliers)
class SpecifySparseModelInputSpec(SpecifyModelInputSpec):
time_acquisition = traits.Float(0, mandatory=True,
desc="Time in seconds to acquire a single image volume")
volumes_in_cluster=traits.Range(1, usedefault=True,
desc="Number of scan volumes in a cluster")
model_hrf = traits.Bool(desc="model sparse events with hrf")
stimuli_as_impulses = traits.Bool(True,
desc="Treat each stimulus to be impulse like.",
usedefault=True)
use_temporal_deriv = traits.Bool(requires=['model_hrf'],
desc="Create a temporal derivative in addition to regular regressor")
scale_regressors = traits.Bool(True, desc="Scale regressors by the peak",
usedefault=True)
scan_onset = traits.Float(0.0,
desc="Start of scanning relative to onset of run in secs",
usedefault=True)
save_plot = traits.Bool(desc=('save plot of sparse design calculation '
'(Requires matplotlib)'))
class SpecifySparseModelOutputSpec(SpecifyModelOutputSpec):
sparse_png_file = File(desc='PNG file showing sparse design')
sparse_svg_file = File(desc='SVG file showing sparse design')
class SpecifySparseModel(SpecifyModel):
""" Specify a sparse model that is compatible with spm/fsl designers
References
----------
.. [1] Perrachione TK and Ghosh SS (2013) Optimized design and analysis of
sparse-sampling fMRI experiments. Front. Neurosci. 7:55
http://journal.frontiersin.org/Journal/10.3389/fnins.2013.00055/abstract
Examples
--------
>>> from nipype.interfaces.base import Bunch
>>> s = SpecifySparseModel()
>>> s.inputs.input_units = 'secs'
>>> s.inputs.functional_runs = ['functional2.nii', 'functional3.nii']
>>> s.inputs.time_repetition = 6
>>> s.inputs.time_acquisition = 2
>>> s.inputs.high_pass_filter_cutoff = 128.
>>> s.inputs.model_hrf = True
>>> info = [Bunch(conditions=['cond1'], onsets=[[2, 50, 100, 180]], \
durations=[[1]]), \
Bunch(conditions=['cond1'], onsets=[[30, 40, 100, 150]], \
durations=[[1]])]
>>> s.inputs.subject_info = info
"""
input_spec = SpecifySparseModelInputSpec
output_spec = SpecifySparseModelOutputSpec
def _gen_regress(self, i_onsets, i_durations, i_amplitudes, nscans):
"""Generates a regressor for a sparse/clustered-sparse acquisition
"""
bplot = False
if isdefined(self.inputs.save_plot) and self.inputs.save_plot:
bplot=True
import matplotlib
matplotlib.use(config.get("execution", "matplotlib_backend"))
import matplotlib.pyplot as plt
TR = np.round(self.inputs.time_repetition * 1000) # in ms
if self.inputs.time_acquisition:
TA = np.round(self.inputs.time_acquisition * 1000) # in ms
else:
TA = TR # in ms
nvol = self.inputs.volumes_in_cluster
SCANONSET = np.round(self.inputs.scan_onset * 1000)
total_time = TR * (nscans - nvol) / nvol + TA * nvol + SCANONSET
SILENCE = TR - TA * nvol
dt = TA / 10.0
durations = np.round(np.array(i_durations) * 1000)
if len(durations) == 1:
durations = durations*np.ones((len(i_onsets)))
onsets = np.round(np.array(i_onsets) * 1000)
dttemp = gcd(TA, gcd(SILENCE, TR))
if dt < dttemp:
if dttemp % dt != 0:
dt = float(gcd(dttemp, dt))
if dt < 1:
raise Exception("Time multiple less than 1 ms")
iflogger.info("Setting dt = %d ms\n" % dt)
npts = int(np.ceil(total_time / dt))
times = np.arange(0, total_time, dt) * 1e-3
timeline = np.zeros((npts))
timeline2 = np.zeros((npts))
if isdefined(self.inputs.model_hrf) and self.inputs.model_hrf:
hrf = spm_hrf(dt * 1e-3)
reg_scale = 1.0
if self.inputs.scale_regressors:
boxcar = np.zeros(int(50.0 * 1e3 / dt))
if self.inputs.stimuli_as_impulses:
    boxcar[int(1.0 * 1e3 / dt)] = 1.0
    reg_scale = float(TA / dt)
else:
    boxcar[int(1.0 * 1e3 / dt):int(2.0 * 1e3 / dt)] = 1.0
if isdefined(self.inputs.model_hrf) and self.inputs.model_hrf:
response = np.convolve(boxcar, hrf)
reg_scale = 1.0 / response.max()
iflogger.info('response sum: %.4f max: %.4f' % (response.sum(),
response.max()))
iflogger.info('reg_scale: %.4f' % reg_scale)
for i, t in enumerate(onsets):
idx = int(np.round(t / dt))
if i_amplitudes:
if len(i_amplitudes) > 1:
timeline2[idx] = i_amplitudes[i]
else:
timeline2[idx] = i_amplitudes[0]
else:
timeline2[idx] = 1
if bplot:
plt.subplot(4, 1, 1)
plt.plot(times, timeline2)
if not self.inputs.stimuli_as_impulses:
if durations[i] == 0:
durations[i] = TA * nvol
stimdur = np.ones((int(durations[i] / dt)))
timeline2 = np.convolve(timeline2, stimdur)[0:len(timeline2)]
timeline += timeline2
timeline2[:] = 0
if bplot:
plt.subplot(4, 1, 2)
plt.plot(times, timeline)
if isdefined(self.inputs.model_hrf) and self.inputs.model_hrf:
timeline = np.convolve(timeline, hrf)[0:len(timeline)]
if isdefined(self.inputs.use_temporal_deriv) and \
self.inputs.use_temporal_deriv:
#create temporal deriv
timederiv = np.concatenate(([0], np.diff(timeline)))
if bplot:
plt.subplot(4, 1, 3)
plt.plot(times, timeline)
if isdefined(self.inputs.use_temporal_deriv) and \
self.inputs.use_temporal_deriv:
plt.plot(times, timederiv)
# sample timeline
timeline2 = np.zeros((npts))
reg = []
regderiv = []
for i, trial in enumerate(np.arange(nscans)/nvol):
scanstart = int((SCANONSET + trial * TR + (i % nvol) * TA) / dt)
scanidx = scanstart+np.arange(int(TA/dt))
timeline2[scanidx] = np.max(timeline)
reg.insert(i, np.mean(timeline[scanidx]) * reg_scale)
if isdefined(self.inputs.use_temporal_deriv) and \
self.inputs.use_temporal_deriv:
regderiv.insert(i, np.mean(timederiv[scanidx]) * reg_scale)
if isdefined(self.inputs.use_temporal_deriv) and \
self.inputs.use_temporal_deriv:
iflogger.info('orthogonalizing derivative w.r.t. main regressor')
regderiv = orth(reg, regderiv)
if bplot:
plt.subplot(4, 1, 3)
plt.plot(times, timeline2)
plt.subplot(4, 1, 4)
plt.bar(np.arange(len(reg)), reg, width=0.5)
plt.savefig('sparse.png')
plt.savefig('sparse.svg')
if regderiv:
return [reg, regderiv]
else:
return reg
def _cond_to_regress(self, info, nscans):
"""Converts condition information to full regressors
"""
reg = []
regnames = []
for i, cond in enumerate(info.conditions):
if hasattr(info, 'amplitudes') and info.amplitudes:
amplitudes = info.amplitudes[i]
else:
amplitudes = None
regnames.insert(len(regnames), cond)
scaled_onsets = scale_timings(info.onsets[i],
self.inputs.input_units,
'secs',
self.inputs.time_repetition)
scaled_durations = scale_timings(info.durations[i],
self.inputs.input_units,
'secs',
self.inputs.time_repetition)
regressor = self._gen_regress(scaled_onsets,
scaled_durations,
amplitudes,
nscans)
if isdefined(self.inputs.use_temporal_deriv) and \
self.inputs.use_temporal_deriv:
reg.insert(len(reg), regressor[0])
regnames.insert(len(regnames), cond + '_D')
reg.insert(len(reg), regressor[1])
else:
reg.insert(len(reg), regressor)
# need to deal with temporal and parametric modulators
# for sparse-clustered acquisitions enter T1-effect regressors
nvol = self.inputs.volumes_in_cluster
if nvol > 1:
for i in range(nvol-1):
treg = np.zeros((nscans/nvol, nvol))
treg[:, i] = 1
reg.insert(len(reg), treg.ravel().tolist())
regnames.insert(len(regnames), 'T1effect_%d' % i)
return reg, regnames
def _generate_clustered_design(self, infolist):
"""Generates condition information for sparse-clustered
designs.
"""
infoout = deepcopy(infolist)
for i, info in enumerate(infolist):
infoout[i].conditions = None
infoout[i].onsets = None
infoout[i].durations = None
if info.conditions:
img = load(self.inputs.functional_runs[i])
nscans = img.get_shape()[3]
reg, regnames = self._cond_to_regress(info, nscans)
if hasattr(infoout[i], 'regressors') and infoout[i].regressors:
if not infoout[i].regressor_names:
infoout[i].regressor_names = \
['R%d'%j for j in range(len(infoout[i].regressors))]
else:
infoout[i].regressors = []
infoout[i].regressor_names = []
for j, r in enumerate(reg):
regidx = len(infoout[i].regressors)
infoout[i].regressor_names.insert(regidx, regnames[j])
infoout[i].regressors.insert(regidx, r)
return infoout
def _generate_design(self, infolist=None):
if isdefined(self.inputs.subject_info):
infolist = self.inputs.subject_info
else:
infolist = gen_info(self.inputs.event_files)
sparselist = self._generate_clustered_design(infolist)
super(SpecifySparseModel, self)._generate_design(infolist = sparselist)
def _list_outputs(self):
outputs = self._outputs().get()
if not hasattr(self, '_sessinfo'):
self._generate_design()
outputs['session_info'] = self._sessinfo
if isdefined(self.inputs.save_plot) and self.inputs.save_plot:
outputs['sparse_png_file'] = os.path.join(os.getcwd(), 'sparse.png')
outputs['sparse_svg_file'] = os.path.join(os.getcwd(), 'sparse.svg')
return outputs
| bsd-3-clause |
kubeflow/kfserving | docs/samples/v1beta1/transformer/feast/driver_transformer/__main__.py | 1 | 1914 | # Copyright 2019 kubeflow.org.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import argparse
import kfserving
from .driver_transformer import DriverTransformer
import logging
logging.basicConfig(level=kfserving.constants.KFSERVING_LOGLEVEL)
DEFAULT_MODEL_NAME = "sklearn-driver-transformer"
parser = argparse.ArgumentParser(parents=[kfserving.kfserver.parser])
parser.add_argument(
"--predictor_host",
help="The URL for the model predict function", required=True
)
parser.add_argument(
"--model_name", default=DEFAULT_MODEL_NAME,
help='The name that the model is served under.')
parser.add_argument(
"--feast_serving_url",
type=str,
help="The url of the Feast Serving Service.", required=True)
parser.add_argument(
"--entity_ids",
type=str, nargs="+",
help="A list of entity ids to use as keys in the feature store.",
required=True)
parser.add_argument(
"--feature_refs",
type=str, nargs="+",
help="A list of features to retrieve from the feature store.",
required=True)
args, _ = parser.parse_known_args()
if __name__ == "__main__":
transformer = DriverTransformer(
name=args.model_name,
predictor_host=args.predictor_host,
feast_serving_url=args.feast_serving_url,
entity_ids=args.entity_ids,
feature_refs=args.feature_refs)
kfserver = kfserving.KFServer()
kfserver.start(models=[transformer])
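The `parents=[kfserving.kfserver.parser]` argument above is how the transformer inherits the server's command-line flags while adding its own. A self-contained sketch of that argparse mechanism (hypothetical flag names, no kfserving dependency):

```python
import argparse

# Parent parser holds shared flags; add_help=False so the child owns -h/--help.
base = argparse.ArgumentParser(add_help=False)
base.add_argument("--http_port", default=8080, type=int)

# Child parser inherits the parent's arguments and adds its own, mirroring
# argparse.ArgumentParser(parents=[kfserving.kfserver.parser]) above.
parser = argparse.ArgumentParser(parents=[base])
parser.add_argument("--model_name", default="demo-model")

# parse_known_args tolerates extra flags meant for other components.
args, _ = parser.parse_known_args(["--http_port", "9000"])
```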
| apache-2.0 |
stevenzhang18/Indeed-Flask | lib/pandas/io/stata.py | 9 | 78805 | """
Module contains tools for processing Stata files into DataFrames
The StataReader below was originally written by Joe Presbrey as part of PyDTA.
It has been extended and improved by Skipper Seabold from the Statsmodels
project who also developed the StataWriter and was finally added to pandas in
a once again improved version.
You can find more information on http://presbrey.mit.edu/PyDTA and
http://statsmodels.sourceforge.net/devel/
"""
import numpy as np
import sys
import struct
from dateutil.relativedelta import relativedelta
from pandas.core.base import StringMixin
from pandas.core.categorical import Categorical
from pandas.core.frame import DataFrame
from pandas.core.series import Series
import datetime
from pandas import compat, to_timedelta, to_datetime, isnull, DatetimeIndex
from pandas.compat import lrange, lmap, lzip, text_type, string_types, range, \
zip, BytesIO
from pandas.util.decorators import Appender
import pandas as pd
import pandas.core.common as com
from pandas.io.common import get_filepath_or_buffer
from pandas.lib import max_len_string_array, infer_dtype
from pandas.tslib import NaT, Timestamp
_version_error = "Version of given Stata file is not 104, 105, 108, 113 (Stata 8/9), 114 (Stata 10/11), 115 (Stata 12), 117 (Stata 13), or 118 (Stata 14)"
_statafile_processing_params1 = """\
convert_dates : boolean, defaults to True
Convert date variables to DataFrame time values
convert_categoricals : boolean, defaults to True
Read value labels and convert columns to Categorical/Factor variables"""
_encoding_params = """\
encoding : string, None or encoding
Encoding used to parse the files. Note that Stata doesn't
support unicode. None defaults to iso-8859-1."""
_statafile_processing_params2 = """\
index : identifier of index column
identifier of column that should be used as index of the DataFrame
convert_missing : boolean, defaults to False
Flag indicating whether to convert missing values to their Stata
representations. If False, missing values are replaced with nans.
If True, columns containing missing values are returned with
object data types and missing values are represented by
StataMissingValue objects.
preserve_dtypes : boolean, defaults to True
Preserve Stata datatypes. If False, numeric data are upcast to pandas
default types for foreign data (float64 or int64)
columns : list or None
Columns to retain. Columns will be returned in the given order. None
returns all columns
order_categoricals : boolean, defaults to True
Flag indicating whether converted categorical data are ordered."""
_chunksize_params = """\
chunksize : int, default None
Return StataReader object for iterations, returns chunks with
given number of lines"""
_iterator_params = """\
iterator : boolean, default False
Return StataReader object"""
_read_stata_doc = """Read Stata file into DataFrame
Parameters
----------
filepath_or_buffer : string or file-like object
Path to .dta file or object implementing a binary read() functions
%s
%s
%s
%s
%s
Returns
-------
DataFrame or StataReader
Examples
--------
Read a Stata dta file:
>>> df = pandas.read_stata('filename.dta')
Read a Stata dta file in 10,000 line chunks:
>>> itr = pandas.read_stata('filename.dta', chunksize=10000)
>>> for chunk in itr:
>>> do_something(chunk)
""" % (_statafile_processing_params1, _encoding_params,
_statafile_processing_params2, _chunksize_params,
_iterator_params)
_data_method_doc = """Reads observations from Stata file, converting them into a dataframe
This is a legacy method. Use `read` in new code.
Parameters
----------
%s
%s
Returns
-------
DataFrame
""" % (_statafile_processing_params1, _statafile_processing_params2)
_read_method_doc = """\
Reads observations from Stata file, converting them into a dataframe
Parameters
----------
nrows : int
Number of lines to read from data file, if None read whole file.
%s
%s
Returns
-------
DataFrame
""" % (_statafile_processing_params1, _statafile_processing_params2)
_stata_reader_doc = """\
Class for reading Stata dta files.
Parameters
----------
path_or_buf : string or file-like object
Path to .dta file or object implementing a binary read() functions
%s
%s
%s
%s
""" % (_statafile_processing_params1, _statafile_processing_params2,
_encoding_params, _chunksize_params)
@Appender(_read_stata_doc)
def read_stata(filepath_or_buffer, convert_dates=True,
convert_categoricals=True, encoding=None, index=None,
convert_missing=False, preserve_dtypes=True, columns=None,
order_categoricals=True, chunksize=None, iterator=False):
reader = StataReader(filepath_or_buffer,
convert_dates=convert_dates,
convert_categoricals=convert_categoricals,
index=index, convert_missing=convert_missing,
preserve_dtypes=preserve_dtypes,
columns=columns,
order_categoricals=order_categoricals,
chunksize=chunksize, encoding=encoding)
if iterator or chunksize:
return reader
return reader.read()
_date_formats = ["%tc", "%tC", "%td", "%d", "%tw", "%tm", "%tq", "%th", "%ty"]
stata_epoch = datetime.datetime(1960, 1, 1)
def _stata_elapsed_date_to_datetime_vec(dates, fmt):
"""
Convert from SIF to datetime. http://www.stata.com/help.cgi?datetime
Parameters
----------
dates : Series
The Stata Internal Format date to convert to datetime according to fmt
fmt : str
The format to convert to. Can be tc, td, tw, tm, tq, th, or ty
Returns
-------
converted : Series
The converted dates
Examples
--------
>>> import pandas as pd
>>> dates = pd.Series([52])
>>> _stata_elapsed_date_to_datetime_vec(dates , "%tw")
0 1961-01-01
dtype: datetime64[ns]
Notes
-----
datetime/c - tc
milliseconds since 01jan1960 00:00:00.000, assuming 86,400 s/day
datetime/C - tC - NOT IMPLEMENTED
milliseconds since 01jan1960 00:00:00.000, adjusted for leap seconds
date - td
days since 01jan1960 (01jan1960 = 0)
weekly date - tw
weeks since 1960w1
This assumes 52 weeks in a year, then adds 7 * remainder of the weeks.
The datetime value is the start of the week in terms of days in the
year, not ISO calendar weeks.
monthly date - tm
months since 1960m1
quarterly date - tq
quarters since 1960q1
half-yearly date - th
half-years since 1960h1 yearly
date - ty
years since 0000
If you don't have pandas with datetime support, then you can't do
milliseconds accurately.
"""
MIN_YEAR, MAX_YEAR = Timestamp.min.year, Timestamp.max.year
MAX_DAY_DELTA = (Timestamp.max - datetime.datetime(1960, 1, 1)).days
MIN_DAY_DELTA = (Timestamp.min - datetime.datetime(1960, 1, 1)).days
MIN_MS_DELTA = MIN_DAY_DELTA * 24 * 3600 * 1000
MAX_MS_DELTA = MAX_DAY_DELTA * 24 * 3600 * 1000
def convert_year_month_safe(year, month):
"""
Convert year and month to datetimes, using pandas vectorized versions
when the date range falls within the range supported by pandas.
Otherwise it falls back to a slower but more robust method using datetime.
"""
if year.max() < MAX_YEAR and year.min() > MIN_YEAR:
return to_datetime(100 * year + month, format='%Y%m')
else:
index = getattr(year, 'index', None)
return Series(
[datetime.datetime(y, m, 1) for y, m in zip(year, month)],
index=index)
def convert_year_days_safe(year, days):
"""
Converts year (e.g. 1999) and days since the start of the year to a
datetime or datetime64 Series
"""
if year.max() < (MAX_YEAR - 1) and year.min() > MIN_YEAR:
return to_datetime(year, format='%Y') + to_timedelta(days, unit='d')
else:
index = getattr(year, 'index', None)
value = [datetime.datetime(y, 1, 1) + relativedelta(days=int(d)) for
y, d in zip(year, days)]
return Series(value, index=index)
def convert_delta_safe(base, deltas, unit):
"""
Convert base dates and deltas to datetimes, using pandas vectorized
versions if the deltas satisfy restrictions required to be expressed
as dates in pandas.
"""
index = getattr(deltas, 'index', None)
if unit == 'd':
if deltas.max() > MAX_DAY_DELTA or deltas.min() < MIN_DAY_DELTA:
values = [base + relativedelta(days=int(d)) for d in deltas]
return Series(values, index=index)
elif unit == 'ms':
if deltas.max() > MAX_MS_DELTA or deltas.min() < MIN_MS_DELTA:
values = [base + relativedelta(microseconds=(int(d) * 1000)) for
d in deltas]
return Series(values, index=index)
else:
raise ValueError('format not understood')
base = to_datetime(base)
deltas = to_timedelta(deltas, unit=unit)
return base + deltas
# TODO: If/when pandas supports more than datetime64[ns], this should be improved to use correct range, e.g. datetime[Y] for yearly
bad_locs = np.isnan(dates)
has_bad_values = False
if bad_locs.any():
has_bad_values = True
data_col = Series(dates)
data_col[bad_locs] = 1.0 # Replace with NaT
dates = dates.astype(np.int64)
if fmt in ["%tc", "tc"]: # Delta ms relative to base
base = stata_epoch
ms = dates
conv_dates = convert_delta_safe(base, ms, 'ms')
elif fmt in ["%tC", "tC"]:
from warnings import warn
warn("Encountered %tC format. Leaving in Stata Internal Format.")
conv_dates = Series(dates, dtype=np.object)
if has_bad_values:
conv_dates[bad_locs] = pd.NaT
return conv_dates
elif fmt in ["%td", "td", "%d", "d"]: # Delta days relative to base
base = stata_epoch
days = dates
conv_dates = convert_delta_safe(base, days, 'd')
elif fmt in ["%tw", "tw"]: # does not count leap days - 7 days is a week
year = stata_epoch.year + dates // 52
days = (dates % 52) * 7
conv_dates = convert_year_days_safe(year, days)
elif fmt in ["%tm", "tm"]: # Delta months relative to base
year = stata_epoch.year + dates // 12
month = (dates % 12) + 1
conv_dates = convert_year_month_safe(year, month)
elif fmt in ["%tq", "tq"]: # Delta quarters relative to base
year = stata_epoch.year + dates // 4
month = (dates % 4) * 3 + 1
conv_dates = convert_year_month_safe(year, month)
elif fmt in ["%th", "th"]: # Delta half-years relative to base
year = stata_epoch.year + dates // 2
month = (dates % 2) * 6 + 1
conv_dates = convert_year_month_safe(year, month)
elif fmt in ["%ty", "ty"]: # Years -- not delta
year = dates
month = np.ones_like(dates)
conv_dates = convert_year_month_safe(year, month)
else:
raise ValueError("Date fmt %s not understood" % fmt)
if has_bad_values: # Restore NaT for bad values
conv_dates[bad_locs] = NaT
return conv_dates
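The weekly (%tw) branch above assumes exactly 52 weeks per year, as the Notes section describes. A standalone check of that arithmetic (plain datetime, no pandas; `stata_tw_to_date` is an illustrative helper, not part of this module):

```python
import datetime

def stata_tw_to_date(weeks):
    # Stata %tw: weeks elapsed since 1960w1, assuming 52 weeks per year.
    # Mirrors: year = stata_epoch.year + dates // 52; days = (dates % 52) * 7
    year = 1960 + weeks // 52
    days = (weeks % 52) * 7
    return datetime.datetime(year, 1, 1) + datetime.timedelta(days=days)

# 52 elapsed weeks lands on 1961-01-01, matching the docstring example.
```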
def _datetime_to_stata_elapsed_vec(dates, fmt):
"""
Convert from datetime to SIF. http://www.stata.com/help.cgi?datetime
Parameters
----------
dates : Series
Series or array containing datetime.datetime or datetime64[ns] to
convert to the Stata Internal Format given by fmt
fmt : str
The format to convert to. Can be, tc, td, tw, tm, tq, th, ty
"""
index = dates.index
NS_PER_DAY = 24 * 3600 * 1000 * 1000 * 1000
US_PER_DAY = NS_PER_DAY / 1000
def parse_dates_safe(dates, delta=False, year=False, days=False):
d = {}
if com.is_datetime64_dtype(dates.values):
if delta:
delta = dates - stata_epoch
d['delta'] = delta.values.astype(
np.int64) // 1000 # microseconds
if days or year:
dates = DatetimeIndex(dates)
d['year'], d['month'] = dates.year, dates.month
if days:
days = (dates.astype(np.int64) -
to_datetime(d['year'], format='%Y').astype(np.int64))
d['days'] = days // NS_PER_DAY
elif infer_dtype(dates) == 'datetime':
if delta:
delta = dates.values - stata_epoch
f = lambda x: \
US_PER_DAY * x.days + 1000000 * x.seconds + x.microseconds
v = np.vectorize(f)
d['delta'] = v(delta)
if year:
year_month = dates.apply(lambda x: 100 * x.year + x.month)
d['year'] = year_month.values // 100
d['month'] = (year_month.values - d['year'] * 100)
if days:
f = lambda x: (x - datetime.datetime(x.year, 1, 1)).days
v = np.vectorize(f)
d['days'] = v(dates)
else:
raise ValueError('Columns containing dates must contain either '
'datetime64, datetime.datetime or null values.')
return DataFrame(d, index=index)
bad_loc = isnull(dates)
index = dates.index
if bad_loc.any():
dates = Series(dates)
if com.is_datetime64_dtype(dates):
dates[bad_loc] = to_datetime(stata_epoch)
else:
dates[bad_loc] = stata_epoch
if fmt in ["%tc", "tc"]:
d = parse_dates_safe(dates, delta=True)
conv_dates = d.delta / 1000
elif fmt in ["%tC", "tC"]:
from warnings import warn
warn("Stata Internal Format tC not supported.")
conv_dates = dates
elif fmt in ["%td", "td"]:
d = parse_dates_safe(dates, delta=True)
conv_dates = d.delta // US_PER_DAY
elif fmt in ["%tw", "tw"]:
d = parse_dates_safe(dates, year=True, days=True)
conv_dates = (52 * (d.year - stata_epoch.year) + d.days // 7)
elif fmt in ["%tm", "tm"]:
d = parse_dates_safe(dates, year=True)
conv_dates = (12 * (d.year - stata_epoch.year) + d.month - 1)
elif fmt in ["%tq", "tq"]:
d = parse_dates_safe(dates, year=True)
conv_dates = 4 * (d.year - stata_epoch.year) + (d.month - 1) // 3
elif fmt in ["%th", "th"]:
d = parse_dates_safe(dates, year=True)
conv_dates = 2 * (d.year - stata_epoch.year) + \
(d.month > 6).astype(np.int)
elif fmt in ["%ty", "ty"]:
d = parse_dates_safe(dates, year=True)
conv_dates = d.year
else:
raise ValueError("fmt %s not understood" % fmt)
conv_dates = Series(conv_dates, dtype=np.float64)
missing_value = struct.unpack('<d', b'\x00\x00\x00\x00\x00\x00\xe0\x7f')[0]
conv_dates[bad_loc] = missing_value
return Series(conv_dates, index=index)
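The quarterly and half-yearly encodings above reduce to simple integer arithmetic on year and month. A minimal sketch mirroring the %tq and %th branches (illustrative helpers, stdlib only):

```python
import datetime

def date_to_stata_tq(d):
    # Quarters since 1960q1: 4 * (year - 1960) + (month - 1) // 3
    return 4 * (d.year - 1960) + (d.month - 1) // 3

def date_to_stata_th(d):
    # Half-years since 1960h1: 2 * (year - 1960) + (1 if month > 6 else 0)
    return 2 * (d.year - 1960) + (1 if d.month > 6 else 0)
```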
excessive_string_length_error = """
Fixed width strings in Stata .dta files are limited to 244 (or fewer) characters.
Column '%s' does not satisfy this restriction.
"""
class PossiblePrecisionLoss(Warning):
pass
precision_loss_doc = """
Column converted from %s to %s, and some data are outside of the lossless
conversion range. This may result in a loss of precision in the saved data.
"""
class ValueLabelTypeMismatch(Warning):
pass
value_label_mismatch_doc = """
Stata value labels (pandas categories) must be strings. Column {0} contains
non-string labels which will be converted to strings. Please check that the
Stata data file created has not lost information due to duplicate labels.
"""
class InvalidColumnName(Warning):
pass
invalid_name_doc = """
Not all pandas column names were valid Stata variable names.
The following replacements have been made:
{0}
If this is not what you expect, please make sure you have Stata-compliant
column names in your DataFrame (strings only, max 32 characters, only alphanumerics and
underscores, no Stata reserved words)
"""
def _cast_to_stata_types(data):
"""Checks the dtypes of the columns of a pandas DataFrame for
compatibility with the data types and ranges supported by Stata, and
converts if necessary.
Parameters
----------
data : DataFrame
The DataFrame to check and convert
Notes
-----
Numeric columns in Stata must be one of int8, int16, int32, float32 or
float64, with some additional value restrictions. int8 and int16 columns
are checked for violations of the value restrictions and
upcast if needed. int64 data is not usable in Stata, and so it is
downcast to int32 whenever the values are in the int32 range, and
sidecast to float64 when larger than this range. If the int64 values
are outside the range of those perfectly representable as float64 values,
a warning is raised.
bool columns are cast to int8. uint columns are converted to an int of the
same size if there is no loss in precision, otherwise they are upcast to a
larger type. uint64 is currently not supported since it is converted to
object in a DataFrame.
"""
ws = ''
# original, if small, if large
conversion_data = ((np.bool, np.int8, np.int8),
(np.uint8, np.int8, np.int16),
(np.uint16, np.int16, np.int32),
(np.uint32, np.int32, np.int64))
for col in data:
dtype = data[col].dtype
# Cast from unsupported types to supported types
for c_data in conversion_data:
if dtype == c_data[0]:
if data[col].max() <= np.iinfo(c_data[1]).max:
dtype = c_data[1]
else:
dtype = c_data[2]
if c_data[2] == np.float64: # Warn if necessary
if data[col].max() >= 2 ** 53:
ws = precision_loss_doc % ('uint64', 'float64')
data[col] = data[col].astype(dtype)
# Check values and upcast if necessary
if dtype == np.int8:
if data[col].max() > 100 or data[col].min() < -127:
data[col] = data[col].astype(np.int16)
elif dtype == np.int16:
if data[col].max() > 32740 or data[col].min() < -32767:
data[col] = data[col].astype(np.int32)
elif dtype == np.int64:
if data[col].max() <= 2147483620 and data[col].min() >= -2147483647:
data[col] = data[col].astype(np.int32)
else:
data[col] = data[col].astype(np.float64)
if data[col].max() >= 2 ** 53 or data[col].min() <= -2 ** 53:
ws = precision_loss_doc % ('int64', 'float64')
if ws:
import warnings
warnings.warn(ws, PossiblePrecisionLoss)
return data
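The range checks above encode Stata's reserved top codes for each integer type (the values above 100, 32740, and 2147483620 are missing-value codes). A standalone sketch of the same type-selection thresholds (`stata_int_type` is an illustrative helper, not part of pandas):

```python
def stata_int_type(vmin, vmax):
    # Smallest Stata integer type that can hold [vmin, vmax], using the
    # same bounds as _cast_to_stata_types; values beyond int32's usable
    # range fall back to float64, as in the int64 branch above.
    if -127 <= vmin and vmax <= 100:
        return "int8"
    if -32767 <= vmin and vmax <= 32740:
        return "int16"
    if -2147483647 <= vmin and vmax <= 2147483620:
        return "int32"
    return "float64"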
class StataValueLabel(object):
"""
Parse a categorical column and prepare formatted output
Parameters
-----------
catarray : Categorical
Categorical Series to encode as a Stata value label
Attributes
----------
labname : string
Name of the value label, taken from the column name
value_labels : list of tuples
Sorted (code, label) pairs
Methods
-------
generate_value_label
"""
def __init__(self, catarray):
self.labname = catarray.name
categories = catarray.cat.categories
self.value_labels = list(zip(np.arange(len(categories)), categories))
self.value_labels.sort(key=lambda x: x[0])
self.text_len = np.int32(0)
self.off = []
self.val = []
self.txt = []
self.n = 0
# Compute lengths and setup lists of offsets and labels
for vl in self.value_labels:
category = vl[1]
if not isinstance(category, string_types):
category = str(category)
import warnings
warnings.warn(value_label_mismatch_doc.format(catarray.name),
ValueLabelTypeMismatch)
self.off.append(self.text_len)
self.text_len += len(category) + 1 # +1 for the padding
self.val.append(vl[0])
self.txt.append(category)
self.n += 1
if self.text_len > 32000:
raise ValueError('Stata value labels for a single variable must '
'have a combined length less than 32,000 '
'characters.')
# Ensure int32
self.off = np.array(self.off, dtype=np.int32)
self.val = np.array(self.val, dtype=np.int32)
# Total length
self.len = 4 + 4 + 4 * self.n + 4 * self.n + self.text_len
def _encode(self, s):
"""
Python 3 compatibility shim
"""
if compat.PY3:
return s.encode(self._encoding)
else:
return s
def generate_value_label(self, byteorder, encoding):
"""
Parameters
----------
byteorder : str
Byte order of the output
encoding : str
File encoding
Returns
-------
value_label : bytes
Bytes containing the formatted value label
"""
self._encoding = encoding
bio = BytesIO()
null_string = '\x00'
null_byte = b'\x00'
# len
bio.write(struct.pack(byteorder + 'i', self.len))
# labname
labname = self._encode(_pad_bytes(self.labname[:32], 33))
bio.write(labname)
# padding - 3 bytes
for i in range(3):
bio.write(struct.pack('c', null_byte))
# value_label_table
# n - int32
bio.write(struct.pack(byteorder + 'i', self.n))
# textlen - int32
bio.write(struct.pack(byteorder + 'i', self.text_len))
# off - int32 array (n elements)
for offset in self.off:
bio.write(struct.pack(byteorder + 'i', offset))
# val - int32 array (n elements)
for value in self.val:
bio.write(struct.pack(byteorder + 'i', value))
# txt - Text labels, null terminated
for text in self.txt:
bio.write(self._encode(text + null_string))
bio.seek(0)
return bio.read()
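The offset and length bookkeeping in `__init__` above (each label null-terminated, offsets indexing into one concatenated text block) can be checked standalone. A sketch assuming string labels (`value_label_sizes` is an illustrative helper):

```python
def value_label_sizes(labels):
    # Mirrors the off / text_len / len computation above: each label is
    # null-terminated, and offsets point into the concatenated text block.
    off, text_len = [], 0
    for lab in labels:
        off.append(text_len)
        text_len += len(lab) + 1  # +1 for the terminating null byte
    n = len(labels)
    # len = n (int32) + textlen (int32) + off[] + val[] + text block
    total = 4 + 4 + 4 * n + 4 * n + text_len
    return off, text_len, total
```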
class StataMissingValue(StringMixin):
"""
An observation's missing value.
Parameters
-----------
value : int8, int16, int32, float32 or float64
The Stata missing value code
Attributes
----------
string : string
String representation of the Stata missing value
value : int8, int16, int32, float32 or float64
The original encoded missing value
Notes
-----
More information: <http://www.stata.com/help.cgi?missing>
Integer missing values map the codes '.', '.a', ..., '.z' to the ranges
101 ... 127 (for int8), 32741 ... 32767 (for int16) and 2147483621 ...
2147483647 (for int32). Missing values for floating point data types are
more complex but the pattern is simple to discern from the following table.
np.float32 missing values (float in Stata)
0000007f .
0008007f .a
0010007f .b
...
00c0007f .x
00c8007f .y
00d0007f .z
np.float64 missing values (double in Stata)
000000000000e07f .
000000000001e07f .a
000000000002e07f .b
...
000000000018e07f .x
000000000019e07f .y
00000000001ae07f .z
"""
# Construct a dictionary of missing values
MISSING_VALUES = {}
bases = (101, 32741, 2147483621)
for b in bases:
# Conversion to long to avoid hash issues on 32 bit platforms #8968
MISSING_VALUES[compat.long(b)] = '.'
for i in range(1, 27):
MISSING_VALUES[compat.long(i + b)] = '.' + chr(96 + i)
float32_base = b'\x00\x00\x00\x7f'
increment = struct.unpack('<i', b'\x00\x08\x00\x00')[0]
for i in range(27):
value = struct.unpack('<f', float32_base)[0]
MISSING_VALUES[value] = '.'
if i > 0:
MISSING_VALUES[value] += chr(96 + i)
int_value = struct.unpack('<i', struct.pack('<f', value))[0] + increment
float32_base = struct.pack('<i', int_value)
float64_base = b'\x00\x00\x00\x00\x00\x00\xe0\x7f'
increment = struct.unpack('q', b'\x00\x00\x00\x00\x00\x01\x00\x00')[0]
for i in range(27):
value = struct.unpack('<d', float64_base)[0]
MISSING_VALUES[value] = '.'
if i > 0:
MISSING_VALUES[value] += chr(96 + i)
int_value = struct.unpack('q', struct.pack('<d', value))[0] + increment
float64_base = struct.pack('q', int_value)
BASE_MISSING_VALUES = {'int8': 101,
'int16': 32741,
'int32': 2147483621,
'float32': struct.unpack('<f', float32_base)[0],
'float64': struct.unpack('<d', float64_base)[0]}
def __init__(self, value):
self._value = value
# Conversion to long to avoid hash issues on 32 bit platforms #8968
value = compat.long(value) if value < 2147483648 else float(value)
self._str = self.MISSING_VALUES[value]
string = property(lambda self: self._str,
doc="The Stata representation of the missing value: "
"'.', '.a'..'.z'")
value = property(lambda self: self._value,
doc='The binary representation of the missing value.')
def __unicode__(self):
return self.string
def __repr__(self):
# not perfect :-/
return "%s(%s)" % (self.__class__, self)
def __eq__(self, other):
return (isinstance(other, self.__class__)
and self.string == other.string and self.value == other.value)
@classmethod
def get_base_missing_value(cls, dtype):
if dtype == np.int8:
value = cls.BASE_MISSING_VALUES['int8']
elif dtype == np.int16:
value = cls.BASE_MISSING_VALUES['int16']
elif dtype == np.int32:
value = cls.BASE_MISSING_VALUES['int32']
elif dtype == np.float32:
value = cls.BASE_MISSING_VALUES['float32']
elif dtype == np.float64:
value = cls.BASE_MISSING_VALUES['float64']
else:
raise ValueError('Unsupported dtype')
return value
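The bit-pattern table in the class docstring can be verified directly: the float32 '.' missing value has byte pattern 0000007f (sign 0, exponent 0xfe, mantissa 0, i.e. exactly 2**127), and each subsequent code ('.a', '.b', ...) steps the integer representation by 0x00000800, exactly as the construction loop does. A stdlib-only check:

```python
import struct

# float32 '.' missing value: bytes 00 00 00 7f little-endian == 2**127.
dot = struct.unpack('<f', b'\x00\x00\x00\x7f')[0]

# Step between consecutive missing codes, as used by the loop above.
step = struct.unpack('<i', b'\x00\x08\x00\x00')[0]

# Reinterpret as int, add the step, reinterpret as float -> the '.a' code.
int_bits = struct.unpack('<i', struct.pack('<f', dot))[0]
dot_a = struct.unpack('<f', struct.pack('<i', int_bits + step))[0]
```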
class StataParser(object):
_default_encoding = 'iso-8859-1'
def __init__(self, encoding):
self._encoding = encoding
#type code.
#--------------------
#str1 1 = 0x01
#str2 2 = 0x02
#...
#str244 244 = 0xf4
#byte 251 = 0xfb (sic)
#int 252 = 0xfc
#long 253 = 0xfd
#float 254 = 0xfe
#double 255 = 0xff
#--------------------
#NOTE: the byte type seems to be reserved for categorical variables
# with a label, but the underlying variable is -127 to 100
# we're going to drop the label and cast to int
self.DTYPE_MAP = \
dict(
lzip(range(1, 245), ['a' + str(i) for i in range(1, 245)]) +
[
(251, np.int8),
(252, np.int16),
(253, np.int32),
(254, np.float32),
(255, np.float64)
]
)
self.DTYPE_MAP_XML = \
dict(
[
(32768, np.uint8), # Keys to GSO
(65526, np.float64),
(65527, np.float32),
(65528, np.int32),
(65529, np.int16),
(65530, np.int8)
]
)
self.TYPE_MAP = lrange(251) + list('bhlfd')
self.TYPE_MAP_XML = \
dict(
[
(32768, 'Q'), # Not really a Q, unclear how to handle byteswap
(65526, 'd'),
(65527, 'f'),
(65528, 'l'),
(65529, 'h'),
(65530, 'b')
]
)
#NOTE: technically, some of these are wrong. there are more numbers
# that can be represented. it's the 27 ABOVE and BELOW the max listed
# numeric data type in [U] 12.2.2 of the 11.2 manual
float32_min = b'\xff\xff\xff\xfe'
float32_max = b'\xff\xff\xff\x7e'
float64_min = b'\xff\xff\xff\xff\xff\xff\xef\xff'
float64_max = b'\xff\xff\xff\xff\xff\xff\xdf\x7f'
self.VALID_RANGE = \
{
'b': (-127, 100),
'h': (-32767, 32740),
'l': (-2147483647, 2147483620),
'f': (np.float32(struct.unpack('<f', float32_min)[0]),
np.float32(struct.unpack('<f', float32_max)[0])),
'd': (np.float64(struct.unpack('<d', float64_min)[0]),
np.float64(struct.unpack('<d', float64_max)[0]))
}
self.OLD_TYPE_MAPPING = \
{
'i': 252,
'f': 254,
'b': 251
}
# These missing values are the generic '.' in Stata, and are used
# to replace nans
self.MISSING_VALUES = \
{
'b': 101,
'h': 32741,
'l': 2147483621,
'f': np.float32(struct.unpack('<f', b'\x00\x00\x00\x7f')[0]),
'd': np.float64(struct.unpack('<d', b'\x00\x00\x00\x00\x00\x00\xe0\x7f')[0])
}
self.NUMPY_TYPE_MAP = \
{
'b': 'i1',
'h': 'i2',
'l': 'i4',
'f': 'f4',
'd': 'f8',
'Q': 'u8'
}
# Reserved words cannot be used as variable names
self.RESERVED_WORDS = ('aggregate', 'array', 'boolean', 'break',
'byte', 'case', 'catch', 'class', 'colvector',
'complex', 'const', 'continue', 'default',
'delegate', 'delete', 'do', 'double', 'else',
'eltypedef', 'end', 'enum', 'explicit',
'export', 'external', 'float', 'for', 'friend',
'function', 'global', 'goto', 'if', 'inline',
'int', 'local', 'long', 'NULL', 'pragma',
'protected', 'quad', 'rowvector', 'short',
'typedef', 'typename', 'virtual')
def _decode_bytes(self, str, errors=None):
if compat.PY3 or self._encoding is not None:
return str.decode(self._encoding, errors)
else:
return str
class StataReader(StataParser):
__doc__ = _stata_reader_doc
def __init__(self, path_or_buf, convert_dates=True,
convert_categoricals=True, index=None,
convert_missing=False, preserve_dtypes=True,
columns=None, order_categoricals=True,
encoding='iso-8859-1', chunksize=None):
super(StataReader, self).__init__(encoding)
self.col_sizes = ()
# Arguments to the reader (can be temporarily overridden in
# calls to read).
self._convert_dates = convert_dates
self._convert_categoricals = convert_categoricals
self._index = index
self._convert_missing = convert_missing
self._preserve_dtypes = preserve_dtypes
self._columns = columns
self._order_categoricals = order_categoricals
self._encoding = encoding
self._chunksize = chunksize
# State variables for the file
self._has_string_data = False
self._missing_values = False
self._can_read_value_labels = False
self._column_selector_set = False
self._value_labels_read = False
self._data_read = False
self._dtype = None
self._lines_read = 0
self._native_byteorder = _set_endianness(sys.byteorder)
if isinstance(path_or_buf, str):
path_or_buf, encoding, _ = get_filepath_or_buffer(
path_or_buf, encoding=self._default_encoding
)
if isinstance(path_or_buf, (str, compat.text_type, bytes)):
self.path_or_buf = open(path_or_buf, 'rb')
else:
# Copy to BytesIO, and ensure no encoding
contents = path_or_buf.read()
try:
contents = contents.encode(self._default_encoding)
except:
pass
self.path_or_buf = BytesIO(contents)
self._read_header()
def __enter__(self):
""" enter context manager """
return self
def __exit__(self, exc_type, exc_value, traceback):
""" exit context manager """
self.close()
def close(self):
""" close the handle if its open """
try:
self.path_or_buf.close()
except IOError:
pass
def _read_header(self):
first_char = self.path_or_buf.read(1)
if struct.unpack('c', first_char)[0] == b'<':
self._read_new_header(first_char)
else:
self._read_old_header(first_char)
self.has_string_data = len([x for x in self.typlist
if type(x) is int]) > 0
# calculate size of a data record
self.col_sizes = lmap(lambda x: self._calcsize(x), self.typlist)
# remove format details from %td
self.fmtlist = ["%td" if x.startswith("%td") else x for x in self.fmtlist]
def _read_new_header(self, first_char):
# The first part of the header is common to 117 and 118.
self.path_or_buf.read(27) # stata_dta><header><release>
self.format_version = int(self.path_or_buf.read(3))
if self.format_version not in [117, 118]:
raise ValueError(_version_error)
self.path_or_buf.read(21) # </release><byteorder>
self.byteorder = self.path_or_buf.read(3) == b"MSF" and '>' or '<'
self.path_or_buf.read(15) # </byteorder><K>
self.nvar = struct.unpack(self.byteorder + 'H',
self.path_or_buf.read(2))[0]
self.path_or_buf.read(7) # </K><N>
self.nobs = self._get_nobs()
self.path_or_buf.read(11) # </N><label>
self.data_label = self._get_data_label()
self.path_or_buf.read(19) # </label><timestamp>
self.time_stamp = self._get_time_stamp()
self.path_or_buf.read(26) # </timestamp></header><map>
self.path_or_buf.read(8) # 0x0000000000000000
self.path_or_buf.read(8) # position of <map>
self._seek_vartypes = struct.unpack(
self.byteorder + 'q', self.path_or_buf.read(8))[0] + 16
self._seek_varnames = struct.unpack(
self.byteorder + 'q', self.path_or_buf.read(8))[0] + 10
self._seek_sortlist = struct.unpack(
self.byteorder + 'q', self.path_or_buf.read(8))[0] + 10
self._seek_formats = struct.unpack(
self.byteorder + 'q', self.path_or_buf.read(8))[0] + 9
self._seek_value_label_names = struct.unpack(
self.byteorder + 'q', self.path_or_buf.read(8))[0] + 19
# Requires version-specific treatment
self._seek_variable_labels = self._get_seek_variable_labels()
self.path_or_buf.read(8) # <characteristics>
self.data_location = struct.unpack(
self.byteorder + 'q', self.path_or_buf.read(8))[0] + 6
self.seek_strls = struct.unpack(
self.byteorder + 'q', self.path_or_buf.read(8))[0] + 7
self.seek_value_labels = struct.unpack(
self.byteorder + 'q', self.path_or_buf.read(8))[0] + 14
self.typlist, self.dtyplist = self._get_dtypes(self._seek_vartypes)
self.path_or_buf.seek(self._seek_varnames)
self.varlist = self._get_varlist()
self.path_or_buf.seek(self._seek_sortlist)
self.srtlist = struct.unpack(
self.byteorder + ('h' * (self.nvar + 1)),
self.path_or_buf.read(2 * (self.nvar + 1))
)[:-1]
self.path_or_buf.seek(self._seek_formats)
self.fmtlist = self._get_fmtlist()
self.path_or_buf.seek(self._seek_value_label_names)
self.lbllist = self._get_lbllist()
self.path_or_buf.seek(self._seek_variable_labels)
self.vlblist = self._get_vlblist()
# Get data type information, works for versions 117-118.
def _get_dtypes(self, seek_vartypes):
self.path_or_buf.seek(seek_vartypes)
raw_typlist = [struct.unpack(self.byteorder + 'H',
self.path_or_buf.read(2))[0]
for i in range(self.nvar)]
def f(typ):
if typ <= 2045:
return typ
try:
return self.TYPE_MAP_XML[typ]
except KeyError:
raise ValueError("cannot convert stata types [{0}]".
format(typ))
typlist = [f(x) for x in raw_typlist]
def f(typ):
if typ <= 2045:
return str(typ)
try:
return self.DTYPE_MAP_XML[typ]
except KeyError:
raise ValueError("cannot convert stata dtype [{0}]"
.format(typ))
dtyplist = [f(x) for x in raw_typlist]
return typlist, dtyplist
def _get_varlist(self):
if self.format_version == 117:
b = 33
elif self.format_version == 118:
b = 129
return [self._null_terminate(self.path_or_buf.read(b))
for i in range(self.nvar)]
# Returns the format list
def _get_fmtlist(self):
if self.format_version == 118:
b = 57
elif self.format_version > 113:
b = 49
elif self.format_version > 104:
b = 12
else:
b = 7
return [self._null_terminate(self.path_or_buf.read(b))
for i in range(self.nvar)]
# Returns the label list
def _get_lbllist(self):
if self.format_version >= 118:
b = 129
elif self.format_version > 108:
b = 33
else:
b = 9
return [self._null_terminate(self.path_or_buf.read(b))
for i in range(self.nvar)]
def _get_vlblist(self):
if self.format_version == 118:
vlblist = [self._decode(self.path_or_buf.read(321))
for i in range(self.nvar)]
elif self.format_version > 105:
vlblist = [self._null_terminate(self.path_or_buf.read(81))
for i in range(self.nvar)]
else:
vlblist = [self._null_terminate(self.path_or_buf.read(32))
for i in range(self.nvar)]
return vlblist
def _get_nobs(self):
if self.format_version == 118:
return struct.unpack(self.byteorder + 'Q',
self.path_or_buf.read(8))[0]
else:
return struct.unpack(self.byteorder + 'I',
self.path_or_buf.read(4))[0]
def _get_data_label(self):
if self.format_version == 118:
strlen = struct.unpack(self.byteorder + 'H', self.path_or_buf.read(2))[0]
return self._decode(self.path_or_buf.read(strlen))
elif self.format_version == 117:
strlen = struct.unpack('b', self.path_or_buf.read(1))[0]
return self._null_terminate(self.path_or_buf.read(strlen))
elif self.format_version > 105:
return self._null_terminate(self.path_or_buf.read(81))
else:
return self._null_terminate(self.path_or_buf.read(32))
def _get_time_stamp(self):
if self.format_version == 118:
strlen = struct.unpack('b', self.path_or_buf.read(1))[0]
return self.path_or_buf.read(strlen).decode("utf-8")
elif self.format_version == 117:
strlen = struct.unpack('b', self.path_or_buf.read(1))[0]
return self._null_terminate(self.path_or_buf.read(strlen))
elif self.format_version > 104:
return self._null_terminate(self.path_or_buf.read(18))
else:
raise ValueError("Unsupported format version for time stamp")
def _get_seek_variable_labels(self):
if self.format_version == 117:
self.path_or_buf.read(8) # <variable_labels>, throw away
# Stata 117 data files do not follow the described format. This is
# a work around that uses the previous label, 33 bytes for each
# variable, 20 for the closing tag and 17 for the opening tag
return self._seek_value_label_names + (33*self.nvar) + 20 + 17
elif self.format_version == 118:
return struct.unpack(self.byteorder + 'q', self.path_or_buf.read(8))[0] + 17
else:
raise ValueError("Only format versions 117 and 118 are supported")
def _read_old_header(self, first_char):
self.format_version = struct.unpack('b', first_char)[0]
if self.format_version not in [104, 105, 108, 113, 114, 115]:
raise ValueError(_version_error)
self.byteorder = '>' if struct.unpack('b', self.path_or_buf.read(1))[0] == 0x1 else '<'
self.filetype = struct.unpack('b', self.path_or_buf.read(1))[0]
self.path_or_buf.read(1) # unused
self.nvar = struct.unpack(self.byteorder + 'H',
self.path_or_buf.read(2))[0]
self.nobs = self._get_nobs()
self.data_label = self._get_data_label()
self.time_stamp = self._get_time_stamp()
# descriptors
if self.format_version > 108:
typlist = [ord(self.path_or_buf.read(1))
for i in range(self.nvar)]
else:
typlist = [
self.OLD_TYPE_MAPPING[
self._decode_bytes(self.path_or_buf.read(1))
] for i in range(self.nvar)
]
try:
self.typlist = [self.TYPE_MAP[typ] for typ in typlist]
except Exception:
raise ValueError("cannot convert stata types [{0}]"
.format(','.join(str(t) for t in typlist)))
try:
self.dtyplist = [self.DTYPE_MAP[typ] for typ in typlist]
except Exception:
raise ValueError("cannot convert stata dtypes [{0}]"
.format(','.join(str(t) for t in typlist)))
if self.format_version > 108:
self.varlist = [self._null_terminate(self.path_or_buf.read(33))
for i in range(self.nvar)]
else:
self.varlist = [self._null_terminate(self.path_or_buf.read(9))
for i in range(self.nvar)]
self.srtlist = struct.unpack(
self.byteorder + ('h' * (self.nvar + 1)),
self.path_or_buf.read(2 * (self.nvar + 1))
)[:-1]
self.fmtlist = self._get_fmtlist()
self.lbllist = self._get_lbllist()
self.vlblist = self._get_vlblist()
# ignore expansion fields (Format 105 and later)
# When reading, read five bytes; the last four bytes now tell you
# the size of the next read, which you discard. You then continue
# like this until you read 5 bytes of zeros.
if self.format_version > 104:
while True:
data_type = struct.unpack(self.byteorder + 'b',
self.path_or_buf.read(1))[0]
if self.format_version > 108:
data_len = struct.unpack(self.byteorder + 'i',
self.path_or_buf.read(4))[0]
else:
data_len = struct.unpack(self.byteorder + 'h',
self.path_or_buf.read(2))[0]
if data_type == 0:
break
self.path_or_buf.read(data_len)
# necessary data to continue parsing
self.data_location = self.path_or_buf.tell()
def _calcsize(self, fmt):
return (fmt if isinstance(fmt, int)
else struct.calcsize(self.byteorder + fmt))
def _decode(self, s):
s = s.partition(b"\0")[0]
return s.decode('utf-8')
def _null_terminate(self, s):
if compat.PY3 or self._encoding is not None: # have bytes not strings,
# so must decode
s = s.partition(b"\0")[0]
return s.decode(self._encoding or self._default_encoding)
else:
null_byte = "\0"
try:
return s.lstrip(null_byte)[:s.index(null_byte)]
except ValueError:  # no null byte present
return s
def _read_value_labels(self):
if self.format_version <= 108:
# Value labels are not supported in version 108 and earlier.
return
if self._value_labels_read:
# Don't read twice
return
if self.format_version >= 117:
self.path_or_buf.seek(self.seek_value_labels)
else:
offset = self.nobs * self._dtype.itemsize
self.path_or_buf.seek(self.data_location + offset)
self._value_labels_read = True
self.value_label_dict = dict()
while True:
if self.format_version >= 117:
if self.path_or_buf.read(5) == b'</val': # end tag b'</value_labels>'; otherwise b'<lbl>'
break # end of variable label table
slength = self.path_or_buf.read(4)
if not slength:
break # end of variable label table (format < 117)
if self.format_version <= 117:
labname = self._null_terminate(self.path_or_buf.read(33))
else:
labname = self._decode(self.path_or_buf.read(129))
self.path_or_buf.read(3) # padding
n = struct.unpack(self.byteorder + 'I',
self.path_or_buf.read(4))[0]
txtlen = struct.unpack(self.byteorder + 'I',
self.path_or_buf.read(4))[0]
off = []
for i in range(n):
off.append(struct.unpack(self.byteorder + 'I',
self.path_or_buf.read(4))[0])
val = []
for i in range(n):
val.append(struct.unpack(self.byteorder + 'I',
self.path_or_buf.read(4))[0])
txt = self.path_or_buf.read(txtlen)
self.value_label_dict[labname] = dict()
for i in range(n):
if self.format_version <= 117:
self.value_label_dict[labname][val[i]] = (
self._null_terminate(txt[off[i]:])
)
else:
self.value_label_dict[labname][val[i]] = (
self._decode(txt[off[i]:])
)
if self.format_version >= 117:
self.path_or_buf.read(6) # </lbl>
self._value_labels_read = True
def _read_strls(self):
self.path_or_buf.seek(self.seek_strls)
self.GSO = {0 : ''}
while True:
if self.path_or_buf.read(3) != b'GSO':
break
if self.format_version == 117:
v_o = struct.unpack(self.byteorder + 'Q', self.path_or_buf.read(8))[0]
else:
buf = self.path_or_buf.read(12)
# Only tested on little endian file on little endian machine.
if self.byteorder == '<':
buf = buf[0:2] + buf[4:10]
else:
buf = buf[0:2] + buf[6:]
v_o = struct.unpack('Q', buf)[0]
typ = struct.unpack('B', self.path_or_buf.read(1))[0]
length = struct.unpack(self.byteorder + 'I',
self.path_or_buf.read(4))[0]
va = self.path_or_buf.read(length)
if typ == 130:
encoding = 'utf-8'
if self.format_version == 117:
encoding = self._encoding or self._default_encoding
va = va[0:-1].decode(encoding)
self.GSO[v_o] = va
# legacy
@Appender('DEPRECATED: ' + _data_method_doc)
def data(self, **kwargs):
import warnings
warnings.warn("'data' is deprecated, use 'read' instead")
if self._data_read:
raise Exception("Data has already been read.")
self._data_read = True
return self.read(None, **kwargs)
def __iter__(self):
try:
if self._chunksize:
while True:
yield self.read(self._chunksize)
else:
yield self.read()
except StopIteration:
pass
def get_chunk(self, size=None):
"""
Reads lines from Stata file and returns as dataframe
Parameters
----------
size : int, defaults to None
Number of lines to read. If None, reads whole file.
Returns
-------
DataFrame
"""
if size is None:
size = self._chunksize
return self.read(nrows=size)
@Appender(_read_method_doc)
def read(self, nrows=None, convert_dates=None,
convert_categoricals=None, index=None,
convert_missing=None, preserve_dtypes=None,
columns=None, order_categoricals=None):
# Handle empty file or chunk. If reading incrementally raise
# StopIteration. If reading the whole thing return an empty
# data frame.
if (self.nobs == 0) and (nrows is None):
self._can_read_value_labels = True
self._data_read = True
return DataFrame(columns=self.varlist)
# Handle options
if convert_dates is None:
convert_dates = self._convert_dates
if convert_categoricals is None:
convert_categoricals = self._convert_categoricals
if convert_missing is None:
convert_missing = self._convert_missing
if preserve_dtypes is None:
preserve_dtypes = self._preserve_dtypes
if columns is None:
columns = self._columns
if order_categoricals is None:
order_categoricals = self._order_categoricals
if nrows is None:
nrows = self.nobs
if (self.format_version >= 117) and (self._dtype is None):
self._can_read_value_labels = True
self._read_strls()
# Setup the dtype.
if self._dtype is None:
dtype = [] # Convert struct data types to numpy data type
for i, typ in enumerate(self.typlist):
if typ in self.NUMPY_TYPE_MAP:
dtype.append(('s' + str(i), self.byteorder + self.NUMPY_TYPE_MAP[typ]))
else:
dtype.append(('s' + str(i), 'S' + str(typ)))
dtype = np.dtype(dtype)
self._dtype = dtype
# Read data
dtype = self._dtype
max_read_len = (self.nobs - self._lines_read) * dtype.itemsize
read_len = nrows * dtype.itemsize
read_len = min(read_len, max_read_len)
if read_len <= 0:
# Iterator has finished, should never be here unless
# we are reading the file incrementally
self._read_value_labels()
raise StopIteration
offset = self._lines_read * dtype.itemsize
self.path_or_buf.seek(self.data_location + offset)
read_lines = min(nrows, self.nobs - self._lines_read)
data = np.frombuffer(self.path_or_buf.read(read_len), dtype=dtype,
count=read_lines)
self._lines_read += read_lines
if self._lines_read == self.nobs:
self._can_read_value_labels = True
self._data_read = True
# if necessary, swap the byte order to native here
if self.byteorder != self._native_byteorder:
data = data.byteswap().newbyteorder()
if convert_categoricals:
self._read_value_labels()
if len(data) == 0:
data = DataFrame(columns=self.varlist, index=index)
else:
data = DataFrame.from_records(data, index=index)
data.columns = self.varlist
# If index is not specified, use actual row number rather than
# restarting at 0 for each chunk.
if index is None:
ix = np.arange(self._lines_read - read_lines, self._lines_read)
data = data.set_index(ix)
if columns is not None:
data = self._do_select_columns(data, columns)
# Decode strings
for col, typ in zip(data, self.typlist):
if type(typ) is int:
data[col] = data[col].apply(self._null_terminate, convert_dtype=True)
data = self._insert_strls(data)
cols_ = np.where(self.dtyplist)[0]
# Convert columns (if needed) to match input type
index = data.index
requires_type_conversion = False
data_formatted = []
for i in cols_:
if self.dtyplist[i] is not None:
col = data.columns[i]
dtype = data[col].dtype
if (dtype != np.dtype(object)) and (dtype != self.dtyplist[i]):
requires_type_conversion = True
data_formatted.append((col, Series(data[col], index, self.dtyplist[i])))
else:
data_formatted.append((col, data[col]))
if requires_type_conversion:
data = DataFrame.from_items(data_formatted)
del data_formatted
self._do_convert_missing(data, convert_missing)
if convert_dates:
cols = np.where(lmap(lambda x: x in _date_formats,
self.fmtlist))[0]
for i in cols:
col = data.columns[i]
data[col] = _stata_elapsed_date_to_datetime_vec(data[col], self.fmtlist[i])
if convert_categoricals and self.value_label_dict:
data = self._do_convert_categoricals(data, self.value_label_dict, self.lbllist,
order_categoricals)
if not preserve_dtypes:
retyped_data = []
convert = False
for col in data:
dtype = data[col].dtype
if dtype in (np.float16, np.float32):
dtype = np.float64
convert = True
elif dtype in (np.int8, np.int16, np.int32):
dtype = np.int64
convert = True
retyped_data.append((col, data[col].astype(dtype)))
if convert:
data = DataFrame.from_items(retyped_data)
return data
def _do_convert_missing(self, data, convert_missing):
# Check for missing values, and replace if found
for i, colname in enumerate(data):
fmt = self.typlist[i]
if fmt not in self.VALID_RANGE:
continue
nmin, nmax = self.VALID_RANGE[fmt]
series = data[colname]
missing = np.logical_or(series < nmin, series > nmax)
if not missing.any():
continue
if convert_missing: # Replacement follows Stata notation
missing_loc = np.argwhere(missing)
umissing, umissing_loc = np.unique(series[missing],
return_inverse=True)
replacement = Series(series, dtype=object)
for j, um in enumerate(umissing):
missing_value = StataMissingValue(um)
loc = missing_loc[umissing_loc == j]
replacement.iloc[loc] = missing_value
else: # All replacements are identical
dtype = series.dtype
if dtype not in (np.float32, np.float64):
dtype = np.float64
replacement = Series(series, dtype=dtype)
replacement[missing] = np.nan
data[colname] = replacement
def _insert_strls(self, data):
if not hasattr(self, 'GSO') or len(self.GSO) == 0:
return data
for i, typ in enumerate(self.typlist):
if typ != 'Q':
continue
data.iloc[:, i] = [self.GSO[k] for k in data.iloc[:, i]]
return data
def _do_select_columns(self, data, columns):
if not self._column_selector_set:
column_set = set(columns)
if len(column_set) != len(columns):
raise ValueError('columns contains duplicate entries')
unmatched = column_set.difference(data.columns)
if unmatched:
raise ValueError('The following columns were not found in the '
'Stata data set: ' +
', '.join(list(unmatched)))
# Copy information for retained columns for later processing
dtyplist = []
typlist = []
fmtlist = []
lbllist = []
for col in columns:
i = data.columns.get_loc(col)
dtyplist.append(self.dtyplist[i])
typlist.append(self.typlist[i])
fmtlist.append(self.fmtlist[i])
lbllist.append(self.lbllist[i])
self.dtyplist = dtyplist
self.typlist = typlist
self.fmtlist = fmtlist
self.lbllist = lbllist
self._column_selector_set = True
return data[columns]
def _do_convert_categoricals(self, data, value_label_dict, lbllist, order_categoricals):
"""
Converts categorical columns to Categorical type.
"""
value_labels = list(compat.iterkeys(value_label_dict))
cat_converted_data = []
for col, label in zip(data, lbllist):
if label in value_labels:
# Explicit call with ordered=True
cat_data = Categorical(data[col], ordered=order_categoricals)
categories = []
for category in cat_data.categories:
if category in value_label_dict[label]:
categories.append(value_label_dict[label][category])
else:
categories.append(category) # Partially labeled
cat_data.categories = categories
# TODO: is the next line needed above in the data(...) method?
cat_data = Series(cat_data, index=data.index)
cat_converted_data.append((col, cat_data))
else:
cat_converted_data.append((col, data[col]))
data = DataFrame.from_items(cat_converted_data)
return data
def data_label(self):
"""Returns data label of Stata file"""
return self.data_label
def variable_labels(self):
"""Returns variable labels as a dict, associating each variable name
with corresponding label
"""
return dict(zip(self.varlist, self.vlblist))
def value_labels(self):
"""Returns a dict, associating each variable name a dict, associating
each value its corresponding label
"""
if not self._value_labels_read:
self._read_value_labels()
return self.value_label_dict
def _open_file_binary_write(fname, encoding):
if hasattr(fname, 'write'):
#if 'b' not in fname.mode:
return fname
return open(fname, "wb")
def _set_endianness(endianness):
if endianness.lower() in ["<", "little"]:
return "<"
elif endianness.lower() in [">", "big"]:
return ">"
else: # pragma : no cover
raise ValueError("Endianness %s not understood" % endianness)
def _pad_bytes(name, length):
"""
Takes a char string and pads it with null bytes until it is `length` characters long
"""
return name + "\x00" * (length - len(name))
def _convert_datetime_to_stata_type(fmt):
"""
Converts from one of the stata date formats to a type in TYPE_MAP
"""
if fmt in ["tc", "%tc", "td", "%td", "tw", "%tw", "tm", "%tm", "tq",
"%tq", "th", "%th", "ty", "%ty"]:
return np.float64 # Stata expects doubles for SIFs
else:
raise ValueError("fmt %s not understood" % fmt)
def _maybe_convert_to_int_keys(convert_dates, varlist):
new_dict = {}
for key in convert_dates:
if not convert_dates[key].startswith("%"): # make sure proper fmts
convert_dates[key] = "%" + convert_dates[key]
if key in varlist:
new_dict.update({varlist.index(key): convert_dates[key]})
else:
if not isinstance(key, int):
raise ValueError(
"convert_dates key is not in varlist and is not an int"
)
new_dict.update({key: convert_dates[key]})
return new_dict
def _dtype_to_stata_type(dtype, column):
"""
Converts dtype types to stata types. Returns the byte of the given ordinal.
See TYPE_MAP and comments for an explanation. This is also explained in
the dta spec.
1 - 244 are strings of this length
Pandas Stata
251 - chr(251) - for int8 byte
252 - chr(252) - for int16 int
253 - chr(253) - for int32 long
254 - chr(254) - for float32 float
255 - chr(255) - for double double
If there are dates to convert, then dtype will already have the correct
type inserted.
"""
# TODO: expand to handle datetime to integer conversion
if dtype.type == np.string_:
return chr(dtype.itemsize)
elif dtype.type == np.object_: # try to coerce it to the biggest string
# not memory efficient, what else could we
# do?
itemsize = max_len_string_array(com._ensure_object(column.values))
return chr(max(itemsize, 1))
elif dtype == np.float64:
return chr(255)
elif dtype == np.float32:
return chr(254)
elif dtype == np.int32:
return chr(253)
elif dtype == np.int16:
return chr(252)
elif dtype == np.int8:
return chr(251)
else: # pragma : no cover
raise ValueError("Data type %s not currently understood. "
"Please report an error to the developers." % dtype)
def _dtype_to_default_stata_fmt(dtype, column):
"""
Maps numpy dtype to stata's default format for this type. Not terribly
important since users can change this in Stata. Semantics are
object -> "%DDs" where DD is the length of the string. If not a string,
raise ValueError
float64 -> "%10.0g"
float32 -> "%9.0g"
int64 -> "%9.0g"
int32 -> "%12.0g"
int16 -> "%8.0g"
int8 -> "%8.0g"
"""
# TODO: Refactor to combine type with format
# TODO: expand this to handle a default datetime format?
if dtype.type == np.object_:
inferred_dtype = infer_dtype(column.dropna())
if not (inferred_dtype in ('string', 'unicode')
or len(column) == 0):
raise ValueError('Writing general object arrays is not supported')
itemsize = max_len_string_array(com._ensure_object(column.values))
if itemsize > 244:
raise ValueError(excessive_string_length_error % column.name)
return "%" + str(max(itemsize, 1)) + "s"
elif dtype == np.float64:
return "%10.0g"
elif dtype == np.float32:
return "%9.0g"
elif dtype == np.int32:
return "%12.0g"
elif dtype == np.int8 or dtype == np.int16:
return "%8.0g"
else: # pragma : no cover
raise ValueError("Data type %s not currently understood. "
"Please report an error to the developers." % dtype)
class StataWriter(StataParser):
"""
A class for writing Stata binary dta files from array-like objects
Parameters
----------
fname : file path or buffer
Where to save the dta file.
data : array-like
Array-like input to save. Pandas objects are also accepted.
convert_dates : dict
Dictionary mapping column of datetime types to the stata internal
format that you want to use for the dates. Options are
'tc', 'td', 'tm', 'tw', 'th', 'tq', 'ty'. Column can be either a
number or a name.
encoding : str
Default is latin-1. Note that Stata does not support unicode.
byteorder : str
Can be ">", "<", "little", or "big". The default is None which uses
`sys.byteorder`
time_stamp : datetime
A date time to use when writing the file. Can be None, in which
case the current time is used.
dataset_label : str
A label for the data set. Should be 80 characters or smaller.
Returns
-------
writer : StataWriter instance
The StataWriter instance has a write_file method, which will
write the file to the given `fname`.
Examples
--------
>>> import pandas as pd
>>> data = pd.DataFrame([[1.0, 1]], columns=['a', 'b'])
>>> writer = StataWriter('./data_file.dta', data)
>>> writer.write_file()
Or with dates
>>> from datetime import datetime
>>> data = pd.DataFrame([[datetime(2000,1,1)]], columns=['date'])
>>> writer = StataWriter('./date_data_file.dta', data, {'date' : 'tw'})
>>> writer.write_file()
"""
def __init__(self, fname, data, convert_dates=None, write_index=True,
encoding="latin-1", byteorder=None, time_stamp=None,
data_label=None):
super(StataWriter, self).__init__(encoding)
self._convert_dates = convert_dates
self._write_index = write_index
self._time_stamp = time_stamp
self._data_label = data_label
# attach nobs, nvars, data, varlist, typlist
self._prepare_pandas(data)
if byteorder is None:
byteorder = sys.byteorder
self._byteorder = _set_endianness(byteorder)
self._file = _open_file_binary_write(
fname, self._encoding or self._default_encoding
)
self.type_converters = {253: np.int32, 252: np.int16, 251: np.int8}
def _write(self, to_write):
"""
Helper to call encode before writing to file for Python 3 compat.
"""
if compat.PY3:
self._file.write(to_write.encode(self._encoding or
self._default_encoding))
else:
self._file.write(to_write)
def _prepare_categoricals(self, data):
"""Check for categorical columns, retain categorical information for
Stata file and convert categorical data to int"""
is_cat = [com.is_categorical_dtype(data[col]) for col in data]
self._is_col_cat = is_cat
self._value_labels = []
if not any(is_cat):
return data
get_base_missing_value = StataMissingValue.get_base_missing_value
index = data.index
data_formatted = []
for col, col_is_cat in zip(data, is_cat):
if col_is_cat:
self._value_labels.append(StataValueLabel(data[col]))
dtype = data[col].cat.codes.dtype
if dtype == np.int64:
raise ValueError('It is not possible to export int64-based '
'categorical data to Stata.')
values = data[col].cat.codes.values.copy()
# Upcast if needed so that correct missing values can be set
if values.max() >= get_base_missing_value(dtype):
if dtype == np.int8:
dtype = np.int16
elif dtype == np.int16:
dtype = np.int32
else:
dtype = np.float64
values = np.array(values, dtype=dtype)
# Replace missing values with Stata missing value for type
values[values == -1] = get_base_missing_value(dtype)
data_formatted.append((col, values, index))
else:
data_formatted.append((col, data[col]))
return DataFrame.from_items(data_formatted)
def _replace_nans(self, data):
"""Checks floating point data columns for nans, and replaces these with
the generic Stata missing value (.)"""
for c in data:
dtype = data[c].dtype
if dtype in (np.float32, np.float64):
if dtype == np.float32:
replacement = self.MISSING_VALUES['f']
else:
replacement = self.MISSING_VALUES['d']
data[c] = data[c].fillna(replacement)
return data
def _check_column_names(self, data):
"""Checks column names to ensure that they are valid Stata column names.
This includes checks for:
* Non-string names
* Stata keywords
* Variables that start with numbers
* Variables with names that are too long
When an illegal variable name is detected, it is converted, and if dates
are exported, the variable name is propagated to the date conversion
dictionary
"""
converted_names = []
columns = list(data.columns)
original_columns = columns[:]
duplicate_var_id = 0
for j, name in enumerate(columns):
orig_name = name
if not isinstance(name, string_types):
name = text_type(name)
for c in name:
if (c < 'A' or c > 'Z') and (c < 'a' or c > 'z') and \
(c < '0' or c > '9') and c != '_':
name = name.replace(c, '_')
# Variable name must not be a reserved word
if name in self.RESERVED_WORDS:
name = '_' + name
# Variable name may not start with a number
if name[0] >= '0' and name[0] <= '9':
name = '_' + name
name = name[:min(len(name), 32)]
if not name == orig_name:
# check for duplicates
while columns.count(name) > 0:
# prepend ascending number to avoid duplicates
name = '_' + str(duplicate_var_id) + name
name = name[:min(len(name), 32)]
duplicate_var_id += 1
# need to possibly encode the orig name if its unicode
try:
orig_name = orig_name.encode('utf-8')
except Exception:
pass
converted_names.append('{0} -> {1}'.format(orig_name, name))
columns[j] = name
data.columns = columns
# Check date conversion, and fix key if needed
if self._convert_dates:
for c, o in zip(columns, original_columns):
if c != o:
self._convert_dates[c] = self._convert_dates[o]
del self._convert_dates[o]
if converted_names:
import warnings
ws = invalid_name_doc.format('\n '.join(converted_names))
warnings.warn(ws, InvalidColumnName)
return data
def _prepare_pandas(self, data):
#NOTE: we might need a different API / class for pandas objects so
# we can set different semantics - handle this with a PR to pandas.io
data = data.copy()
if self._write_index:
data = data.reset_index()
# Ensure column names are strings
data = self._check_column_names(data)
# Check columns for compatibility with stata, upcast if necessary
data = _cast_to_stata_types(data)
# Replace NaNs with Stata missing values
data = self._replace_nans(data)
# Convert categoricals to int data, and strip labels
data = self._prepare_categoricals(data)
self.nobs, self.nvar = data.shape
self.data = data
self.varlist = data.columns.tolist()
dtypes = data.dtypes
if self._convert_dates is not None:
self._convert_dates = _maybe_convert_to_int_keys(
self._convert_dates, self.varlist
)
for key in self._convert_dates:
new_type = _convert_datetime_to_stata_type(
self._convert_dates[key]
)
dtypes[key] = np.dtype(new_type)
self.typlist = []
self.fmtlist = []
for col, dtype in dtypes.iteritems():
self.fmtlist.append(_dtype_to_default_stata_fmt(dtype, data[col]))
self.typlist.append(_dtype_to_stata_type(dtype, data[col]))
# set the given format for the datetime cols
if self._convert_dates is not None:
for key in self._convert_dates:
self.fmtlist[key] = self._convert_dates[key]
def write_file(self):
self._write_header(time_stamp=self._time_stamp,
data_label=self._data_label)
self._write_descriptors()
self._write_variable_labels()
# write 5 zeros for expansion fields
self._write(_pad_bytes("", 5))
self._prepare_data()
self._write_data()
self._write_value_labels()
self._file.close()
def _write_value_labels(self):
for vl in self._value_labels:
self._file.write(vl.generate_value_label(self._byteorder,
self._encoding))
def _write_header(self, data_label=None, time_stamp=None):
byteorder = self._byteorder
# ds_format - just use 114
self._file.write(struct.pack("b", 114))
# byteorder
self._write("\x01" if byteorder == ">" else "\x02")
# filetype
self._write("\x01")
# unused
self._write("\x00")
# number of vars, 2 bytes
self._file.write(struct.pack(byteorder+"h", self.nvar)[:2])
# number of obs, 4 bytes
self._file.write(struct.pack(byteorder+"i", self.nobs)[:4])
# data label 81 bytes, char, null terminated
if data_label is None:
self._file.write(self._null_terminate(_pad_bytes("", 80)))
else:
self._file.write(
self._null_terminate(_pad_bytes(data_label[:80], 80))
)
# time stamp, 18 bytes, char, null terminated
# format dd Mon yyyy hh:mm
if time_stamp is None:
time_stamp = datetime.datetime.now()
elif not isinstance(time_stamp, datetime.datetime):
raise ValueError("time_stamp should be datetime type")
self._file.write(
self._null_terminate(time_stamp.strftime("%d %b %Y %H:%M"))
)
def _write_descriptors(self, typlist=None, varlist=None, srtlist=None,
fmtlist=None, lbllist=None):
nvar = self.nvar
# typlist, length nvar, format byte array
for typ in self.typlist:
self._write(typ)
# varlist names are checked by _check_column_names
# varlist, requires null terminated
for name in self.varlist:
name = self._null_terminate(name, True)
name = _pad_bytes(name[:32], 33)
self._write(name)
# srtlist, 2*(nvar+1), int array, encoded by byteorder
srtlist = _pad_bytes("", (2*(nvar+1)))
self._write(srtlist)
# fmtlist, 49*nvar, char array
for fmt in self.fmtlist:
self._write(_pad_bytes(fmt, 49))
# lbllist, 33*nvar, char array
for i in range(nvar):
# Use variable name when categorical
if self._is_col_cat[i]:
name = self.varlist[i]
name = self._null_terminate(name, True)
name = _pad_bytes(name[:32], 33)
self._write(name)
else: # Default is empty label
self._write(_pad_bytes("", 33))
def _write_variable_labels(self, labels=None):
nvar = self.nvar
if labels is None:
for i in range(nvar):
self._write(_pad_bytes("", 81))
def _prepare_data(self):
data = self.data
typlist = self.typlist
convert_dates = self._convert_dates
# 1. Convert dates
if self._convert_dates is not None:
for i, col in enumerate(data):
if i in convert_dates:
data[col] = _datetime_to_stata_elapsed_vec(data[col],
self.fmtlist[i])
# 2. Convert bad string data to '' and pad to correct length
dtype = []
data_cols = []
has_strings = False
for i, col in enumerate(data):
typ = ord(typlist[i])
if typ <= 244:
has_strings = True
data[col] = data[col].fillna('').apply(_pad_bytes, args=(typ,))
stype = 'S%d' % typ
dtype.append(('c'+str(i), stype))
string = data[col].str.encode(self._encoding)
data_cols.append(string.values.astype(stype))
else:
dtype.append(('c'+str(i), data[col].dtype))
data_cols.append(data[col].values)
dtype = np.dtype(dtype)
if has_strings:
self.data = np.fromiter(zip(*data_cols), dtype=dtype)
else:
self.data = data.to_records(index=False)
def _write_data(self):
data = self.data
data.tofile(self._file)
def _null_terminate(self, s, as_string=False):
null_byte = '\x00'
if compat.PY3 and not as_string:
s += null_byte
return s.encode(self._encoding)
else:
s += null_byte
return s
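`_write_header` above serializes fixed-width dta-114 header fields with `struct`. The following standalone sketch packs and unpacks the same prefix layout (format byte, byteorder flag, filetype, unused byte, then `nvar` as a 2-byte int and `nobs` as a 4-byte int); the function names are illustrative and not part of pandas:

```python
# Sketch of the dta-114 header prefix written by StataWriter._write_header.
import struct

def pack_header_prefix(nvar, nobs, byteorder="<"):
    prefix = struct.pack("b", 114)                 # ds_format
    prefix += b"\x02" if byteorder == "<" else b"\x01"  # byteorder flag
    prefix += b"\x01"                              # filetype
    prefix += b"\x00"                              # unused
    prefix += struct.pack(byteorder + "h", nvar)   # number of variables
    prefix += struct.pack(byteorder + "i", nobs)   # number of observations
    return prefix

def unpack_header_prefix(buf):
    byteorder = "<" if buf[1:2] == b"\x02" else ">"
    nvar = struct.unpack(byteorder + "h", buf[4:6])[0]
    nobs = struct.unpack(byteorder + "i", buf[6:10])[0]
    return nvar, nobs

print(unpack_header_prefix(pack_header_prefix(3, 1000)))  # (3, 1000)
```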
| apache-2.0 |
mbayon/TFG-MachineLearning | venv/lib/python3.6/site-packages/pandas/tests/io/test_packers.py | 3 | 32058 |
import pytest
from warnings import catch_warnings
import os
import datetime
import numpy as np
import sys
from distutils.version import LooseVersion
from pandas import compat
from pandas.compat import u, PY3
from pandas import (Series, DataFrame, Panel, MultiIndex, bdate_range,
date_range, period_range, Index, Categorical)
from pandas.errors import PerformanceWarning
from pandas.io.packers import to_msgpack, read_msgpack
import pandas.util.testing as tm
from pandas.util.testing import (ensure_clean,
assert_categorical_equal,
assert_frame_equal,
assert_index_equal,
assert_series_equal,
patch)
from pandas.tests.test_panel import assert_panel_equal
import pandas
from pandas import Timestamp, NaT
from pandas._libs.tslib import iNaT
nan = np.nan
try:
import blosc # NOQA
except ImportError:
_BLOSC_INSTALLED = False
else:
_BLOSC_INSTALLED = True
try:
import zlib # NOQA
except ImportError:
_ZLIB_INSTALLED = False
else:
_ZLIB_INSTALLED = True
@pytest.fixture(scope='module')
def current_packers_data():
# our current version packers data
from pandas.tests.io.generate_legacy_storage_files import (
create_msgpack_data)
return create_msgpack_data()
@pytest.fixture(scope='module')
def all_packers_data():
# all of our current version packers data
from pandas.tests.io.generate_legacy_storage_files import (
create_data)
return create_data()
def check_arbitrary(a, b):
if isinstance(a, (list, tuple)) and isinstance(b, (list, tuple)):
assert(len(a) == len(b))
for a_, b_ in zip(a, b):
check_arbitrary(a_, b_)
elif isinstance(a, Panel):
assert_panel_equal(a, b)
elif isinstance(a, DataFrame):
assert_frame_equal(a, b)
elif isinstance(a, Series):
assert_series_equal(a, b)
elif isinstance(a, Index):
assert_index_equal(a, b)
elif isinstance(a, Categorical):
# Temporary workaround:
# Categorical.categories is changed from str to bytes in PY3,
# possibly the same issue as GH 13591
if PY3 and b.categories.inferred_type == 'string':
pass
else:
tm.assert_categorical_equal(a, b)
elif a is NaT:
assert b is NaT
elif isinstance(a, Timestamp):
assert a == b
assert a.freq == b.freq
else:
assert(a == b)
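check_arbitrary recurses through containers and dispatches to a type-specific comparator at the leaves. A minimal, stdlib-only sketch of that same recurse-then-dispatch shape (the pandas-specific branches are replaced by plain equality; `check_simple` is a hypothetical name, not part of this module):

```python
# Minimal sketch of the recursive comparison dispatch used by check_arbitrary
# above (stdlib only; the Panel/DataFrame/Series branches become plain equality).
def check_simple(a, b):
    if isinstance(a, (list, tuple)) and isinstance(b, (list, tuple)):
        assert len(a) == len(b)
        for a_, b_ in zip(a, b):
            check_simple(a_, b_)
    elif isinstance(a, dict) and isinstance(b, dict):
        assert a.keys() == b.keys()
        for k in a:
            check_simple(a[k], b[k])
    else:
        assert a == b

check_simple([1, (2.0, "x"), {"k": [3]}], [1, (2.0, "x"), {"k": [3]}])
```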
class TestPackers(object):
def setup_method(self, method):
self.path = '__%s__.msg' % tm.rands(10)
def teardown_method(self, method):
pass
def encode_decode(self, x, compress=None, **kwargs):
with ensure_clean(self.path) as p:
to_msgpack(p, x, compress=compress, **kwargs)
return read_msgpack(p, **kwargs)
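encode_decode is the round-trip harness every test in this module leans on: write the object to a scratch file, read it back, and hand the result to a comparator. A self-contained sketch of the same pattern, using stdlib pickle in place of the msgpack I/O so it runs without pandas (`encode_decode_sketch` is an illustrative name, not part of this module):

```python
import os
import pickle
import tempfile

# Round-trip helper in the spirit of encode_decode above, sketched with
# stdlib pickle: serialize to a temp file, read back, clean up.
def encode_decode_sketch(obj):
    fd, path = tempfile.mkstemp(suffix=".pkl")
    os.close(fd)
    try:
        with open(path, "wb") as fh:
            pickle.dump(obj, fh)
        with open(path, "rb") as fh:
            return pickle.load(fh)
    finally:
        os.remove(path)

assert encode_decode_sketch({"a": [1.0, 2.0]}) == {"a": [1.0, 2.0]}
```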
class TestAPI(TestPackers):
def test_string_io(self):
df = DataFrame(np.random.randn(10, 2))
s = df.to_msgpack(None)
result = read_msgpack(s)
tm.assert_frame_equal(result, df)
s = df.to_msgpack()
result = read_msgpack(s)
tm.assert_frame_equal(result, df)
s = df.to_msgpack()
result = read_msgpack(compat.BytesIO(s))
tm.assert_frame_equal(result, df)
s = to_msgpack(None, df)
result = read_msgpack(s)
tm.assert_frame_equal(result, df)
with ensure_clean(self.path) as p:
s = df.to_msgpack()
fh = open(p, 'wb')
fh.write(s)
fh.close()
result = read_msgpack(p)
tm.assert_frame_equal(result, df)
@pytest.mark.xfail(reason="msgpack currently doesn't work with pathlib")
def test_path_pathlib(self):
df = tm.makeDataFrame()
result = tm.round_trip_pathlib(df.to_msgpack, read_msgpack)
tm.assert_frame_equal(df, result)
@pytest.mark.xfail(reason="msgpack currently doesn't work with localpath")
def test_path_localpath(self):
df = tm.makeDataFrame()
result = tm.round_trip_localpath(df.to_msgpack, read_msgpack)
tm.assert_frame_equal(df, result)
def test_iterator_with_string_io(self):
dfs = [DataFrame(np.random.randn(10, 2)) for i in range(5)]
s = to_msgpack(None, *dfs)
for i, result in enumerate(read_msgpack(s, iterator=True)):
tm.assert_frame_equal(result, dfs[i])
def test_invalid_arg(self):
# GH10369
class A(object):
def __init__(self):
self.read = 0
pytest.raises(ValueError, read_msgpack, path_or_buf=None)
pytest.raises(ValueError, read_msgpack, path_or_buf={})
pytest.raises(ValueError, read_msgpack, path_or_buf=A())
class TestNumpy(TestPackers):
def test_numpy_scalar_float(self):
x = np.float32(np.random.rand())
x_rec = self.encode_decode(x)
tm.assert_almost_equal(x, x_rec)
def test_numpy_scalar_complex(self):
x = np.complex64(np.random.rand() + 1j * np.random.rand())
x_rec = self.encode_decode(x)
assert np.allclose(x, x_rec)
def test_scalar_float(self):
x = np.random.rand()
x_rec = self.encode_decode(x)
tm.assert_almost_equal(x, x_rec)
def test_scalar_complex(self):
x = np.random.rand() + 1j * np.random.rand()
x_rec = self.encode_decode(x)
assert np.allclose(x, x_rec)
def test_list_numpy_float(self):
x = [np.float32(np.random.rand()) for i in range(5)]
x_rec = self.encode_decode(x)
# current msgpack cannot distinguish list/tuple
tm.assert_almost_equal(tuple(x), x_rec)
x_rec = self.encode_decode(tuple(x))
tm.assert_almost_equal(tuple(x), x_rec)
def test_list_numpy_float_complex(self):
if not hasattr(np, 'complex128'):
pytest.skip('numpy cant handle complex128')
x = [np.float32(np.random.rand()) for i in range(5)] + \
[np.complex128(np.random.rand() + 1j * np.random.rand())
for i in range(5)]
x_rec = self.encode_decode(x)
assert np.allclose(x, x_rec)
def test_list_float(self):
x = [np.random.rand() for i in range(5)]
x_rec = self.encode_decode(x)
# current msgpack cannot distinguish list/tuple
tm.assert_almost_equal(tuple(x), x_rec)
x_rec = self.encode_decode(tuple(x))
tm.assert_almost_equal(tuple(x), x_rec)
def test_list_float_complex(self):
x = [np.random.rand() for i in range(5)] + \
[(np.random.rand() + 1j * np.random.rand()) for i in range(5)]
x_rec = self.encode_decode(x)
assert np.allclose(x, x_rec)
def test_dict_float(self):
x = {'foo': 1.0, 'bar': 2.0}
x_rec = self.encode_decode(x)
tm.assert_almost_equal(x, x_rec)
def test_dict_complex(self):
x = {'foo': 1.0 + 1.0j, 'bar': 2.0 + 2.0j}
x_rec = self.encode_decode(x)
tm.assert_dict_equal(x, x_rec)
for key in x:
tm.assert_class_equal(x[key], x_rec[key], obj="complex value")
def test_dict_numpy_float(self):
x = {'foo': np.float32(1.0), 'bar': np.float32(2.0)}
x_rec = self.encode_decode(x)
tm.assert_almost_equal(x, x_rec)
def test_dict_numpy_complex(self):
x = {'foo': np.complex128(1.0 + 1.0j),
'bar': np.complex128(2.0 + 2.0j)}
x_rec = self.encode_decode(x)
tm.assert_dict_equal(x, x_rec)
for key in x:
tm.assert_class_equal(x[key], x_rec[key], obj="numpy complex128")
def test_numpy_array_float(self):
# run multiple times
for n in range(10):
x = np.random.rand(10)
for dtype in ['float32', 'float64']:
x = x.astype(dtype)
x_rec = self.encode_decode(x)
tm.assert_almost_equal(x, x_rec)
def test_numpy_array_complex(self):
x = (np.random.rand(5) + 1j * np.random.rand(5)).astype(np.complex128)
x_rec = self.encode_decode(x)
assert (all(map(lambda x, y: x == y, x, x_rec)) and
x.dtype == x_rec.dtype)
def test_list_mixed(self):
x = [1.0, np.float32(3.5), np.complex128(4.25), u('foo')]
x_rec = self.encode_decode(x)
# current msgpack cannot distinguish list/tuple
tm.assert_almost_equal(tuple(x), x_rec)
x_rec = self.encode_decode(tuple(x))
tm.assert_almost_equal(tuple(x), x_rec)
class TestBasic(TestPackers):
def test_timestamp(self):
for i in [Timestamp(
'20130101'), Timestamp('20130101', tz='US/Eastern'),
Timestamp('201301010501')]:
i_rec = self.encode_decode(i)
assert i == i_rec
def test_nat(self):
nat_rec = self.encode_decode(NaT)
assert NaT is nat_rec
def test_datetimes(self):
# fails under 2.6/win32 (np.datetime64 seems broken)
if LooseVersion(sys.version) < '2.7':
pytest.skip('2.6 with np.datetime64 is broken')
for i in [datetime.datetime(2013, 1, 1),
datetime.datetime(2013, 1, 1, 5, 1),
datetime.date(2013, 1, 1),
np.datetime64(datetime.datetime(2013, 1, 5, 2, 15))]:
i_rec = self.encode_decode(i)
assert i == i_rec
def test_timedeltas(self):
for i in [datetime.timedelta(days=1),
datetime.timedelta(days=1, seconds=10),
np.timedelta64(1000000)]:
i_rec = self.encode_decode(i)
assert i == i_rec
class TestIndex(TestPackers):
def setup_method(self, method):
super(TestIndex, self).setup_method(method)
self.d = {
'string': tm.makeStringIndex(100),
'date': tm.makeDateIndex(100),
'int': tm.makeIntIndex(100),
'rng': tm.makeRangeIndex(100),
'float': tm.makeFloatIndex(100),
'empty': Index([]),
'tuple': Index(zip(['foo', 'bar', 'baz'], [1, 2, 3])),
'period': Index(period_range('2012-1-1', freq='M', periods=3)),
'date2': Index(date_range('2013-01-1', periods=10)),
'bdate': Index(bdate_range('2013-01-02', periods=10)),
'cat': tm.makeCategoricalIndex(100)
}
self.mi = {
'reg': MultiIndex.from_tuples([('bar', 'one'), ('baz', 'two'),
('foo', 'two'),
('qux', 'one'), ('qux', 'two')],
names=['first', 'second']),
}
def test_basic_index(self):
for s, i in self.d.items():
i_rec = self.encode_decode(i)
tm.assert_index_equal(i, i_rec)
# datetime with no freq (GH5506)
i = Index([Timestamp('20130101'), Timestamp('20130103')])
i_rec = self.encode_decode(i)
tm.assert_index_equal(i, i_rec)
# datetime with timezone
i = Index([Timestamp('20130101 9:00:00'), Timestamp(
'20130103 11:00:00')]).tz_localize('US/Eastern')
i_rec = self.encode_decode(i)
tm.assert_index_equal(i, i_rec)
def test_multi_index(self):
for s, i in self.mi.items():
i_rec = self.encode_decode(i)
tm.assert_index_equal(i, i_rec)
def test_unicode(self):
i = tm.makeUnicodeIndex(100)
i_rec = self.encode_decode(i)
tm.assert_index_equal(i, i_rec)
    def test_categorical_index(self):
# GH15487
df = DataFrame(np.random.randn(10, 2))
df = df.astype({0: 'category'}).set_index(0)
result = self.encode_decode(df)
tm.assert_frame_equal(result, df)
class TestSeries(TestPackers):
def setup_method(self, method):
super(TestSeries, self).setup_method(method)
self.d = {}
s = tm.makeStringSeries()
s.name = 'string'
self.d['string'] = s
s = tm.makeObjectSeries()
s.name = 'object'
self.d['object'] = s
s = Series(iNaT, dtype='M8[ns]', index=range(5))
self.d['date'] = s
data = {
'A': [0., 1., 2., 3., np.nan],
'B': [0, 1, 0, 1, 0],
'C': ['foo1', 'foo2', 'foo3', 'foo4', 'foo5'],
'D': date_range('1/1/2009', periods=5),
'E': [0., 1, Timestamp('20100101'), 'foo', 2.],
'F': [Timestamp('20130102', tz='US/Eastern')] * 2 +
[Timestamp('20130603', tz='CET')] * 3,
'G': [Timestamp('20130102', tz='US/Eastern')] * 5,
'H': Categorical([1, 2, 3, 4, 5]),
'I': Categorical([1, 2, 3, 4, 5], ordered=True),
}
self.d['float'] = Series(data['A'])
self.d['int'] = Series(data['B'])
self.d['mixed'] = Series(data['E'])
self.d['dt_tz_mixed'] = Series(data['F'])
self.d['dt_tz'] = Series(data['G'])
self.d['cat_ordered'] = Series(data['H'])
self.d['cat_unordered'] = Series(data['I'])
def test_basic(self):
# run multiple times here
for n in range(10):
for s, i in self.d.items():
i_rec = self.encode_decode(i)
assert_series_equal(i, i_rec)
class TestCategorical(TestPackers):
def setup_method(self, method):
super(TestCategorical, self).setup_method(method)
self.d = {}
self.d['plain_str'] = Categorical(['a', 'b', 'c', 'd', 'e'])
self.d['plain_str_ordered'] = Categorical(['a', 'b', 'c', 'd', 'e'],
ordered=True)
self.d['plain_int'] = Categorical([5, 6, 7, 8])
self.d['plain_int_ordered'] = Categorical([5, 6, 7, 8], ordered=True)
def test_basic(self):
# run multiple times here
for n in range(10):
for s, i in self.d.items():
i_rec = self.encode_decode(i)
assert_categorical_equal(i, i_rec)
class TestNDFrame(TestPackers):
def setup_method(self, method):
super(TestNDFrame, self).setup_method(method)
data = {
'A': [0., 1., 2., 3., np.nan],
'B': [0, 1, 0, 1, 0],
'C': ['foo1', 'foo2', 'foo3', 'foo4', 'foo5'],
'D': date_range('1/1/2009', periods=5),
'E': [0., 1, Timestamp('20100101'), 'foo', 2.],
'F': [Timestamp('20130102', tz='US/Eastern')] * 5,
'G': [Timestamp('20130603', tz='CET')] * 5,
'H': Categorical(['a', 'b', 'c', 'd', 'e']),
'I': Categorical(['a', 'b', 'c', 'd', 'e'], ordered=True),
}
self.frame = {
'float': DataFrame(dict(A=data['A'], B=Series(data['A']) + 1)),
'int': DataFrame(dict(A=data['B'], B=Series(data['B']) + 1)),
'mixed': DataFrame(data)}
with catch_warnings(record=True):
self.panel = {
'float': Panel(dict(ItemA=self.frame['float'],
ItemB=self.frame['float'] + 1))}
def test_basic_frame(self):
for s, i in self.frame.items():
i_rec = self.encode_decode(i)
assert_frame_equal(i, i_rec)
def test_basic_panel(self):
with catch_warnings(record=True):
for s, i in self.panel.items():
i_rec = self.encode_decode(i)
assert_panel_equal(i, i_rec)
def test_multi(self):
i_rec = self.encode_decode(self.frame)
for k in self.frame.keys():
assert_frame_equal(self.frame[k], i_rec[k])
l = tuple([self.frame['float'], self.frame['float'].A,
self.frame['float'].B, None])
l_rec = self.encode_decode(l)
check_arbitrary(l, l_rec)
# this is an oddity in that packed lists will be returned as tuples
l = [self.frame['float'], self.frame['float']
.A, self.frame['float'].B, None]
l_rec = self.encode_decode(l)
assert isinstance(l_rec, tuple)
check_arbitrary(l, l_rec)
def test_iterator(self):
l = [self.frame['float'], self.frame['float']
.A, self.frame['float'].B, None]
with ensure_clean(self.path) as path:
to_msgpack(path, *l)
for i, packed in enumerate(read_msgpack(path, iterator=True)):
check_arbitrary(packed, l[i])
def tests_datetimeindex_freq_issue(self):
# GH 5947
# inferring freq on the datetimeindex
df = DataFrame([1, 2, 3], index=date_range('1/1/2013', '1/3/2013'))
result = self.encode_decode(df)
assert_frame_equal(result, df)
df = DataFrame([1, 2], index=date_range('1/1/2013', '1/2/2013'))
result = self.encode_decode(df)
assert_frame_equal(result, df)
def test_dataframe_duplicate_column_names(self):
# GH 9618
expected_1 = DataFrame(columns=['a', 'a'])
expected_2 = DataFrame(columns=[1] * 100)
expected_2.loc[0] = np.random.randn(100)
expected_3 = DataFrame(columns=[1, 1])
expected_3.loc[0] = ['abc', np.nan]
result_1 = self.encode_decode(expected_1)
result_2 = self.encode_decode(expected_2)
result_3 = self.encode_decode(expected_3)
assert_frame_equal(result_1, expected_1)
assert_frame_equal(result_2, expected_2)
assert_frame_equal(result_3, expected_3)
class TestSparse(TestPackers):
def _check_roundtrip(self, obj, comparator, **kwargs):
        # currently these are not implemented
# i_rec = self.encode_decode(obj)
# comparator(obj, i_rec, **kwargs)
pytest.raises(NotImplementedError, self.encode_decode, obj)
def test_sparse_series(self):
s = tm.makeStringSeries()
s[3:5] = np.nan
ss = s.to_sparse()
self._check_roundtrip(ss, tm.assert_series_equal,
check_series_type=True)
ss2 = s.to_sparse(kind='integer')
self._check_roundtrip(ss2, tm.assert_series_equal,
check_series_type=True)
ss3 = s.to_sparse(fill_value=0)
self._check_roundtrip(ss3, tm.assert_series_equal,
check_series_type=True)
def test_sparse_frame(self):
s = tm.makeDataFrame()
s.loc[3:5, 1:3] = np.nan
s.loc[8:10, -2] = np.nan
ss = s.to_sparse()
self._check_roundtrip(ss, tm.assert_frame_equal,
check_frame_type=True)
ss2 = s.to_sparse(kind='integer')
self._check_roundtrip(ss2, tm.assert_frame_equal,
check_frame_type=True)
ss3 = s.to_sparse(fill_value=0)
self._check_roundtrip(ss3, tm.assert_frame_equal,
check_frame_type=True)
class TestCompression(TestPackers):
"""See https://github.com/pandas-dev/pandas/pull/9783
"""
def setup_method(self, method):
try:
from sqlalchemy import create_engine
self._create_sql_engine = create_engine
except ImportError:
self._SQLALCHEMY_INSTALLED = False
else:
self._SQLALCHEMY_INSTALLED = True
super(TestCompression, self).setup_method(method)
data = {
'A': np.arange(1000, dtype=np.float64),
'B': np.arange(1000, dtype=np.int32),
'C': list(100 * 'abcdefghij'),
'D': date_range(datetime.datetime(2015, 4, 1), periods=1000),
'E': [datetime.timedelta(days=x) for x in range(1000)],
}
self.frame = {
'float': DataFrame(dict((k, data[k]) for k in ['A', 'A'])),
'int': DataFrame(dict((k, data[k]) for k in ['B', 'B'])),
'mixed': DataFrame(data),
}
def test_plain(self):
i_rec = self.encode_decode(self.frame)
for k in self.frame.keys():
assert_frame_equal(self.frame[k], i_rec[k])
def _test_compression(self, compress):
i_rec = self.encode_decode(self.frame, compress=compress)
for k in self.frame.keys():
value = i_rec[k]
expected = self.frame[k]
assert_frame_equal(value, expected)
# make sure that we can write to the new frames
for block in value._data.blocks:
assert block.values.flags.writeable
def test_compression_zlib(self):
if not _ZLIB_INSTALLED:
pytest.skip('no zlib')
self._test_compression('zlib')
def test_compression_blosc(self):
if not _BLOSC_INSTALLED:
pytest.skip('no blosc')
self._test_compression('blosc')
def _test_compression_warns_when_decompress_caches(self, compress):
not_garbage = []
control = [] # copied data
compress_module = globals()[compress]
real_decompress = compress_module.decompress
def decompress(ob):
"""mock decompress function that delegates to the real
decompress but caches the result and a copy of the result.
"""
res = real_decompress(ob)
not_garbage.append(res) # hold a reference to this bytes object
control.append(bytearray(res)) # copy the data here to check later
return res
# types mapped to values to add in place.
rhs = {
np.dtype('float64'): 1.0,
np.dtype('int32'): 1,
np.dtype('object'): 'a',
np.dtype('datetime64[ns]'): np.timedelta64(1, 'ns'),
np.dtype('timedelta64[ns]'): np.timedelta64(1, 'ns'),
}
with patch(compress_module, 'decompress', decompress), \
tm.assert_produces_warning(PerformanceWarning) as ws:
i_rec = self.encode_decode(self.frame, compress=compress)
for k in self.frame.keys():
value = i_rec[k]
expected = self.frame[k]
assert_frame_equal(value, expected)
# make sure that we can write to the new frames even though
# we needed to copy the data
for block in value._data.blocks:
assert block.values.flags.writeable
# mutate the data in some way
block.values[0] += rhs[block.dtype]
for w in ws:
# check the messages from our warnings
assert str(w.message) == ('copying data after decompressing; '
'this may mean that decompress is '
'caching its result')
for buf, control_buf in zip(not_garbage, control):
# make sure none of our mutations above affected the
# original buffers
assert buf == control_buf
def test_compression_warns_when_decompress_caches_zlib(self):
if not _ZLIB_INSTALLED:
pytest.skip('no zlib')
self._test_compression_warns_when_decompress_caches('zlib')
def test_compression_warns_when_decompress_caches_blosc(self):
if not _BLOSC_INSTALLED:
pytest.skip('no blosc')
self._test_compression_warns_when_decompress_caches('blosc')
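The helper above patches `decompress` so every result is retained alongside a private copy, then checks that mutating the unpacked frames never leaked back into those originals. The core of that bookkeeping, as a stdlib-only sketch (`real_decompress` here is a stand-in function, not zlib or blosc):

```python
# Stdlib-only sketch of the record-and-compare pattern used above: wrap a
# function so each result is kept alive along with an independent copy, then
# verify later mutations did not corrupt the recorded originals.
not_garbage = []
control = []

def real_decompress(ob):
    return bytes(ob)  # stand-in for zlib/blosc decompress

def recording_decompress(ob):
    res = real_decompress(ob)
    not_garbage.append(res)          # hold a reference to the result
    control.append(bytearray(res))   # independent copy for later comparison
    return res

out = bytearray(recording_decompress(b"abc"))  # consumer copies before mutating
out[0] = ord(b"z")

# the recorded buffer and its control copy must still agree
assert all(bytes(b) == bytes(c) for b, c in zip(not_garbage, control))
```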
def _test_small_strings_no_warn(self, compress):
empty = np.array([], dtype='uint8')
with tm.assert_produces_warning(None):
empty_unpacked = self.encode_decode(empty, compress=compress)
tm.assert_numpy_array_equal(empty_unpacked, empty)
assert empty_unpacked.flags.writeable
char = np.array([ord(b'a')], dtype='uint8')
with tm.assert_produces_warning(None):
char_unpacked = self.encode_decode(char, compress=compress)
tm.assert_numpy_array_equal(char_unpacked, char)
assert char_unpacked.flags.writeable
# if this test fails I am sorry because the interpreter is now in a
# bad state where b'a' points to 98 == ord(b'b').
char_unpacked[0] = ord(b'b')
# we compare the ord of bytes b'a' with unicode u'a' because they should
# always be the same (unless we were able to mutate the shared
# character singleton, in which case ord(b'a') == ord(b'b')).
assert ord(b'a') == ord(u'a')
tm.assert_numpy_array_equal(
char_unpacked,
np.array([ord(b'b')], dtype='uint8'),
)
def test_small_strings_no_warn_zlib(self):
if not _ZLIB_INSTALLED:
pytest.skip('no zlib')
self._test_small_strings_no_warn('zlib')
def test_small_strings_no_warn_blosc(self):
if not _BLOSC_INSTALLED:
pytest.skip('no blosc')
self._test_small_strings_no_warn('blosc')
def test_readonly_axis_blosc(self):
# GH11880
if not _BLOSC_INSTALLED:
pytest.skip('no blosc')
df1 = DataFrame({'A': list('abcd')})
df2 = DataFrame(df1, index=[1., 2., 3., 4.])
assert 1 in self.encode_decode(df1['A'], compress='blosc')
assert 1. in self.encode_decode(df2['A'], compress='blosc')
def test_readonly_axis_zlib(self):
# GH11880
df1 = DataFrame({'A': list('abcd')})
df2 = DataFrame(df1, index=[1., 2., 3., 4.])
assert 1 in self.encode_decode(df1['A'], compress='zlib')
assert 1. in self.encode_decode(df2['A'], compress='zlib')
def test_readonly_axis_blosc_to_sql(self):
# GH11880
if not _BLOSC_INSTALLED:
pytest.skip('no blosc')
if not self._SQLALCHEMY_INSTALLED:
pytest.skip('no sqlalchemy')
expected = DataFrame({'A': list('abcd')})
df = self.encode_decode(expected, compress='blosc')
eng = self._create_sql_engine("sqlite:///:memory:")
df.to_sql('test', eng, if_exists='append')
result = pandas.read_sql_table('test', eng, index_col='index')
result.index.names = [None]
assert_frame_equal(expected, result)
def test_readonly_axis_zlib_to_sql(self):
# GH11880
if not _ZLIB_INSTALLED:
pytest.skip('no zlib')
if not self._SQLALCHEMY_INSTALLED:
pytest.skip('no sqlalchemy')
expected = DataFrame({'A': list('abcd')})
df = self.encode_decode(expected, compress='zlib')
eng = self._create_sql_engine("sqlite:///:memory:")
df.to_sql('test', eng, if_exists='append')
result = pandas.read_sql_table('test', eng, index_col='index')
result.index.names = [None]
assert_frame_equal(expected, result)
class TestEncoding(TestPackers):
def setup_method(self, method):
super(TestEncoding, self).setup_method(method)
data = {
'A': [compat.u('\u2019')] * 1000,
'B': np.arange(1000, dtype=np.int32),
'C': list(100 * 'abcdefghij'),
'D': date_range(datetime.datetime(2015, 4, 1), periods=1000),
'E': [datetime.timedelta(days=x) for x in range(1000)],
'G': [400] * 1000
}
self.frame = {
'float': DataFrame(dict((k, data[k]) for k in ['A', 'A'])),
'int': DataFrame(dict((k, data[k]) for k in ['B', 'B'])),
'mixed': DataFrame(data),
}
self.utf_encodings = ['utf8', 'utf16', 'utf32']
def test_utf(self):
# GH10581
for encoding in self.utf_encodings:
for frame in compat.itervalues(self.frame):
result = self.encode_decode(frame, encoding=encoding)
assert_frame_equal(result, frame)
def test_default_encoding(self):
for frame in compat.itervalues(self.frame):
result = frame.to_msgpack()
expected = frame.to_msgpack(encoding='utf8')
assert result == expected
result = self.encode_decode(frame)
assert_frame_equal(result, frame)
def legacy_packers_versions():
# yield the packers versions
path = tm.get_data_path('legacy_msgpack')
for v in os.listdir(path):
p = os.path.join(path, v)
if os.path.isdir(p):
yield v
class TestMsgpack(object):
"""
How to add msgpack tests:
1. Install pandas version intended to output the msgpack.
2. Execute "generate_legacy_storage_files.py" to create the msgpack.
$ python generate_legacy_storage_files.py <output_dir> msgpack
    3. Move the created msgpack file to "data/legacy_msgpack/<version>" directory.
"""
minimum_structure = {'series': ['float', 'int', 'mixed',
'ts', 'mi', 'dup'],
'frame': ['float', 'int', 'mixed', 'mi'],
'panel': ['float'],
'index': ['int', 'date', 'period'],
'mi': ['reg2']}
def check_min_structure(self, data, version):
for typ, v in self.minimum_structure.items():
assert typ in data, '"{0}" not found in unpacked data'.format(typ)
for kind in v:
msg = '"{0}" not found in data["{1}"]'.format(kind, typ)
assert kind in data[typ], msg
def compare(self, current_data, all_data, vf, version):
# GH12277 encoding default used to be latin-1, now utf-8
if LooseVersion(version) < '0.18.0':
data = read_msgpack(vf, encoding='latin-1')
else:
data = read_msgpack(vf)
self.check_min_structure(data, version)
for typ, dv in data.items():
assert typ in all_data, ('unpacked data contains '
'extra key "{0}"'
.format(typ))
for dt, result in dv.items():
assert dt in current_data[typ], ('data["{0}"] contains extra '
'key "{1}"'.format(typ, dt))
try:
expected = current_data[typ][dt]
except KeyError:
continue
# use a specific comparator
# if available
comp_method = "compare_{typ}_{dt}".format(typ=typ, dt=dt)
comparator = getattr(self, comp_method, None)
if comparator is not None:
comparator(result, expected, typ, version)
else:
check_arbitrary(result, expected)
return data
def compare_series_dt_tz(self, result, expected, typ, version):
# 8260
# dtype is object < 0.17.0
if LooseVersion(version) < '0.17.0':
expected = expected.astype(object)
tm.assert_series_equal(result, expected)
else:
tm.assert_series_equal(result, expected)
def compare_frame_dt_mixed_tzs(self, result, expected, typ, version):
# 8260
# dtype is object < 0.17.0
if LooseVersion(version) < '0.17.0':
expected = expected.astype(object)
tm.assert_frame_equal(result, expected)
else:
tm.assert_frame_equal(result, expected)
@pytest.mark.parametrize('version', legacy_packers_versions())
def test_msgpacks_legacy(self, current_packers_data, all_packers_data,
version):
pth = tm.get_data_path('legacy_msgpack/{0}'.format(version))
n = 0
for f in os.listdir(pth):
# GH12142 0.17 files packed in P2 can't be read in P3
if (compat.PY3 and version.startswith('0.17.') and
f.split('.')[-4][-1] == '2'):
continue
vf = os.path.join(pth, f)
try:
with catch_warnings(record=True):
self.compare(current_packers_data, all_packers_data,
vf, version)
except ImportError:
# blosc not installed
continue
n += 1
assert n > 0, 'Msgpack files are not tested'
| mit |
jeffery-do/Vizdoombot | doom/lib/python3.5/site-packages/matplotlib/_cm.py | 4 | 93997 | """
Nothing here but dictionaries for generating LinearSegmentedColormaps,
and a dictionary of these dictionaries.
Documentation for each is in pyplot.colormaps()
"""
from __future__ import (absolute_import, division, print_function,
unicode_literals)
import numpy as np
_binary_data = {
'red': ((0., 1., 1.), (1., 0., 0.)),
'green': ((0., 1., 1.), (1., 0., 0.)),
'blue': ((0., 1., 1.), (1., 0., 0.))
}
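Each dictionary in this module maps a channel name to a sequence of (x, y0, y1) control points; between two adjacent points, LinearSegmentedColormap interpolates linearly from the first point's y1 to the next point's y0. A self-contained sketch of that evaluation for one channel (`eval_channel` is a hypothetical helper, not matplotlib API):

```python
# Sketch of how an (x, y0, y1) segment-data channel is evaluated; this mirrors
# the interpolation LinearSegmentedColormap performs between control points.
def eval_channel(segments, x):
    for (x0, _, y1), (x1, y0_next, _) in zip(segments, segments[1:]):
        if x0 <= x <= x1:
            if x1 == x0:
                return y0_next
            t = (x - x0) / (x1 - x0)
            return y1 + t * (y0_next - y1)
    raise ValueError("x outside the segment range [0, 1]")

binary_red = ((0., 1., 1.), (1., 0., 0.))  # the 'red' channel of _binary_data
assert eval_channel(binary_red, 0.0) == 1.0
assert eval_channel(binary_red, 1.0) == 0.0
assert abs(eval_channel(binary_red, 0.25) - 0.75) < 1e-12
```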
_autumn_data = {'red': ((0., 1.0, 1.0), (1.0, 1.0, 1.0)),
'green': ((0., 0., 0.), (1.0, 1.0, 1.0)),
'blue': ((0., 0., 0.), (1.0, 0., 0.))}
_bone_data = {'red': ((0., 0., 0.),
(0.746032, 0.652778, 0.652778),
(1.0, 1.0, 1.0)),
'green': ((0., 0., 0.),
(0.365079, 0.319444, 0.319444),
(0.746032, 0.777778, 0.777778),
(1.0, 1.0, 1.0)),
'blue': ((0., 0., 0.),
(0.365079, 0.444444, 0.444444),
(1.0, 1.0, 1.0))}
_cool_data = {'red': ((0., 0., 0.), (1.0, 1.0, 1.0)),
'green': ((0., 1., 1.), (1.0, 0., 0.)),
'blue': ((0., 1., 1.), (1.0, 1., 1.))}
_copper_data = {'red': ((0., 0., 0.),
(0.809524, 1.000000, 1.000000),
(1.0, 1.0, 1.0)),
'green': ((0., 0., 0.),
(1.0, 0.7812, 0.7812)),
'blue': ((0., 0., 0.),
(1.0, 0.4975, 0.4975))}
_flag_data = {
'red': lambda x: 0.75 * np.sin((x * 31.5 + 0.25) * np.pi) + 0.5,
'green': lambda x: np.sin(x * 31.5 * np.pi),
'blue': lambda x: 0.75 * np.sin((x * 31.5 - 0.25) * np.pi) + 0.5,
}
_prism_data = {
'red': lambda x: 0.75 * np.sin((x * 20.9 + 0.25) * np.pi) + 0.67,
'green': lambda x: 0.75 * np.sin((x * 20.9 - 0.25) * np.pi) + 0.33,
'blue': lambda x: -1.1 * np.sin((x * 20.9) * np.pi),
}
def cubehelix(gamma=1.0, s=0.5, r=-1.5, h=1.0):
"""Return custom data dictionary of (r,g,b) conversion functions, which
can be used with :func:`register_cmap`, for the cubehelix color scheme.
    Unlike most other color schemes, cubehelix was designed by D.A. Green to
be monotonically increasing in terms of perceived brightness.
Also, when printed on a black and white postscript printer, the scheme
results in a greyscale with monotonically increasing brightness.
This color scheme is named cubehelix because the r,g,b values produced
can be visualised as a squashed helix around the diagonal in the
r,g,b color cube.
For a unit color cube (i.e. 3-D coordinates for r,g,b each in the
range 0 to 1) the color scheme starts at (r,g,b) = (0,0,0), i.e. black,
and finishes at (r,g,b) = (1,1,1), i.e. white. For some fraction *x*,
between 0 and 1, the color is the corresponding grey value at that
fraction along the black to white diagonal (x,x,x) plus a color
element. This color element is calculated in a plane of constant
perceived intensity and controlled by the following parameters.
Optional keyword arguments:
========= =======================================================
Keyword Description
========= =======================================================
gamma gamma factor to emphasise either low intensity values
(gamma < 1), or high intensity values (gamma > 1);
defaults to 1.0.
s the start color; defaults to 0.5 (i.e. purple).
r the number of r,g,b rotations in color that are made
from the start to the end of the color scheme; defaults
to -1.5 (i.e. -> B -> G -> R -> B).
h the hue parameter which controls how saturated the
colors are. If this parameter is zero then the color
scheme is purely a greyscale; defaults to 1.0.
========= =======================================================
"""
def get_color_function(p0, p1):
def color(x):
# Apply gamma factor to emphasise low or high intensity values
xg = x ** gamma
# Calculate amplitude and angle of deviation from the black
# to white diagonal in the plane of constant
# perceived intensity.
a = h * xg * (1 - xg) / 2
phi = 2 * np.pi * (s / 3 + r * x)
return xg + a * (p0 * np.cos(phi) + p1 * np.sin(phi))
return color
return {
'red': get_color_function(-0.14861, 1.78277),
'green': get_color_function(-0.29227, -0.90649),
'blue': get_color_function(1.97294, 0.0),
}
_cubehelix_data = cubehelix()
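As the docstring notes, the scheme runs from black at x = 0 to white at x = 1: the amplitude term h·x^γ·(1 − x^γ)/2 vanishes at both ends, leaving the pure grey value x^γ. A self-contained check of that endpoint behaviour, re-deriving one channel from the formula above (stdlib math only; `cubehelix_channel` is an illustrative name):

```python
import math

# Re-derivation of a single cubehelix channel with the default parameters
# (gamma=1.0, s=0.5, r=-1.5, h=1.0), confirming black and white endpoints.
def cubehelix_channel(x, p0, p1, gamma=1.0, s=0.5, r=-1.5, h=1.0):
    xg = x ** gamma
    a = h * xg * (1 - xg) / 2          # deviation amplitude: 0 at x=0 and x=1
    phi = 2 * math.pi * (s / 3 + r * x)  # rotation angle around the diagonal
    return xg + a * (p0 * math.cos(phi) + p1 * math.sin(phi))

# the three (p0, p1) pairs used for red, green, blue above
for p0, p1 in [(-0.14861, 1.78277), (-0.29227, -0.90649), (1.97294, 0.0)]:
    assert cubehelix_channel(0.0, p0, p1) == 0.0
    assert abs(cubehelix_channel(1.0, p0, p1) - 1.0) < 1e-12
```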
_bwr_data = ((0.0, 0.0, 1.0), (1.0, 1.0, 1.0), (1.0, 0.0, 0.0))
_brg_data = ((0.0, 0.0, 1.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
# Gnuplot palette functions
gfunc = {
0: lambda x: 0,
1: lambda x: 0.5,
2: lambda x: 1,
3: lambda x: x,
4: lambda x: x ** 2,
5: lambda x: x ** 3,
6: lambda x: x ** 4,
7: lambda x: np.sqrt(x),
8: lambda x: np.sqrt(np.sqrt(x)),
9: lambda x: np.sin(x * np.pi / 2),
10: lambda x: np.cos(x * np.pi / 2),
11: lambda x: np.abs(x - 0.5),
12: lambda x: (2 * x - 1) ** 2,
13: lambda x: np.sin(x * np.pi),
14: lambda x: np.abs(np.cos(x * np.pi)),
15: lambda x: np.sin(x * 2 * np.pi),
16: lambda x: np.cos(x * 2 * np.pi),
17: lambda x: np.abs(np.sin(x * 2 * np.pi)),
18: lambda x: np.abs(np.cos(x * 2 * np.pi)),
19: lambda x: np.abs(np.sin(x * 4 * np.pi)),
20: lambda x: np.abs(np.cos(x * 4 * np.pi)),
21: lambda x: 3 * x,
22: lambda x: 3 * x - 1,
23: lambda x: 3 * x - 2,
24: lambda x: np.abs(3 * x - 1),
25: lambda x: np.abs(3 * x - 2),
26: lambda x: (3 * x - 1) / 2,
27: lambda x: (3 * x - 2) / 2,
28: lambda x: np.abs((3 * x - 1) / 2),
29: lambda x: np.abs((3 * x - 2) / 2),
30: lambda x: x / 0.32 - 0.78125,
31: lambda x: 2 * x - 0.84,
32: lambda x: gfunc32(x),
33: lambda x: np.abs(2 * x - 0.5),
34: lambda x: 2 * x,
35: lambda x: 2 * x - 0.5,
36: lambda x: 2 * x - 1.
}
def gfunc32(x):
ret = np.zeros(len(x))
m = (x < 0.25)
ret[m] = 4 * x[m]
m = (x >= 0.25) & (x < 0.92)
ret[m] = -2 * x[m] + 1.84
m = (x >= 0.92)
ret[m] = x[m] / 0.08 - 11.5
return ret
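gfunc32 is the only palette function written for array input (it uses boolean masks); the other gfunc entries are simple scalar lambdas. A pointwise re-statement with spot checks (stdlib only; `gfunc32_scalar` is an illustrative name):

```python
# Scalar re-statement of gfunc32 above (the masked/array version is
# equivalent pointwise), with spot checks at the breakpoints.
def gfunc32_scalar(x):
    if x < 0.25:
        return 4 * x
    if x < 0.92:
        return -2 * x + 1.84
    return x / 0.08 - 11.5

assert gfunc32_scalar(0.0) == 0.0
assert gfunc32_scalar(0.25) == -2 * 0.25 + 1.84   # second branch applies here
assert abs(gfunc32_scalar(0.92)) < 1e-12          # second and third branches meet at 0
assert abs(gfunc32_scalar(1.0) - 1.0) < 1e-9
```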
_gnuplot_data = {
'red': gfunc[7],
'green': gfunc[5],
'blue': gfunc[15],
}
_gnuplot2_data = {
'red': gfunc[30],
'green': gfunc[31],
'blue': gfunc[32],
}
_ocean_data = {
'red': gfunc[23],
'green': gfunc[28],
'blue': gfunc[3],
}
_afmhot_data = {
'red': gfunc[34],
'green': gfunc[35],
'blue': gfunc[36],
}
_rainbow_data = {
'red': gfunc[33],
'green': gfunc[13],
'blue': gfunc[10],
}
_seismic_data = (
(0.0, 0.0, 0.3), (0.0, 0.0, 1.0),
(1.0, 1.0, 1.0), (1.0, 0.0, 0.0),
(0.5, 0.0, 0.0))
_terrain_data = (
(0.00, (0.2, 0.2, 0.6)),
(0.15, (0.0, 0.6, 1.0)),
(0.25, (0.0, 0.8, 0.4)),
(0.50, (1.0, 1.0, 0.6)),
(0.75, (0.5, 0.36, 0.33)),
(1.00, (1.0, 1.0, 1.0)))
_gray_data = {'red': ((0., 0, 0), (1., 1, 1)),
'green': ((0., 0, 0), (1., 1, 1)),
'blue': ((0., 0, 0), (1., 1, 1))}
_hot_data = {'red': ((0., 0.0416, 0.0416),
(0.365079, 1.000000, 1.000000),
(1.0, 1.0, 1.0)),
'green': ((0., 0., 0.),
(0.365079, 0.000000, 0.000000),
(0.746032, 1.000000, 1.000000),
(1.0, 1.0, 1.0)),
'blue': ((0., 0., 0.),
(0.746032, 0.000000, 0.000000),
(1.0, 1.0, 1.0))}
_hsv_data = {'red': ((0., 1., 1.),
(0.158730, 1.000000, 1.000000),
(0.174603, 0.968750, 0.968750),
(0.333333, 0.031250, 0.031250),
(0.349206, 0.000000, 0.000000),
(0.666667, 0.000000, 0.000000),
(0.682540, 0.031250, 0.031250),
(0.841270, 0.968750, 0.968750),
(0.857143, 1.000000, 1.000000),
(1.0, 1.0, 1.0)),
'green': ((0., 0., 0.),
(0.158730, 0.937500, 0.937500),
(0.174603, 1.000000, 1.000000),
(0.507937, 1.000000, 1.000000),
(0.666667, 0.062500, 0.062500),
(0.682540, 0.000000, 0.000000),
(1.0, 0., 0.)),
'blue': ((0., 0., 0.),
(0.333333, 0.000000, 0.000000),
(0.349206, 0.062500, 0.062500),
(0.507937, 1.000000, 1.000000),
(0.841270, 1.000000, 1.000000),
(0.857143, 0.937500, 0.937500),
(1.0, 0.09375, 0.09375))}
_jet_data = {'red': ((0., 0, 0), (0.35, 0, 0), (0.66, 1, 1), (0.89, 1, 1),
(1, 0.5, 0.5)),
'green': ((0., 0, 0), (0.125, 0, 0), (0.375, 1, 1), (0.64, 1, 1),
(0.91, 0, 0), (1, 0, 0)),
'blue': ((0., 0.5, 0.5), (0.11, 1, 1), (0.34, 1, 1),
(0.65, 0, 0), (1, 0, 0))}
_pink_data = {'red': ((0., 0.1178, 0.1178), (0.015873, 0.195857, 0.195857),
(0.031746, 0.250661, 0.250661),
(0.047619, 0.295468, 0.295468),
(0.063492, 0.334324, 0.334324),
(0.079365, 0.369112, 0.369112),
(0.095238, 0.400892, 0.400892),
(0.111111, 0.430331, 0.430331),
(0.126984, 0.457882, 0.457882),
(0.142857, 0.483867, 0.483867),
(0.158730, 0.508525, 0.508525),
(0.174603, 0.532042, 0.532042),
(0.190476, 0.554563, 0.554563),
(0.206349, 0.576204, 0.576204),
(0.222222, 0.597061, 0.597061),
(0.238095, 0.617213, 0.617213),
(0.253968, 0.636729, 0.636729),
(0.269841, 0.655663, 0.655663),
(0.285714, 0.674066, 0.674066),
(0.301587, 0.691980, 0.691980),
(0.317460, 0.709441, 0.709441),
(0.333333, 0.726483, 0.726483),
(0.349206, 0.743134, 0.743134),
(0.365079, 0.759421, 0.759421),
(0.380952, 0.766356, 0.766356),
(0.396825, 0.773229, 0.773229),
(0.412698, 0.780042, 0.780042),
(0.428571, 0.786796, 0.786796),
(0.444444, 0.793492, 0.793492),
(0.460317, 0.800132, 0.800132),
(0.476190, 0.806718, 0.806718),
(0.492063, 0.813250, 0.813250),
(0.507937, 0.819730, 0.819730),
(0.523810, 0.826160, 0.826160),
(0.539683, 0.832539, 0.832539),
(0.555556, 0.838870, 0.838870),
(0.571429, 0.845154, 0.845154),
(0.587302, 0.851392, 0.851392),
(0.603175, 0.857584, 0.857584),
(0.619048, 0.863731, 0.863731),
(0.634921, 0.869835, 0.869835),
(0.650794, 0.875897, 0.875897),
(0.666667, 0.881917, 0.881917),
(0.682540, 0.887896, 0.887896),
(0.698413, 0.893835, 0.893835),
(0.714286, 0.899735, 0.899735),
(0.730159, 0.905597, 0.905597),
(0.746032, 0.911421, 0.911421),
(0.761905, 0.917208, 0.917208),
(0.777778, 0.922958, 0.922958),
(0.793651, 0.928673, 0.928673),
(0.809524, 0.934353, 0.934353),
(0.825397, 0.939999, 0.939999),
(0.841270, 0.945611, 0.945611),
(0.857143, 0.951190, 0.951190),
(0.873016, 0.956736, 0.956736),
(0.888889, 0.962250, 0.962250),
(0.904762, 0.967733, 0.967733),
(0.920635, 0.973185, 0.973185),
(0.936508, 0.978607, 0.978607),
(0.952381, 0.983999, 0.983999),
(0.968254, 0.989361, 0.989361),
(0.984127, 0.994695, 0.994695), (1.0, 1.0, 1.0)),
'green': ((0., 0., 0.), (0.015873, 0.102869, 0.102869),
(0.031746, 0.145479, 0.145479),
(0.047619, 0.178174, 0.178174),
(0.063492, 0.205738, 0.205738),
(0.079365, 0.230022, 0.230022),
(0.095238, 0.251976, 0.251976),
(0.111111, 0.272166, 0.272166),
(0.126984, 0.290957, 0.290957),
(0.142857, 0.308607, 0.308607),
(0.158730, 0.325300, 0.325300),
(0.174603, 0.341178, 0.341178),
(0.190476, 0.356348, 0.356348),
(0.206349, 0.370899, 0.370899),
(0.222222, 0.384900, 0.384900),
(0.238095, 0.398410, 0.398410),
(0.253968, 0.411476, 0.411476),
(0.269841, 0.424139, 0.424139),
(0.285714, 0.436436, 0.436436),
(0.301587, 0.448395, 0.448395),
(0.317460, 0.460044, 0.460044),
(0.333333, 0.471405, 0.471405),
(0.349206, 0.482498, 0.482498),
(0.365079, 0.493342, 0.493342),
(0.380952, 0.517549, 0.517549),
(0.396825, 0.540674, 0.540674),
(0.412698, 0.562849, 0.562849),
(0.428571, 0.584183, 0.584183),
(0.444444, 0.604765, 0.604765),
(0.460317, 0.624669, 0.624669),
(0.476190, 0.643958, 0.643958),
(0.492063, 0.662687, 0.662687),
(0.507937, 0.680900, 0.680900),
(0.523810, 0.698638, 0.698638),
(0.539683, 0.715937, 0.715937),
(0.555556, 0.732828, 0.732828),
(0.571429, 0.749338, 0.749338),
(0.587302, 0.765493, 0.765493),
(0.603175, 0.781313, 0.781313),
(0.619048, 0.796819, 0.796819),
(0.634921, 0.812029, 0.812029),
(0.650794, 0.826960, 0.826960),
(0.666667, 0.841625, 0.841625),
(0.682540, 0.856040, 0.856040),
(0.698413, 0.870216, 0.870216),
(0.714286, 0.884164, 0.884164),
(0.730159, 0.897896, 0.897896),
(0.746032, 0.911421, 0.911421),
(0.761905, 0.917208, 0.917208),
(0.777778, 0.922958, 0.922958),
(0.793651, 0.928673, 0.928673),
(0.809524, 0.934353, 0.934353),
(0.825397, 0.939999, 0.939999),
(0.841270, 0.945611, 0.945611),
(0.857143, 0.951190, 0.951190),
(0.873016, 0.956736, 0.956736),
(0.888889, 0.962250, 0.962250),
(0.904762, 0.967733, 0.967733),
(0.920635, 0.973185, 0.973185),
(0.936508, 0.978607, 0.978607),
(0.952381, 0.983999, 0.983999),
(0.968254, 0.989361, 0.989361),
(0.984127, 0.994695, 0.994695), (1.0, 1.0, 1.0)),
'blue': ((0., 0., 0.), (0.015873, 0.102869, 0.102869),
(0.031746, 0.145479, 0.145479),
(0.047619, 0.178174, 0.178174),
(0.063492, 0.205738, 0.205738),
(0.079365, 0.230022, 0.230022),
(0.095238, 0.251976, 0.251976),
(0.111111, 0.272166, 0.272166),
(0.126984, 0.290957, 0.290957),
(0.142857, 0.308607, 0.308607),
(0.158730, 0.325300, 0.325300),
(0.174603, 0.341178, 0.341178),
(0.190476, 0.356348, 0.356348),
(0.206349, 0.370899, 0.370899),
(0.222222, 0.384900, 0.384900),
(0.238095, 0.398410, 0.398410),
(0.253968, 0.411476, 0.411476),
(0.269841, 0.424139, 0.424139),
(0.285714, 0.436436, 0.436436),
(0.301587, 0.448395, 0.448395),
(0.317460, 0.460044, 0.460044),
(0.333333, 0.471405, 0.471405),
(0.349206, 0.482498, 0.482498),
(0.365079, 0.493342, 0.493342),
(0.380952, 0.503953, 0.503953),
(0.396825, 0.514344, 0.514344),
(0.412698, 0.524531, 0.524531),
(0.428571, 0.534522, 0.534522),
(0.444444, 0.544331, 0.544331),
(0.460317, 0.553966, 0.553966),
(0.476190, 0.563436, 0.563436),
(0.492063, 0.572750, 0.572750),
(0.507937, 0.581914, 0.581914),
(0.523810, 0.590937, 0.590937),
(0.539683, 0.599824, 0.599824),
(0.555556, 0.608581, 0.608581),
(0.571429, 0.617213, 0.617213),
(0.587302, 0.625727, 0.625727),
(0.603175, 0.634126, 0.634126),
(0.619048, 0.642416, 0.642416),
(0.634921, 0.650600, 0.650600),
(0.650794, 0.658682, 0.658682),
(0.666667, 0.666667, 0.666667),
(0.682540, 0.674556, 0.674556),
(0.698413, 0.682355, 0.682355),
(0.714286, 0.690066, 0.690066),
(0.730159, 0.697691, 0.697691),
(0.746032, 0.705234, 0.705234),
(0.761905, 0.727166, 0.727166),
(0.777778, 0.748455, 0.748455),
(0.793651, 0.769156, 0.769156),
(0.809524, 0.789314, 0.789314),
(0.825397, 0.808969, 0.808969),
(0.841270, 0.828159, 0.828159),
(0.857143, 0.846913, 0.846913),
(0.873016, 0.865261, 0.865261),
(0.888889, 0.883229, 0.883229),
(0.904762, 0.900837, 0.900837),
(0.920635, 0.918109, 0.918109),
(0.936508, 0.935061, 0.935061),
(0.952381, 0.951711, 0.951711),
(0.968254, 0.968075, 0.968075),
(0.984127, 0.984167, 0.984167), (1.0, 1.0, 1.0))}
_spring_data = {'red': ((0., 1., 1.), (1.0, 1.0, 1.0)),
'green': ((0., 0., 0.), (1.0, 1.0, 1.0)),
'blue': ((0., 1., 1.), (1.0, 0.0, 0.0))}
_summer_data = {'red': ((0., 0., 0.), (1.0, 1.0, 1.0)),
'green': ((0., 0.5, 0.5), (1.0, 1.0, 1.0)),
'blue': ((0., 0.4, 0.4), (1.0, 0.4, 0.4))}
_winter_data = {'red': ((0., 0., 0.), (1.0, 0.0, 0.0)),
'green': ((0., 0., 0.), (1.0, 1.0, 1.0)),
'blue': ((0., 1., 1.), (1.0, 0.5, 0.5))}
_nipy_spectral_data = {
'red': [(0.0, 0.0, 0.0), (0.05, 0.4667, 0.4667),
(0.10, 0.5333, 0.5333), (0.15, 0.0, 0.0),
(0.20, 0.0, 0.0), (0.25, 0.0, 0.0),
(0.30, 0.0, 0.0), (0.35, 0.0, 0.0),
(0.40, 0.0, 0.0), (0.45, 0.0, 0.0),
(0.50, 0.0, 0.0), (0.55, 0.0, 0.0),
(0.60, 0.0, 0.0), (0.65, 0.7333, 0.7333),
(0.70, 0.9333, 0.9333), (0.75, 1.0, 1.0),
(0.80, 1.0, 1.0), (0.85, 1.0, 1.0),
(0.90, 0.8667, 0.8667), (0.95, 0.80, 0.80),
(1.0, 0.80, 0.80)],
'green': [(0.0, 0.0, 0.0), (0.05, 0.0, 0.0),
(0.10, 0.0, 0.0), (0.15, 0.0, 0.0),
(0.20, 0.0, 0.0), (0.25, 0.4667, 0.4667),
(0.30, 0.6000, 0.6000), (0.35, 0.6667, 0.6667),
(0.40, 0.6667, 0.6667), (0.45, 0.6000, 0.6000),
(0.50, 0.7333, 0.7333), (0.55, 0.8667, 0.8667),
(0.60, 1.0, 1.0), (0.65, 1.0, 1.0),
(0.70, 0.9333, 0.9333), (0.75, 0.8000, 0.8000),
(0.80, 0.6000, 0.6000), (0.85, 0.0, 0.0),
(0.90, 0.0, 0.0), (0.95, 0.0, 0.0),
(1.0, 0.80, 0.80)],
'blue': [(0.0, 0.0, 0.0), (0.05, 0.5333, 0.5333),
(0.10, 0.6000, 0.6000), (0.15, 0.6667, 0.6667),
(0.20, 0.8667, 0.8667), (0.25, 0.8667, 0.8667),
(0.30, 0.8667, 0.8667), (0.35, 0.6667, 0.6667),
(0.40, 0.5333, 0.5333), (0.45, 0.0, 0.0),
(0.5, 0.0, 0.0), (0.55, 0.0, 0.0),
(0.60, 0.0, 0.0), (0.65, 0.0, 0.0),
(0.70, 0.0, 0.0), (0.75, 0.0, 0.0),
(0.80, 0.0, 0.0), (0.85, 0.0, 0.0),
(0.90, 0.0, 0.0), (0.95, 0.0, 0.0),
(1.0, 0.80, 0.80)],
}
# 34 colormaps based on color specifications and designs
# developed by Cynthia Brewer (http://colorbrewer.org).
# The ColorBrewer palettes have been included under the terms
# of an Apache-style license (for details, see the file
# LICENSE_COLORBREWER in the license directory of the matplotlib
# source distribution).
_Accent_data = {'blue': [(0.0, 0.49803921580314636,
0.49803921580314636), (0.14285714285714285, 0.83137255907058716,
0.83137255907058716), (0.2857142857142857, 0.52549022436141968,
0.52549022436141968), (0.42857142857142855, 0.60000002384185791,
0.60000002384185791), (0.5714285714285714, 0.69019609689712524,
0.69019609689712524), (0.7142857142857143, 0.49803921580314636,
0.49803921580314636), (0.8571428571428571, 0.090196080505847931,
0.090196080505847931), (1.0, 0.40000000596046448,
0.40000000596046448)],
'green': [(0.0, 0.78823530673980713, 0.78823530673980713),
(0.14285714285714285, 0.68235296010971069, 0.68235296010971069),
(0.2857142857142857, 0.75294119119644165, 0.75294119119644165),
(0.42857142857142855, 1.0, 1.0), (0.5714285714285714,
0.42352941632270813, 0.42352941632270813), (0.7142857142857143,
0.0078431377187371254, 0.0078431377187371254),
(0.8571428571428571, 0.35686275362968445, 0.35686275362968445),
(1.0, 0.40000000596046448, 0.40000000596046448)],
'red': [(0.0, 0.49803921580314636, 0.49803921580314636),
(0.14285714285714285, 0.7450980544090271, 0.7450980544090271),
(0.2857142857142857, 0.99215686321258545, 0.99215686321258545),
(0.42857142857142855, 1.0, 1.0), (0.5714285714285714,
0.21960784494876862, 0.21960784494876862), (0.7142857142857143,
0.94117647409439087, 0.94117647409439087), (0.8571428571428571,
0.74901962280273438, 0.74901962280273438), (1.0,
0.40000000596046448, 0.40000000596046448)]}
_Blues_data = {'blue': [(0.0, 1.0, 1.0), (0.125, 0.9686274528503418,
0.9686274528503418), (0.25, 0.93725490570068359, 0.93725490570068359),
(0.375, 0.88235294818878174, 0.88235294818878174), (0.5,
0.83921569585800171, 0.83921569585800171), (0.625, 0.7764706015586853,
0.7764706015586853), (0.75, 0.70980393886566162, 0.70980393886566162),
(0.875, 0.61176472902297974, 0.61176472902297974), (1.0,
0.41960784792900085, 0.41960784792900085)],
'green': [(0.0, 0.9843137264251709, 0.9843137264251709), (0.125,
0.92156863212585449, 0.92156863212585449), (0.25,
0.85882353782653809, 0.85882353782653809), (0.375,
0.7921568751335144, 0.7921568751335144), (0.5,
0.68235296010971069, 0.68235296010971069), (0.625,
0.57254904508590698, 0.57254904508590698), (0.75,
0.44313725829124451, 0.44313725829124451), (0.875,
0.31764706969261169, 0.31764706969261169), (1.0,
0.18823529779911041, 0.18823529779911041)],
'red': [(0.0, 0.9686274528503418, 0.9686274528503418), (0.125,
0.87058824300765991, 0.87058824300765991), (0.25,
0.7764706015586853, 0.7764706015586853), (0.375,
0.61960786581039429, 0.61960786581039429), (0.5,
0.41960784792900085, 0.41960784792900085), (0.625,
0.25882354378700256, 0.25882354378700256), (0.75,
0.12941177189350128, 0.12941177189350128), (0.875,
0.031372550874948502, 0.031372550874948502), (1.0,
0.031372550874948502, 0.031372550874948502)]}
_BrBG_data = {'blue': [(0.0, 0.019607843831181526,
0.019607843831181526), (0.10000000000000001, 0.039215687662363052,
0.039215687662363052), (0.20000000000000001, 0.17647059261798859,
0.17647059261798859), (0.29999999999999999, 0.49019607901573181,
0.49019607901573181), (0.40000000000000002, 0.76470589637756348,
0.76470589637756348), (0.5, 0.96078431606292725, 0.96078431606292725),
(0.59999999999999998, 0.89803922176361084, 0.89803922176361084),
(0.69999999999999996, 0.75686275959014893, 0.75686275959014893),
(0.80000000000000004, 0.56078433990478516, 0.56078433990478516),
(0.90000000000000002, 0.36862745881080627, 0.36862745881080627), (1.0,
0.18823529779911041, 0.18823529779911041)],
'green': [(0.0, 0.18823529779911041, 0.18823529779911041),
(0.10000000000000001, 0.31764706969261169, 0.31764706969261169),
(0.20000000000000001, 0.5058823823928833, 0.5058823823928833),
(0.29999999999999999, 0.7607843279838562, 0.7607843279838562),
(0.40000000000000002, 0.90980392694473267, 0.90980392694473267),
(0.5, 0.96078431606292725, 0.96078431606292725),
(0.59999999999999998, 0.91764706373214722, 0.91764706373214722),
(0.69999999999999996, 0.80392158031463623, 0.80392158031463623),
(0.80000000000000004, 0.59215688705444336, 0.59215688705444336),
(0.90000000000000002, 0.40000000596046448, 0.40000000596046448),
(1.0, 0.23529411852359772, 0.23529411852359772)],
'red': [(0.0, 0.32941177487373352, 0.32941177487373352),
(0.10000000000000001, 0.54901963472366333, 0.54901963472366333),
(0.20000000000000001, 0.74901962280273438, 0.74901962280273438),
(0.29999999999999999, 0.87450981140136719, 0.87450981140136719),
(0.40000000000000002, 0.96470588445663452, 0.96470588445663452),
(0.5, 0.96078431606292725, 0.96078431606292725),
(0.59999999999999998, 0.78039216995239258, 0.78039216995239258),
(0.69999999999999996, 0.50196081399917603, 0.50196081399917603),
(0.80000000000000004, 0.20784313976764679, 0.20784313976764679),
(0.90000000000000002, 0.0039215688593685627,
0.0039215688593685627), (1.0, 0.0, 0.0)]}
_BuGn_data = {'blue': [(0.0, 0.99215686321258545,
0.99215686321258545), (0.125, 0.97647058963775635,
0.97647058963775635), (0.25, 0.90196079015731812,
0.90196079015731812), (0.375, 0.78823530673980713,
0.78823530673980713), (0.5, 0.64313727617263794, 0.64313727617263794),
(0.625, 0.46274510025978088, 0.46274510025978088), (0.75,
0.27058824896812439, 0.27058824896812439), (0.875,
0.17254902422428131, 0.17254902422428131), (1.0, 0.10588235408067703,
0.10588235408067703)],
'green': [(0.0, 0.98823529481887817, 0.98823529481887817), (0.125,
0.96078431606292725, 0.96078431606292725), (0.25,
0.92549020051956177, 0.92549020051956177), (0.375,
0.84705883264541626, 0.84705883264541626), (0.5,
0.7607843279838562, 0.7607843279838562), (0.625,
0.68235296010971069, 0.68235296010971069), (0.75,
0.54509806632995605, 0.54509806632995605), (0.875,
0.42745098471641541, 0.42745098471641541), (1.0,
0.26666668057441711, 0.26666668057441711)], 'red': [(0.0,
0.9686274528503418, 0.9686274528503418), (0.125,
0.89803922176361084, 0.89803922176361084), (0.25,
0.80000001192092896, 0.80000001192092896), (0.375,
0.60000002384185791, 0.60000002384185791), (0.5,
0.40000000596046448, 0.40000000596046448), (0.625,
0.25490197539329529, 0.25490197539329529), (0.75,
0.13725490868091583, 0.13725490868091583), (0.875, 0.0, 0.0),
(1.0, 0.0, 0.0)]}
_BuPu_data = {'blue': [(0.0, 0.99215686321258545,
0.99215686321258545), (0.125, 0.95686274766921997,
0.95686274766921997), (0.25, 0.90196079015731812,
0.90196079015731812), (0.375, 0.85490196943283081,
0.85490196943283081), (0.5, 0.7764706015586853, 0.7764706015586853),
(0.625, 0.69411766529083252, 0.69411766529083252), (0.75,
0.61568629741668701, 0.61568629741668701), (0.875,
0.48627451062202454, 0.48627451062202454), (1.0, 0.29411765933036804,
0.29411765933036804)],
'green': [(0.0, 0.98823529481887817, 0.98823529481887817), (0.125,
0.92549020051956177, 0.92549020051956177), (0.25,
0.82745099067687988, 0.82745099067687988), (0.375,
0.73725491762161255, 0.73725491762161255), (0.5,
0.58823531866073608, 0.58823531866073608), (0.625,
0.41960784792900085, 0.41960784792900085), (0.75,
0.25490197539329529, 0.25490197539329529), (0.875,
0.058823529630899429, 0.058823529630899429), (1.0, 0.0, 0.0)],
'red': [(0.0, 0.9686274528503418, 0.9686274528503418), (0.125,
0.87843137979507446, 0.87843137979507446), (0.25,
0.74901962280273438, 0.74901962280273438), (0.375,
0.61960786581039429, 0.61960786581039429), (0.5,
0.54901963472366333, 0.54901963472366333), (0.625,
0.54901963472366333, 0.54901963472366333), (0.75,
0.53333336114883423, 0.53333336114883423), (0.875,
0.5058823823928833, 0.5058823823928833), (1.0,
0.30196079611778259, 0.30196079611778259)]}
_Dark2_data = {'blue': [(0.0, 0.46666666865348816,
0.46666666865348816), (0.14285714285714285, 0.0078431377187371254,
0.0078431377187371254), (0.2857142857142857, 0.70196080207824707,
0.70196080207824707), (0.42857142857142855, 0.54117649793624878,
0.54117649793624878), (0.5714285714285714, 0.11764705926179886,
0.11764705926179886), (0.7142857142857143, 0.0078431377187371254,
0.0078431377187371254), (0.8571428571428571, 0.11372549086809158,
0.11372549086809158), (1.0, 0.40000000596046448,
0.40000000596046448)],
'green': [(0.0, 0.61960786581039429, 0.61960786581039429),
(0.14285714285714285, 0.37254902720451355, 0.37254902720451355),
(0.2857142857142857, 0.43921568989753723, 0.43921568989753723),
(0.42857142857142855, 0.16078431904315948, 0.16078431904315948),
(0.5714285714285714, 0.65098041296005249, 0.65098041296005249),
(0.7142857142857143, 0.67058825492858887, 0.67058825492858887),
(0.8571428571428571, 0.46274510025978088, 0.46274510025978088),
(1.0, 0.40000000596046448, 0.40000000596046448)],
'red': [(0.0, 0.10588235408067703, 0.10588235408067703),
(0.14285714285714285, 0.85098040103912354, 0.85098040103912354),
(0.2857142857142857, 0.45882353186607361, 0.45882353186607361),
(0.42857142857142855, 0.90588235855102539, 0.90588235855102539),
(0.5714285714285714, 0.40000000596046448, 0.40000000596046448),
(0.7142857142857143, 0.90196079015731812, 0.90196079015731812),
(0.8571428571428571, 0.65098041296005249, 0.65098041296005249),
(1.0, 0.40000000596046448, 0.40000000596046448)]}
_GnBu_data = {'blue': [(0.0, 0.94117647409439087,
0.94117647409439087), (0.125, 0.85882353782653809,
0.85882353782653809), (0.25, 0.77254903316497803,
0.77254903316497803), (0.375, 0.70980393886566162,
0.70980393886566162), (0.5, 0.76862746477127075, 0.76862746477127075),
(0.625, 0.82745099067687988, 0.82745099067687988), (0.75,
0.7450980544090271, 0.7450980544090271), (0.875, 0.67450982332229614,
0.67450982332229614), (1.0, 0.5058823823928833, 0.5058823823928833)],
'green': [(0.0, 0.98823529481887817, 0.98823529481887817), (0.125,
0.9529411792755127, 0.9529411792755127), (0.25,
0.92156863212585449, 0.92156863212585449), (0.375,
0.86666667461395264, 0.86666667461395264), (0.5,
0.80000001192092896, 0.80000001192092896), (0.625,
0.70196080207824707, 0.70196080207824707), (0.75,
0.54901963472366333, 0.54901963472366333), (0.875,
0.40784314274787903, 0.40784314274787903), (1.0,
0.25098040699958801, 0.25098040699958801)],
'red': [(0.0, 0.9686274528503418, 0.9686274528503418), (0.125,
0.87843137979507446, 0.87843137979507446), (0.25,
0.80000001192092896, 0.80000001192092896), (0.375,
0.65882354974746704, 0.65882354974746704), (0.5,
0.48235294222831726, 0.48235294222831726), (0.625,
0.30588236451148987, 0.30588236451148987), (0.75,
0.16862745583057404, 0.16862745583057404), (0.875,
0.031372550874948502, 0.031372550874948502), (1.0,
0.031372550874948502, 0.031372550874948502)]}
_Greens_data = {'blue': [(0.0, 0.96078431606292725,
0.96078431606292725), (0.125, 0.87843137979507446,
0.87843137979507446), (0.25, 0.75294119119644165,
0.75294119119644165), (0.375, 0.60784316062927246,
0.60784316062927246), (0.5, 0.46274510025978088, 0.46274510025978088),
(0.625, 0.364705890417099, 0.364705890417099), (0.75,
0.27058824896812439, 0.27058824896812439), (0.875,
0.17254902422428131, 0.17254902422428131), (1.0, 0.10588235408067703,
0.10588235408067703)],
'green': [(0.0, 0.98823529481887817, 0.98823529481887817), (0.125,
0.96078431606292725, 0.96078431606292725), (0.25,
0.91372549533843994, 0.91372549533843994), (0.375,
0.85098040103912354, 0.85098040103912354), (0.5,
0.76862746477127075, 0.76862746477127075), (0.625,
0.67058825492858887, 0.67058825492858887), (0.75,
0.54509806632995605, 0.54509806632995605), (0.875,
0.42745098471641541, 0.42745098471641541), (1.0,
0.26666668057441711, 0.26666668057441711)],
'red': [(0.0, 0.9686274528503418, 0.9686274528503418), (0.125,
0.89803922176361084, 0.89803922176361084), (0.25,
0.78039216995239258, 0.78039216995239258), (0.375,
0.63137257099151611, 0.63137257099151611), (0.5,
0.45490196347236633, 0.45490196347236633), (0.625,
0.25490197539329529, 0.25490197539329529), (0.75,
0.13725490868091583, 0.13725490868091583), (0.875, 0.0, 0.0),
(1.0, 0.0, 0.0)]}
_Greys_data = {'blue': [(0.0, 1.0, 1.0), (0.125, 0.94117647409439087,
0.94117647409439087), (0.25, 0.85098040103912354,
0.85098040103912354), (0.375, 0.74117648601531982,
0.74117648601531982), (0.5, 0.58823531866073608, 0.58823531866073608),
(0.625, 0.45098039507865906, 0.45098039507865906), (0.75,
0.32156863808631897, 0.32156863808631897), (0.875,
0.14509804546833038, 0.14509804546833038), (1.0, 0.0, 0.0)],
'green': [(0.0, 1.0, 1.0), (0.125, 0.94117647409439087,
0.94117647409439087), (0.25, 0.85098040103912354,
0.85098040103912354), (0.375, 0.74117648601531982,
0.74117648601531982), (0.5, 0.58823531866073608,
0.58823531866073608), (0.625, 0.45098039507865906,
0.45098039507865906), (0.75, 0.32156863808631897,
0.32156863808631897), (0.875, 0.14509804546833038,
0.14509804546833038), (1.0, 0.0, 0.0)],
'red': [(0.0, 1.0, 1.0), (0.125, 0.94117647409439087,
0.94117647409439087), (0.25, 0.85098040103912354,
0.85098040103912354), (0.375, 0.74117648601531982,
0.74117648601531982), (0.5, 0.58823531866073608,
0.58823531866073608), (0.625, 0.45098039507865906,
0.45098039507865906), (0.75, 0.32156863808631897,
0.32156863808631897), (0.875, 0.14509804546833038,
0.14509804546833038), (1.0, 0.0, 0.0)]}
_Oranges_data = {'blue': [(0.0, 0.92156863212585449,
0.92156863212585449), (0.125, 0.80784314870834351,
0.80784314870834351), (0.25, 0.63529413938522339,
0.63529413938522339), (0.375, 0.41960784792900085,
0.41960784792900085), (0.5, 0.23529411852359772, 0.23529411852359772),
(0.625, 0.074509806931018829, 0.074509806931018829), (0.75,
0.0039215688593685627, 0.0039215688593685627), (0.875,
0.011764706112444401, 0.011764706112444401), (1.0,
0.015686275437474251, 0.015686275437474251)],
'green': [(0.0, 0.96078431606292725, 0.96078431606292725), (0.125,
0.90196079015731812, 0.90196079015731812), (0.25,
0.81568628549575806, 0.81568628549575806), (0.375,
0.68235296010971069, 0.68235296010971069), (0.5,
0.55294120311737061, 0.55294120311737061), (0.625,
0.4117647111415863, 0.4117647111415863), (0.75,
0.28235295414924622, 0.28235295414924622), (0.875,
0.21176470816135406, 0.21176470816135406), (1.0,
0.15294118225574493, 0.15294118225574493)],
'red': [(0.0, 1.0, 1.0), (0.125, 0.99607843160629272,
0.99607843160629272), (0.25, 0.99215686321258545,
0.99215686321258545), (0.375, 0.99215686321258545,
0.99215686321258545), (0.5, 0.99215686321258545,
0.99215686321258545), (0.625, 0.94509804248809814,
0.94509804248809814), (0.75, 0.85098040103912354,
0.85098040103912354), (0.875, 0.65098041296005249,
0.65098041296005249), (1.0, 0.49803921580314636,
0.49803921580314636)]}
_OrRd_data = {'blue': [(0.0, 0.92549020051956177,
0.92549020051956177), (0.125, 0.78431373834609985,
0.78431373834609985), (0.25, 0.61960786581039429,
0.61960786581039429), (0.375, 0.51764708757400513,
0.51764708757400513), (0.5, 0.3490196168422699, 0.3490196168422699),
(0.625, 0.28235295414924622, 0.28235295414924622), (0.75,
0.12156862765550613, 0.12156862765550613), (0.875, 0.0, 0.0), (1.0,
0.0, 0.0)],
'green': [(0.0, 0.9686274528503418, 0.9686274528503418), (0.125,
0.90980392694473267, 0.90980392694473267), (0.25,
0.83137255907058716, 0.83137255907058716), (0.375,
0.73333334922790527, 0.73333334922790527), (0.5,
0.55294120311737061, 0.55294120311737061), (0.625,
0.3960784375667572, 0.3960784375667572), (0.75,
0.18823529779911041, 0.18823529779911041), (0.875, 0.0, 0.0),
(1.0, 0.0, 0.0)],
'red': [(0.0, 1.0, 1.0), (0.125, 0.99607843160629272,
0.99607843160629272), (0.25, 0.99215686321258545,
0.99215686321258545), (0.375, 0.99215686321258545,
0.99215686321258545), (0.5, 0.98823529481887817,
0.98823529481887817), (0.625, 0.93725490570068359,
0.93725490570068359), (0.75, 0.84313726425170898,
0.84313726425170898), (0.875, 0.70196080207824707,
0.70196080207824707), (1.0, 0.49803921580314636,
0.49803921580314636)]}
_Paired_data = {'blue': [(0.0, 0.89019608497619629,
0.89019608497619629), (0.090909090909090912, 0.70588237047195435,
0.70588237047195435), (0.18181818181818182, 0.54117649793624878,
0.54117649793624878), (0.27272727272727271, 0.17254902422428131,
0.17254902422428131), (0.36363636363636365, 0.60000002384185791,
0.60000002384185791), (0.45454545454545453, 0.10980392247438431,
0.10980392247438431), (0.54545454545454541, 0.43529412150382996,
0.43529412150382996), (0.63636363636363635, 0.0, 0.0),
(0.72727272727272729, 0.83921569585800171, 0.83921569585800171),
(0.81818181818181823, 0.60392159223556519, 0.60392159223556519),
(0.90909090909090906, 0.60000002384185791, 0.60000002384185791), (1.0,
0.15686275064945221, 0.15686275064945221)],
'green': [(0.0, 0.80784314870834351, 0.80784314870834351),
(0.090909090909090912, 0.47058823704719543, 0.47058823704719543),
(0.18181818181818182, 0.87450981140136719, 0.87450981140136719),
(0.27272727272727271, 0.62745100259780884, 0.62745100259780884),
(0.36363636363636365, 0.60392159223556519, 0.60392159223556519),
(0.45454545454545453, 0.10196078568696976, 0.10196078568696976),
(0.54545454545454541, 0.74901962280273438, 0.74901962280273438),
(0.63636363636363635, 0.49803921580314636, 0.49803921580314636),
(0.72727272727272729, 0.69803923368453979, 0.69803923368453979),
(0.81818181818181823, 0.23921568691730499, 0.23921568691730499),
(0.90909090909090906, 1.0, 1.0), (1.0, 0.3490196168422699,
0.3490196168422699)],
'red': [(0.0, 0.65098041296005249, 0.65098041296005249),
(0.090909090909090912, 0.12156862765550613, 0.12156862765550613),
(0.18181818181818182, 0.69803923368453979, 0.69803923368453979),
(0.27272727272727271, 0.20000000298023224, 0.20000000298023224),
(0.36363636363636365, 0.9843137264251709, 0.9843137264251709),
(0.45454545454545453, 0.89019608497619629, 0.89019608497619629),
(0.54545454545454541, 0.99215686321258545, 0.99215686321258545),
(0.63636363636363635, 1.0, 1.0), (0.72727272727272729,
0.7921568751335144, 0.7921568751335144), (0.81818181818181823,
0.41568627953529358, 0.41568627953529358), (0.90909090909090906,
1.0, 1.0), (1.0, 0.69411766529083252, 0.69411766529083252)]}
_Pastel1_data = {'blue': [(0.0, 0.68235296010971069,
0.68235296010971069), (0.125, 0.89019608497619629,
0.89019608497619629), (0.25, 0.77254903316497803,
0.77254903316497803), (0.375, 0.89411765336990356,
0.89411765336990356), (0.5, 0.65098041296005249, 0.65098041296005249),
(0.625, 0.80000001192092896, 0.80000001192092896), (0.75,
0.74117648601531982, 0.74117648601531982), (0.875,
0.92549020051956177, 0.92549020051956177), (1.0, 0.94901961088180542,
0.94901961088180542)],
'green': [(0.0, 0.70588237047195435, 0.70588237047195435), (0.125,
0.80392158031463623, 0.80392158031463623), (0.25,
0.92156863212585449, 0.92156863212585449), (0.375,
0.79607844352722168, 0.79607844352722168), (0.5,
0.85098040103912354, 0.85098040103912354), (0.625, 1.0, 1.0),
(0.75, 0.84705883264541626, 0.84705883264541626), (0.875,
0.85490196943283081, 0.85490196943283081), (1.0,
0.94901961088180542, 0.94901961088180542)],
'red': [(0.0, 0.9843137264251709, 0.9843137264251709), (0.125,
0.70196080207824707, 0.70196080207824707), (0.25,
0.80000001192092896, 0.80000001192092896), (0.375,
0.87058824300765991, 0.87058824300765991), (0.5,
0.99607843160629272, 0.99607843160629272), (0.625, 1.0, 1.0),
(0.75, 0.89803922176361084, 0.89803922176361084), (0.875,
0.99215686321258545, 0.99215686321258545), (1.0,
0.94901961088180542, 0.94901961088180542)]}
_Pastel2_data = {'blue': [(0.0, 0.80392158031463623,
0.80392158031463623), (0.14285714285714285, 0.67450982332229614,
0.67450982332229614), (0.2857142857142857, 0.90980392694473267,
0.90980392694473267), (0.42857142857142855, 0.89411765336990356,
0.89411765336990356), (0.5714285714285714, 0.78823530673980713,
0.78823530673980713), (0.7142857142857143, 0.68235296010971069,
0.68235296010971069), (0.8571428571428571, 0.80000001192092896,
0.80000001192092896), (1.0, 0.80000001192092896,
0.80000001192092896)],
'green': [(0.0, 0.88627451658248901, 0.88627451658248901),
(0.14285714285714285, 0.80392158031463623, 0.80392158031463623),
(0.2857142857142857, 0.83529412746429443, 0.83529412746429443),
(0.42857142857142855, 0.7921568751335144, 0.7921568751335144),
(0.5714285714285714, 0.96078431606292725, 0.96078431606292725),
(0.7142857142857143, 0.94901961088180542, 0.94901961088180542),
(0.8571428571428571, 0.88627451658248901, 0.88627451658248901),
(1.0, 0.80000001192092896, 0.80000001192092896)],
'red': [(0.0, 0.70196080207824707, 0.70196080207824707),
(0.14285714285714285, 0.99215686321258545, 0.99215686321258545),
(0.2857142857142857, 0.79607844352722168, 0.79607844352722168),
(0.42857142857142855, 0.95686274766921997, 0.95686274766921997),
(0.5714285714285714, 0.90196079015731812, 0.90196079015731812),
(0.7142857142857143, 1.0, 1.0), (0.8571428571428571,
0.94509804248809814, 0.94509804248809814), (1.0,
0.80000001192092896, 0.80000001192092896)]}
_PiYG_data = {'blue': [(0.0, 0.32156863808631897,
0.32156863808631897), (0.10000000000000001, 0.49019607901573181,
0.49019607901573181), (0.20000000000000001, 0.68235296010971069,
0.68235296010971069), (0.29999999999999999, 0.85490196943283081,
0.85490196943283081), (0.40000000000000002, 0.93725490570068359,
0.93725490570068359), (0.5, 0.9686274528503418, 0.9686274528503418),
(0.59999999999999998, 0.81568628549575806, 0.81568628549575806),
(0.69999999999999996, 0.52549022436141968, 0.52549022436141968),
(0.80000000000000004, 0.25490197539329529, 0.25490197539329529),
(0.90000000000000002, 0.12941177189350128, 0.12941177189350128), (1.0,
0.098039217293262482, 0.098039217293262482)],
'green': [(0.0, 0.0039215688593685627, 0.0039215688593685627),
(0.10000000000000001, 0.10588235408067703, 0.10588235408067703),
(0.20000000000000001, 0.46666666865348816, 0.46666666865348816),
(0.29999999999999999, 0.7137255072593689, 0.7137255072593689),
(0.40000000000000002, 0.87843137979507446, 0.87843137979507446),
(0.5, 0.9686274528503418, 0.9686274528503418),
(0.59999999999999998, 0.96078431606292725, 0.96078431606292725),
(0.69999999999999996, 0.88235294818878174, 0.88235294818878174),
(0.80000000000000004, 0.73725491762161255, 0.73725491762161255),
(0.90000000000000002, 0.57254904508590698, 0.57254904508590698),
(1.0, 0.39215686917304993, 0.39215686917304993)],
'red': [(0.0, 0.55686277151107788, 0.55686277151107788),
(0.10000000000000001, 0.77254903316497803, 0.77254903316497803),
(0.20000000000000001, 0.87058824300765991, 0.87058824300765991),
(0.29999999999999999, 0.94509804248809814, 0.94509804248809814),
(0.40000000000000002, 0.99215686321258545, 0.99215686321258545),
(0.5, 0.9686274528503418, 0.9686274528503418),
(0.59999999999999998, 0.90196079015731812, 0.90196079015731812),
(0.69999999999999996, 0.72156864404678345, 0.72156864404678345),
(0.80000000000000004, 0.49803921580314636, 0.49803921580314636),
(0.90000000000000002, 0.30196079611778259, 0.30196079611778259),
(1.0, 0.15294118225574493, 0.15294118225574493)]}
_PRGn_data = {'blue': [(0.0, 0.29411765933036804,
0.29411765933036804), (0.10000000000000001, 0.51372551918029785,
0.51372551918029785), (0.20000000000000001, 0.67058825492858887,
0.67058825492858887), (0.29999999999999999, 0.81176471710205078,
0.81176471710205078), (0.40000000000000002, 0.90980392694473267,
0.90980392694473267), (0.5, 0.9686274528503418, 0.9686274528503418),
(0.59999999999999998, 0.82745099067687988, 0.82745099067687988),
(0.69999999999999996, 0.62745100259780884, 0.62745100259780884),
(0.80000000000000004, 0.3803921639919281, 0.3803921639919281),
(0.90000000000000002, 0.21568627655506134, 0.21568627655506134), (1.0,
0.10588235408067703, 0.10588235408067703)],
'green': [(0.0, 0.0, 0.0), (0.10000000000000001,
0.16470588743686676, 0.16470588743686676), (0.20000000000000001,
0.43921568989753723, 0.43921568989753723), (0.29999999999999999,
0.64705884456634521, 0.64705884456634521), (0.40000000000000002,
0.83137255907058716, 0.83137255907058716), (0.5,
0.9686274528503418, 0.9686274528503418), (0.59999999999999998,
0.94117647409439087, 0.94117647409439087), (0.69999999999999996,
0.85882353782653809, 0.85882353782653809), (0.80000000000000004,
0.68235296010971069, 0.68235296010971069), (0.90000000000000002,
0.47058823704719543, 0.47058823704719543), (1.0,
0.26666668057441711, 0.26666668057441711)],
'red': [(0.0, 0.25098040699958801, 0.25098040699958801),
(0.10000000000000001, 0.46274510025978088, 0.46274510025978088),
(0.20000000000000001, 0.60000002384185791, 0.60000002384185791),
(0.29999999999999999, 0.7607843279838562, 0.7607843279838562),
(0.40000000000000002, 0.90588235855102539, 0.90588235855102539),
(0.5, 0.9686274528503418, 0.9686274528503418),
(0.59999999999999998, 0.85098040103912354, 0.85098040103912354),
(0.69999999999999996, 0.65098041296005249, 0.65098041296005249),
(0.80000000000000004, 0.35294118523597717, 0.35294118523597717),
(0.90000000000000002, 0.10588235408067703, 0.10588235408067703),
(1.0, 0.0, 0.0)]}
_PuBu_data = {'blue': [(0.0, 0.9843137264251709, 0.9843137264251709),
(0.125, 0.94901961088180542, 0.94901961088180542), (0.25,
0.90196079015731812, 0.90196079015731812), (0.375,
0.85882353782653809, 0.85882353782653809), (0.5, 0.81176471710205078,
0.81176471710205078), (0.625, 0.75294119119644165,
0.75294119119644165), (0.75, 0.69019609689712524,
0.69019609689712524), (0.875, 0.55294120311737061,
0.55294120311737061), (1.0, 0.34509804844856262,
0.34509804844856262)],
'green': [(0.0, 0.9686274528503418, 0.9686274528503418), (0.125,
0.90588235855102539, 0.90588235855102539), (0.25,
0.81960785388946533, 0.81960785388946533), (0.375,
0.74117648601531982, 0.74117648601531982), (0.5,
0.66274511814117432, 0.66274511814117432), (0.625,
0.56470590829849243, 0.56470590829849243), (0.75,
0.43921568989753723, 0.43921568989753723), (0.875,
0.35294118523597717, 0.35294118523597717), (1.0,
0.21960784494876862, 0.21960784494876862)],
'red': [(0.0, 1.0, 1.0), (0.125, 0.92549020051956177,
0.92549020051956177), (0.25, 0.81568628549575806,
0.81568628549575806), (0.375, 0.65098041296005249,
0.65098041296005249), (0.5, 0.45490196347236633,
0.45490196347236633), (0.625, 0.21176470816135406,
0.21176470816135406), (0.75, 0.019607843831181526,
0.019607843831181526), (0.875, 0.015686275437474251,
0.015686275437474251), (1.0, 0.0078431377187371254,
0.0078431377187371254)]}
_PuBuGn_data = {'blue': [(0.0, 0.9843137264251709,
0.9843137264251709), (0.125, 0.94117647409439087,
0.94117647409439087), (0.25, 0.90196079015731812,
0.90196079015731812), (0.375, 0.85882353782653809,
0.85882353782653809), (0.5, 0.81176471710205078, 0.81176471710205078),
(0.625, 0.75294119119644165, 0.75294119119644165), (0.75,
0.54117649793624878, 0.54117649793624878), (0.875, 0.3490196168422699,
0.3490196168422699), (1.0, 0.21176470816135406, 0.21176470816135406)],
'green': [(0.0, 0.9686274528503418, 0.9686274528503418), (0.125,
0.88627451658248901, 0.88627451658248901), (0.25,
0.81960785388946533, 0.81960785388946533), (0.375,
0.74117648601531982, 0.74117648601531982), (0.5,
0.66274511814117432, 0.66274511814117432), (0.625,
0.56470590829849243, 0.56470590829849243), (0.75,
0.5058823823928833, 0.5058823823928833), (0.875,
0.42352941632270813, 0.42352941632270813), (1.0,
0.27450981736183167, 0.27450981736183167)],
'red': [(0.0, 1.0, 1.0), (0.125, 0.92549020051956177,
0.92549020051956177), (0.25, 0.81568628549575806,
0.81568628549575806), (0.375, 0.65098041296005249,
0.65098041296005249), (0.5, 0.40392157435417175,
0.40392157435417175), (0.625, 0.21176470816135406,
0.21176470816135406), (0.75, 0.0078431377187371254,
0.0078431377187371254), (0.875, 0.0039215688593685627,
0.0039215688593685627), (1.0, 0.0039215688593685627,
0.0039215688593685627)]}
_PuOr_data = {'blue': [(0.0, 0.031372550874948502,
0.031372550874948502), (0.10000000000000001, 0.023529412224888802,
0.023529412224888802), (0.20000000000000001, 0.078431375324726105,
0.078431375324726105), (0.29999999999999999, 0.38823530077934265,
0.38823530077934265), (0.40000000000000002, 0.7137255072593689,
0.7137255072593689), (0.5, 0.9686274528503418, 0.9686274528503418),
(0.59999999999999998, 0.92156863212585449, 0.92156863212585449),
(0.69999999999999996, 0.82352942228317261, 0.82352942228317261),
(0.80000000000000004, 0.67450982332229614, 0.67450982332229614),
(0.90000000000000002, 0.53333336114883423, 0.53333336114883423), (1.0,
0.29411765933036804, 0.29411765933036804)],
'green': [(0.0, 0.23137255012989044, 0.23137255012989044),
(0.10000000000000001, 0.34509804844856262, 0.34509804844856262),
(0.20000000000000001, 0.50980395078659058, 0.50980395078659058),
(0.29999999999999999, 0.72156864404678345, 0.72156864404678345),
(0.40000000000000002, 0.87843137979507446, 0.87843137979507446),
(0.5, 0.9686274528503418, 0.9686274528503418),
(0.59999999999999998, 0.85490196943283081, 0.85490196943283081),
(0.69999999999999996, 0.67058825492858887, 0.67058825492858887),
(0.80000000000000004, 0.45098039507865906, 0.45098039507865906),
(0.90000000000000002, 0.15294118225574493, 0.15294118225574493),
(1.0, 0.0, 0.0)],
'red': [(0.0, 0.49803921580314636, 0.49803921580314636),
(0.10000000000000001, 0.70196080207824707, 0.70196080207824707),
(0.20000000000000001, 0.87843137979507446, 0.87843137979507446),
(0.29999999999999999, 0.99215686321258545, 0.99215686321258545),
(0.40000000000000002, 0.99607843160629272, 0.99607843160629272),
(0.5, 0.9686274528503418, 0.9686274528503418),
(0.59999999999999998, 0.84705883264541626, 0.84705883264541626),
(0.69999999999999996, 0.69803923368453979, 0.69803923368453979),
(0.80000000000000004, 0.50196081399917603, 0.50196081399917603),
(0.90000000000000002, 0.32941177487373352, 0.32941177487373352),
(1.0, 0.17647059261798859, 0.17647059261798859)]}
_PuRd_data = {'blue': [(0.0, 0.97647058963775635,
0.97647058963775635), (0.125, 0.93725490570068359,
0.93725490570068359), (0.25, 0.85490196943283081,
0.85490196943283081), (0.375, 0.78039216995239258,
0.78039216995239258), (0.5, 0.69019609689712524, 0.69019609689712524),
(0.625, 0.54117649793624878, 0.54117649793624878), (0.75,
0.33725491166114807, 0.33725491166114807), (0.875,
0.26274511218070984, 0.26274511218070984), (1.0, 0.12156862765550613,
0.12156862765550613)],
'green': [(0.0, 0.95686274766921997, 0.95686274766921997), (0.125,
0.88235294818878174, 0.88235294818878174), (0.25,
0.72549021244049072, 0.72549021244049072), (0.375,
0.58039218187332153, 0.58039218187332153), (0.5,
0.3960784375667572, 0.3960784375667572), (0.625,
0.16078431904315948, 0.16078431904315948), (0.75,
0.070588238537311554, 0.070588238537311554), (0.875, 0.0, 0.0),
(1.0, 0.0, 0.0)],
'red': [(0.0, 0.9686274528503418, 0.9686274528503418), (0.125,
0.90588235855102539, 0.90588235855102539), (0.25,
0.83137255907058716, 0.83137255907058716), (0.375,
0.78823530673980713, 0.78823530673980713), (0.5,
0.87450981140136719, 0.87450981140136719), (0.625,
0.90588235855102539, 0.90588235855102539), (0.75,
0.80784314870834351, 0.80784314870834351), (0.875,
0.59607845544815063, 0.59607845544815063), (1.0,
0.40392157435417175, 0.40392157435417175)]}
_Purples_data = {'blue': [(0.0, 0.99215686321258545,
0.99215686321258545), (0.125, 0.96078431606292725,
0.96078431606292725), (0.25, 0.92156863212585449,
0.92156863212585449), (0.375, 0.86274510622024536,
0.86274510622024536), (0.5, 0.78431373834609985, 0.78431373834609985),
(0.625, 0.729411780834198, 0.729411780834198), (0.75,
0.63921570777893066, 0.63921570777893066), (0.875,
0.56078433990478516, 0.56078433990478516), (1.0, 0.49019607901573181,
0.49019607901573181)],
'green': [(0.0, 0.9843137264251709, 0.9843137264251709), (0.125,
0.92941176891326904, 0.92941176891326904), (0.25,
0.85490196943283081, 0.85490196943283081), (0.375,
0.74117648601531982, 0.74117648601531982), (0.5,
0.60392159223556519, 0.60392159223556519), (0.625,
0.49019607901573181, 0.49019607901573181), (0.75,
0.31764706969261169, 0.31764706969261169), (0.875,
0.15294118225574493, 0.15294118225574493), (1.0, 0.0, 0.0)],
'red': [(0.0, 0.98823529481887817, 0.98823529481887817), (0.125,
0.93725490570068359, 0.93725490570068359), (0.25,
0.85490196943283081, 0.85490196943283081), (0.375,
0.73725491762161255, 0.73725491762161255), (0.5,
0.61960786581039429, 0.61960786581039429), (0.625,
0.50196081399917603, 0.50196081399917603), (0.75,
0.41568627953529358, 0.41568627953529358), (0.875,
0.32941177487373352, 0.32941177487373352), (1.0,
0.24705882370471954, 0.24705882370471954)]}
_RdBu_data = {'blue': [(0.0, 0.12156862765550613,
0.12156862765550613), (0.10000000000000001, 0.16862745583057404,
0.16862745583057404), (0.20000000000000001, 0.30196079611778259,
0.30196079611778259), (0.29999999999999999, 0.50980395078659058,
0.50980395078659058), (0.40000000000000002, 0.78039216995239258,
0.78039216995239258), (0.5, 0.9686274528503418, 0.9686274528503418),
(0.59999999999999998, 0.94117647409439087, 0.94117647409439087),
(0.69999999999999996, 0.87058824300765991, 0.87058824300765991),
(0.80000000000000004, 0.76470589637756348, 0.76470589637756348),
(0.90000000000000002, 0.67450982332229614, 0.67450982332229614), (1.0,
0.3803921639919281, 0.3803921639919281)],
'green': [(0.0, 0.0, 0.0), (0.10000000000000001,
0.094117648899555206, 0.094117648899555206), (0.20000000000000001,
0.37647059559822083, 0.37647059559822083), (0.29999999999999999,
0.64705884456634521, 0.64705884456634521), (0.40000000000000002,
0.85882353782653809, 0.85882353782653809), (0.5,
0.9686274528503418, 0.9686274528503418), (0.59999999999999998,
0.89803922176361084, 0.89803922176361084), (0.69999999999999996,
0.77254903316497803, 0.77254903316497803), (0.80000000000000004,
0.57647061347961426, 0.57647061347961426), (0.90000000000000002,
0.40000000596046448, 0.40000000596046448), (1.0,
0.18823529779911041, 0.18823529779911041)],
'red': [(0.0, 0.40392157435417175, 0.40392157435417175),
(0.10000000000000001, 0.69803923368453979, 0.69803923368453979),
(0.20000000000000001, 0.83921569585800171, 0.83921569585800171),
(0.29999999999999999, 0.95686274766921997, 0.95686274766921997),
(0.40000000000000002, 0.99215686321258545, 0.99215686321258545),
(0.5, 0.9686274528503418, 0.9686274528503418),
(0.59999999999999998, 0.81960785388946533, 0.81960785388946533),
(0.69999999999999996, 0.57254904508590698, 0.57254904508590698),
(0.80000000000000004, 0.26274511218070984, 0.26274511218070984),
(0.90000000000000002, 0.12941177189350128, 0.12941177189350128),
(1.0, 0.019607843831181526, 0.019607843831181526)]}
_RdGy_data = {'blue': [(0.0, 0.12156862765550613,
0.12156862765550613), (0.10000000000000001, 0.16862745583057404,
0.16862745583057404), (0.20000000000000001, 0.30196079611778259,
0.30196079611778259), (0.29999999999999999, 0.50980395078659058,
0.50980395078659058), (0.40000000000000002, 0.78039216995239258,
0.78039216995239258), (0.5, 1.0, 1.0), (0.59999999999999998,
0.87843137979507446, 0.87843137979507446), (0.69999999999999996,
0.729411780834198, 0.729411780834198), (0.80000000000000004,
0.52941179275512695, 0.52941179275512695), (0.90000000000000002,
0.30196079611778259, 0.30196079611778259), (1.0, 0.10196078568696976,
0.10196078568696976)],
'green': [(0.0, 0.0, 0.0), (0.10000000000000001,
0.094117648899555206, 0.094117648899555206), (0.20000000000000001,
0.37647059559822083, 0.37647059559822083), (0.29999999999999999,
0.64705884456634521, 0.64705884456634521), (0.40000000000000002,
0.85882353782653809, 0.85882353782653809), (0.5, 1.0, 1.0),
(0.59999999999999998, 0.87843137979507446, 0.87843137979507446),
(0.69999999999999996, 0.729411780834198, 0.729411780834198),
(0.80000000000000004, 0.52941179275512695, 0.52941179275512695),
(0.90000000000000002, 0.30196079611778259, 0.30196079611778259),
(1.0, 0.10196078568696976, 0.10196078568696976)],
'red': [(0.0, 0.40392157435417175, 0.40392157435417175),
(0.10000000000000001, 0.69803923368453979, 0.69803923368453979),
(0.20000000000000001, 0.83921569585800171, 0.83921569585800171),
(0.29999999999999999, 0.95686274766921997, 0.95686274766921997),
(0.40000000000000002, 0.99215686321258545, 0.99215686321258545),
(0.5, 1.0, 1.0), (0.59999999999999998, 0.87843137979507446,
0.87843137979507446), (0.69999999999999996, 0.729411780834198,
0.729411780834198), (0.80000000000000004, 0.52941179275512695,
0.52941179275512695), (0.90000000000000002, 0.30196079611778259,
0.30196079611778259), (1.0, 0.10196078568696976,
0.10196078568696976)]}
_RdPu_data = {'blue': [(0.0, 0.9529411792755127, 0.9529411792755127),
(0.125, 0.86666667461395264, 0.86666667461395264), (0.25,
0.75294119119644165, 0.75294119119644165), (0.375,
0.70980393886566162, 0.70980393886566162), (0.5, 0.63137257099151611,
0.63137257099151611), (0.625, 0.59215688705444336,
0.59215688705444336), (0.75, 0.49411764740943909,
0.49411764740943909), (0.875, 0.46666666865348816,
0.46666666865348816), (1.0, 0.41568627953529358,
0.41568627953529358)],
'green': [(0.0, 0.9686274528503418, 0.9686274528503418), (0.125,
0.87843137979507446, 0.87843137979507446), (0.25,
0.77254903316497803, 0.77254903316497803), (0.375,
0.62352943420410156, 0.62352943420410156), (0.5,
0.40784314274787903, 0.40784314274787903), (0.625,
0.20392157137393951, 0.20392157137393951), (0.75,
0.0039215688593685627, 0.0039215688593685627), (0.875,
0.0039215688593685627, 0.0039215688593685627), (1.0, 0.0, 0.0)],
'red': [(0.0, 1.0, 1.0), (0.125, 0.99215686321258545,
0.99215686321258545), (0.25, 0.98823529481887817,
0.98823529481887817), (0.375, 0.98039215803146362,
0.98039215803146362), (0.5, 0.9686274528503418,
0.9686274528503418), (0.625, 0.86666667461395264,
0.86666667461395264), (0.75, 0.68235296010971069,
0.68235296010971069), (0.875, 0.47843137383460999,
0.47843137383460999), (1.0, 0.28627452254295349,
0.28627452254295349)]}
_RdYlBu_data = {'blue': [(0.0, 0.14901961386203766,
0.14901961386203766), (0.10000000149011612,
0.15294118225574493, 0.15294118225574493),
(0.20000000298023224, 0.26274511218070984,
0.26274511218070984), (0.30000001192092896,
0.3803921639919281, 0.3803921639919281),
(0.40000000596046448, 0.56470590829849243,
0.56470590829849243), (0.5, 0.74901962280273438,
0.74901962280273438), (0.60000002384185791,
0.97254902124404907, 0.97254902124404907),
(0.69999998807907104, 0.91372549533843994,
0.91372549533843994), (0.80000001192092896,
0.81960785388946533, 0.81960785388946533),
(0.89999997615814209, 0.70588237047195435,
0.70588237047195435), (1.0, 0.58431375026702881,
0.58431375026702881)], 'green': [(0.0, 0.0, 0.0),
(0.10000000149011612, 0.18823529779911041,
0.18823529779911041), (0.20000000298023224,
0.42745098471641541, 0.42745098471641541),
(0.30000001192092896, 0.68235296010971069,
0.68235296010971069), (0.40000000596046448,
0.87843137979507446, 0.87843137979507446), (0.5, 1.0,
1.0), (0.60000002384185791, 0.9529411792755127,
0.9529411792755127), (0.69999998807907104,
0.85098040103912354, 0.85098040103912354),
(0.80000001192092896, 0.67843139171600342,
0.67843139171600342), (0.89999997615814209,
0.45882353186607361, 0.45882353186607361), (1.0,
0.21176470816135406, 0.21176470816135406)], 'red':
[(0.0, 0.64705884456634521, 0.64705884456634521),
(0.10000000149011612, 0.84313726425170898,
0.84313726425170898), (0.20000000298023224,
0.95686274766921997, 0.95686274766921997),
(0.30000001192092896, 0.99215686321258545,
0.99215686321258545), (0.40000000596046448,
0.99607843160629272, 0.99607843160629272), (0.5, 1.0,
1.0), (0.60000002384185791, 0.87843137979507446,
0.87843137979507446), (0.69999998807907104,
0.67058825492858887, 0.67058825492858887),
(0.80000001192092896, 0.45490196347236633,
0.45490196347236633), (0.89999997615814209,
0.27058824896812439, 0.27058824896812439), (1.0,
0.19215686619281769, 0.19215686619281769)]}
_RdYlGn_data = {'blue': [(0.0, 0.14901961386203766,
0.14901961386203766), (0.10000000000000001, 0.15294118225574493,
0.15294118225574493), (0.20000000000000001, 0.26274511218070984,
0.26274511218070984), (0.29999999999999999, 0.3803921639919281,
0.3803921639919281), (0.40000000000000002, 0.54509806632995605,
0.54509806632995605), (0.5, 0.74901962280273438, 0.74901962280273438),
(0.59999999999999998, 0.54509806632995605, 0.54509806632995605),
(0.69999999999999996, 0.41568627953529358, 0.41568627953529358),
(0.80000000000000004, 0.38823530077934265, 0.38823530077934265),
(0.90000000000000002, 0.31372550129890442, 0.31372550129890442), (1.0,
0.21568627655506134, 0.21568627655506134)],
'green': [(0.0, 0.0, 0.0), (0.10000000000000001,
0.18823529779911041, 0.18823529779911041), (0.20000000000000001,
0.42745098471641541, 0.42745098471641541), (0.29999999999999999,
0.68235296010971069, 0.68235296010971069), (0.40000000000000002,
0.87843137979507446, 0.87843137979507446), (0.5, 1.0, 1.0),
(0.59999999999999998, 0.93725490570068359, 0.93725490570068359),
(0.69999999999999996, 0.85098040103912354, 0.85098040103912354),
(0.80000000000000004, 0.74117648601531982, 0.74117648601531982),
(0.90000000000000002, 0.59607845544815063, 0.59607845544815063),
(1.0, 0.40784314274787903, 0.40784314274787903)],
'red': [(0.0, 0.64705884456634521, 0.64705884456634521),
(0.10000000000000001, 0.84313726425170898, 0.84313726425170898),
(0.20000000000000001, 0.95686274766921997, 0.95686274766921997),
(0.29999999999999999, 0.99215686321258545, 0.99215686321258545),
(0.40000000000000002, 0.99607843160629272, 0.99607843160629272),
(0.5, 1.0, 1.0), (0.59999999999999998, 0.85098040103912354,
0.85098040103912354), (0.69999999999999996, 0.65098041296005249,
0.65098041296005249), (0.80000000000000004, 0.40000000596046448,
0.40000000596046448), (0.90000000000000002, 0.10196078568696976,
0.10196078568696976), (1.0, 0.0, 0.0)]}
_Reds_data = {'blue': [(0.0, 0.94117647409439087,
0.94117647409439087), (0.125, 0.82352942228317261,
0.82352942228317261), (0.25, 0.63137257099151611,
0.63137257099151611), (0.375, 0.44705882668495178,
0.44705882668495178), (0.5, 0.29019609093666077, 0.29019609093666077),
(0.625, 0.17254902422428131, 0.17254902422428131), (0.75,
0.11372549086809158, 0.11372549086809158), (0.875,
0.08235294371843338, 0.08235294371843338), (1.0, 0.050980392843484879,
0.050980392843484879)],
'green': [(0.0, 0.96078431606292725, 0.96078431606292725), (0.125,
0.87843137979507446, 0.87843137979507446), (0.25,
0.73333334922790527, 0.73333334922790527), (0.375,
0.57254904508590698, 0.57254904508590698), (0.5,
0.41568627953529358, 0.41568627953529358), (0.625,
0.23137255012989044, 0.23137255012989044), (0.75,
0.094117648899555206, 0.094117648899555206), (0.875,
0.058823529630899429, 0.058823529630899429), (1.0, 0.0, 0.0)],
'red': [(0.0, 1.0, 1.0), (0.125, 0.99607843160629272,
0.99607843160629272), (0.25, 0.98823529481887817,
0.98823529481887817), (0.375, 0.98823529481887817,
0.98823529481887817), (0.5, 0.9843137264251709,
0.9843137264251709), (0.625, 0.93725490570068359,
0.93725490570068359), (0.75, 0.79607844352722168,
0.79607844352722168), (0.875, 0.64705884456634521,
0.64705884456634521), (1.0, 0.40392157435417175,
0.40392157435417175)]}
_Set1_data = {'blue': [(0.0, 0.10980392247438431,
0.10980392247438431), (0.125, 0.72156864404678345,
0.72156864404678345), (0.25, 0.29019609093666077,
0.29019609093666077), (0.375, 0.63921570777893066,
0.63921570777893066), (0.5, 0.0, 0.0), (0.625, 0.20000000298023224,
0.20000000298023224), (0.75, 0.15686275064945221,
0.15686275064945221), (0.875, 0.74901962280273438,
0.74901962280273438), (1.0, 0.60000002384185791,
0.60000002384185791)],
'green': [(0.0, 0.10196078568696976, 0.10196078568696976), (0.125,
0.49411764740943909, 0.49411764740943909), (0.25,
0.68627452850341797, 0.68627452850341797), (0.375,
0.30588236451148987, 0.30588236451148987), (0.5,
0.49803921580314636, 0.49803921580314636), (0.625, 1.0, 1.0),
(0.75, 0.33725491166114807, 0.33725491166114807), (0.875,
0.5058823823928833, 0.5058823823928833), (1.0,
0.60000002384185791, 0.60000002384185791)],
'red': [(0.0, 0.89411765336990356, 0.89411765336990356), (0.125,
0.21568627655506134, 0.21568627655506134), (0.25,
0.30196079611778259, 0.30196079611778259), (0.375,
0.59607845544815063, 0.59607845544815063), (0.5, 1.0, 1.0),
(0.625, 1.0, 1.0), (0.75, 0.65098041296005249,
0.65098041296005249), (0.875, 0.9686274528503418,
0.9686274528503418), (1.0, 0.60000002384185791,
0.60000002384185791)]}
_Set2_data = {'blue': [(0.0, 0.64705884456634521,
0.64705884456634521), (0.14285714285714285, 0.38431373238563538,
0.38431373238563538), (0.2857142857142857, 0.79607844352722168,
0.79607844352722168), (0.42857142857142855, 0.76470589637756348,
0.76470589637756348), (0.5714285714285714, 0.32941177487373352,
0.32941177487373352), (0.7142857142857143, 0.18431372940540314,
0.18431372940540314), (0.8571428571428571, 0.58039218187332153,
0.58039218187332153), (1.0, 0.70196080207824707,
0.70196080207824707)],
'green': [(0.0, 0.7607843279838562, 0.7607843279838562),
(0.14285714285714285, 0.55294120311737061, 0.55294120311737061),
(0.2857142857142857, 0.62745100259780884, 0.62745100259780884),
(0.42857142857142855, 0.54117649793624878, 0.54117649793624878),
(0.5714285714285714, 0.84705883264541626, 0.84705883264541626),
(0.7142857142857143, 0.85098040103912354, 0.85098040103912354),
(0.8571428571428571, 0.76862746477127075, 0.76862746477127075),
(1.0, 0.70196080207824707, 0.70196080207824707)],
'red': [(0.0, 0.40000000596046448, 0.40000000596046448),
(0.14285714285714285, 0.98823529481887817, 0.98823529481887817),
(0.2857142857142857, 0.55294120311737061, 0.55294120311737061),
(0.42857142857142855, 0.90588235855102539, 0.90588235855102539),
(0.5714285714285714, 0.65098041296005249, 0.65098041296005249),
(0.7142857142857143, 1.0, 1.0), (0.8571428571428571,
0.89803922176361084, 0.89803922176361084), (1.0,
0.70196080207824707, 0.70196080207824707)]}
_Set3_data = {'blue': [(0.0, 0.78039216995239258,
0.78039216995239258), (0.090909090909090912, 0.70196080207824707,
0.70196080207824707), (0.18181818181818182, 0.85490196943283081,
0.85490196943283081), (0.27272727272727271, 0.44705882668495178,
0.44705882668495178), (0.36363636363636365, 0.82745099067687988,
0.82745099067687988), (0.45454545454545453, 0.38431373238563538,
0.38431373238563538), (0.54545454545454541, 0.4117647111415863,
0.4117647111415863), (0.63636363636363635, 0.89803922176361084,
0.89803922176361084), (0.72727272727272729, 0.85098040103912354,
0.85098040103912354), (0.81818181818181823, 0.74117648601531982,
0.74117648601531982), (0.90909090909090906, 0.77254903316497803,
0.77254903316497803), (1.0, 0.43529412150382996,
0.43529412150382996)],
'green': [(0.0, 0.82745099067687988, 0.82745099067687988),
(0.090909090909090912, 1.0, 1.0), (0.18181818181818182,
0.729411780834198, 0.729411780834198), (0.27272727272727271,
0.50196081399917603, 0.50196081399917603), (0.36363636363636365,
0.69411766529083252, 0.69411766529083252), (0.45454545454545453,
0.70588237047195435, 0.70588237047195435), (0.54545454545454541,
0.87058824300765991, 0.87058824300765991), (0.63636363636363635,
0.80392158031463623, 0.80392158031463623), (0.72727272727272729,
0.85098040103912354, 0.85098040103912354), (0.81818181818181823,
0.50196081399917603, 0.50196081399917603), (0.90909090909090906,
0.92156863212585449, 0.92156863212585449), (1.0,
0.92941176891326904, 0.92941176891326904)],
'red': [(0.0, 0.55294120311737061, 0.55294120311737061),
(0.090909090909090912, 1.0, 1.0), (0.18181818181818182,
0.7450980544090271, 0.7450980544090271), (0.27272727272727271,
0.9843137264251709, 0.9843137264251709), (0.36363636363636365,
0.50196081399917603, 0.50196081399917603), (0.45454545454545453,
0.99215686321258545, 0.99215686321258545), (0.54545454545454541,
0.70196080207824707, 0.70196080207824707), (0.63636363636363635,
0.98823529481887817, 0.98823529481887817), (0.72727272727272729,
0.85098040103912354, 0.85098040103912354), (0.81818181818181823,
0.73725491762161255, 0.73725491762161255), (0.90909090909090906,
0.80000001192092896, 0.80000001192092896), (1.0, 1.0, 1.0)]}
_Spectral_data = {'blue': [(0.0, 0.25882354378700256,
0.25882354378700256), (0.10000000000000001, 0.30980393290519714,
0.30980393290519714), (0.20000000000000001, 0.26274511218070984,
0.26274511218070984), (0.29999999999999999, 0.3803921639919281,
0.3803921639919281), (0.40000000000000002, 0.54509806632995605,
0.54509806632995605), (0.5, 0.74901962280273438, 0.74901962280273438),
(0.59999999999999998, 0.59607845544815063, 0.59607845544815063),
(0.69999999999999996, 0.64313727617263794, 0.64313727617263794),
(0.80000000000000004, 0.64705884456634521, 0.64705884456634521),
(0.90000000000000002, 0.74117648601531982, 0.74117648601531982), (1.0,
0.63529413938522339, 0.63529413938522339)],
'green': [(0.0, 0.0039215688593685627, 0.0039215688593685627),
(0.10000000000000001, 0.24313725531101227, 0.24313725531101227),
(0.20000000000000001, 0.42745098471641541, 0.42745098471641541),
(0.29999999999999999, 0.68235296010971069, 0.68235296010971069),
(0.40000000000000002, 0.87843137979507446, 0.87843137979507446),
(0.5, 1.0, 1.0), (0.59999999999999998, 0.96078431606292725,
0.96078431606292725), (0.69999999999999996, 0.86666667461395264,
0.86666667461395264), (0.80000000000000004, 0.7607843279838562,
0.7607843279838562), (0.90000000000000002, 0.53333336114883423,
0.53333336114883423), (1.0, 0.30980393290519714,
0.30980393290519714)],
'red': [(0.0, 0.61960786581039429, 0.61960786581039429),
(0.10000000000000001, 0.83529412746429443, 0.83529412746429443),
(0.20000000000000001, 0.95686274766921997, 0.95686274766921997),
(0.29999999999999999, 0.99215686321258545, 0.99215686321258545),
(0.40000000000000002, 0.99607843160629272, 0.99607843160629272),
(0.5, 1.0, 1.0), (0.59999999999999998, 0.90196079015731812,
0.90196079015731812), (0.69999999999999996, 0.67058825492858887,
0.67058825492858887), (0.80000000000000004, 0.40000000596046448,
0.40000000596046448), (0.90000000000000002, 0.19607843458652496,
0.19607843458652496), (1.0, 0.36862745881080627,
0.36862745881080627)]}
_YlGn_data = {'blue': [(0.0, 0.89803922176361084,
0.89803922176361084), (0.125, 0.72549021244049072,
0.72549021244049072), (0.25, 0.63921570777893066,
0.63921570777893066), (0.375, 0.55686277151107788,
0.55686277151107788), (0.5, 0.47450980544090271, 0.47450980544090271),
(0.625, 0.364705890417099, 0.364705890417099), (0.75,
0.26274511218070984, 0.26274511218070984), (0.875,
0.21568627655506134, 0.21568627655506134), (1.0, 0.16078431904315948,
0.16078431904315948)],
'green': [(0.0, 1.0, 1.0), (0.125, 0.98823529481887817,
0.98823529481887817), (0.25, 0.94117647409439087,
0.94117647409439087), (0.375, 0.86666667461395264,
0.86666667461395264), (0.5, 0.7764706015586853,
0.7764706015586853), (0.625, 0.67058825492858887,
0.67058825492858887), (0.75, 0.51764708757400513,
0.51764708757400513), (0.875, 0.40784314274787903,
0.40784314274787903), (1.0, 0.27058824896812439,
0.27058824896812439)],
'red': [(0.0, 1.0, 1.0), (0.125, 0.9686274528503418,
0.9686274528503418), (0.25, 0.85098040103912354,
0.85098040103912354), (0.375, 0.67843139171600342,
0.67843139171600342), (0.5, 0.47058823704719543,
0.47058823704719543), (0.625, 0.25490197539329529,
0.25490197539329529), (0.75, 0.13725490868091583,
0.13725490868091583), (0.875, 0.0, 0.0), (1.0, 0.0, 0.0)]}
_YlGnBu_data = {'blue': [(0.0, 0.85098040103912354,
0.85098040103912354), (0.125, 0.69411766529083252,
0.69411766529083252), (0.25, 0.70588237047195435,
0.70588237047195435), (0.375, 0.73333334922790527,
0.73333334922790527), (0.5, 0.76862746477127075, 0.76862746477127075),
(0.625, 0.75294119119644165, 0.75294119119644165), (0.75,
0.65882354974746704, 0.65882354974746704), (0.875,
0.58039218187332153, 0.58039218187332153), (1.0, 0.34509804844856262,
0.34509804844856262)],
'green': [(0.0, 1.0, 1.0), (0.125, 0.97254902124404907,
0.97254902124404907), (0.25, 0.91372549533843994,
0.91372549533843994), (0.375, 0.80392158031463623,
0.80392158031463623), (0.5, 0.7137255072593689,
0.7137255072593689), (0.625, 0.56862747669219971,
0.56862747669219971), (0.75, 0.36862745881080627,
0.36862745881080627), (0.875, 0.20392157137393951,
0.20392157137393951), (1.0, 0.11372549086809158,
0.11372549086809158)],
'red': [(0.0, 1.0, 1.0), (0.125, 0.92941176891326904,
0.92941176891326904), (0.25, 0.78039216995239258,
0.78039216995239258), (0.375, 0.49803921580314636,
0.49803921580314636), (0.5, 0.25490197539329529,
0.25490197539329529), (0.625, 0.11372549086809158,
0.11372549086809158), (0.75, 0.13333334028720856,
0.13333334028720856), (0.875, 0.14509804546833038,
0.14509804546833038), (1.0, 0.031372550874948502,
0.031372550874948502)]}
_YlOrBr_data = {'blue': [(0.0, 0.89803922176361084,
0.89803922176361084), (0.125, 0.73725491762161255,
0.73725491762161255), (0.25, 0.56862747669219971,
0.56862747669219971), (0.375, 0.30980393290519714,
0.30980393290519714), (0.5, 0.16078431904315948, 0.16078431904315948),
(0.625, 0.078431375324726105, 0.078431375324726105), (0.75,
0.0078431377187371254, 0.0078431377187371254), (0.875,
0.015686275437474251, 0.015686275437474251), (1.0,
0.023529412224888802, 0.023529412224888802)],
'green': [(0.0, 1.0, 1.0), (0.125, 0.9686274528503418,
0.9686274528503418), (0.25, 0.89019608497619629,
0.89019608497619629), (0.375, 0.76862746477127075,
0.76862746477127075), (0.5, 0.60000002384185791,
0.60000002384185791), (0.625, 0.43921568989753723,
0.43921568989753723), (0.75, 0.29803922772407532,
0.29803922772407532), (0.875, 0.20392157137393951,
0.20392157137393951), (1.0, 0.14509804546833038,
0.14509804546833038)],
'red': [(0.0, 1.0, 1.0), (0.125, 1.0, 1.0), (0.25,
0.99607843160629272, 0.99607843160629272), (0.375,
0.99607843160629272, 0.99607843160629272), (0.5,
0.99607843160629272, 0.99607843160629272), (0.625,
0.92549020051956177, 0.92549020051956177), (0.75,
0.80000001192092896, 0.80000001192092896), (0.875,
0.60000002384185791, 0.60000002384185791), (1.0,
0.40000000596046448, 0.40000000596046448)]}
_YlOrRd_data = {'blue': [(0.0, 0.80000001192092896,
0.80000001192092896), (0.125, 0.62745100259780884,
0.62745100259780884), (0.25, 0.46274510025978088,
0.46274510025978088), (0.375, 0.29803922772407532,
0.29803922772407532), (0.5, 0.23529411852359772, 0.23529411852359772),
(0.625, 0.16470588743686676, 0.16470588743686676), (0.75,
0.10980392247438431, 0.10980392247438431), (0.875,
0.14901961386203766, 0.14901961386203766), (1.0, 0.14901961386203766,
0.14901961386203766)],
'green': [(0.0, 1.0, 1.0), (0.125, 0.92941176891326904,
0.92941176891326904), (0.25, 0.85098040103912354,
0.85098040103912354), (0.375, 0.69803923368453979,
0.69803923368453979), (0.5, 0.55294120311737061,
0.55294120311737061), (0.625, 0.30588236451148987,
0.30588236451148987), (0.75, 0.10196078568696976,
0.10196078568696976), (0.875, 0.0, 0.0), (1.0, 0.0, 0.0)],
'red': [(0.0, 1.0, 1.0), (0.125, 1.0, 1.0), (0.25,
0.99607843160629272, 0.99607843160629272), (0.375,
0.99607843160629272, 0.99607843160629272), (0.5,
0.99215686321258545, 0.99215686321258545), (0.625,
0.98823529481887817, 0.98823529481887817), (0.75,
0.89019608497619629, 0.89019608497619629), (0.875,
0.74117648601531982, 0.74117648601531982), (1.0,
0.50196081399917603, 0.50196081399917603)]}
# The next 7 palettes are from the Yorick scientific visualisation package,
# an evolution of the GIST package, both by David H. Munro.
# They are released under a BSD-like license (see LICENSE_YORICK in
# the license directory of the matplotlib source distribution).
#
# Most palette functions have been reduced to simple function descriptions
# by Reinier Heeres, since the rgb components were mostly straight lines.
# gist_earth_data and gist_ncar_data were simplified by a script and some
# manual effort.
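# Note on the function-valued channel descriptions mentioned above: a callable
# channel maps a normalized position x in [0, 1] directly to an intensity, and
# matplotlib clips the result to [0, 1] when the colormap is evaluated. As a
# minimal illustrative sketch (the helper name below is hypothetical, for
# explanation only, not part of this module):
#
#     def _eval_channel(channel, x):
#         """Evaluate a callable channel at x and clip to the valid range."""
#         return min(1.0, max(0.0, channel(x)))
#
#     # e.g. for the green channel of _gist_heat_data (lambda x: 2 * x - 1):
#     # _eval_channel(lambda x: 2 * x - 1, 0.25) == 0.0  (clipped from -0.5)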
_gist_earth_data = \
{'red': (
(0.0, 0.0, 0.0000),
(0.2824, 0.1882, 0.1882),
(0.4588, 0.2714, 0.2714),
(0.5490, 0.4719, 0.4719),
(0.6980, 0.7176, 0.7176),
(0.7882, 0.7553, 0.7553),
(1.0000, 0.9922, 0.9922),
), 'green': (
(0.0, 0.0, 0.0000),
(0.0275, 0.0000, 0.0000),
(0.1098, 0.1893, 0.1893),
(0.1647, 0.3035, 0.3035),
(0.2078, 0.3841, 0.3841),
(0.2824, 0.5020, 0.5020),
(0.5216, 0.6397, 0.6397),
(0.6980, 0.7171, 0.7171),
(0.7882, 0.6392, 0.6392),
(0.7922, 0.6413, 0.6413),
(0.8000, 0.6447, 0.6447),
(0.8078, 0.6481, 0.6481),
(0.8157, 0.6549, 0.6549),
(0.8667, 0.6991, 0.6991),
(0.8745, 0.7103, 0.7103),
(0.8824, 0.7216, 0.7216),
(0.8902, 0.7323, 0.7323),
(0.8980, 0.7430, 0.7430),
(0.9412, 0.8275, 0.8275),
(0.9569, 0.8635, 0.8635),
(0.9647, 0.8816, 0.8816),
(0.9961, 0.9733, 0.9733),
(1.0000, 0.9843, 0.9843),
), 'blue': (
(0.0, 0.0, 0.0000),
(0.0039, 0.1684, 0.1684),
(0.0078, 0.2212, 0.2212),
(0.0275, 0.4329, 0.4329),
(0.0314, 0.4549, 0.4549),
(0.2824, 0.5004, 0.5004),
(0.4667, 0.2748, 0.2748),
(0.5451, 0.3205, 0.3205),
(0.7843, 0.3961, 0.3961),
(0.8941, 0.6651, 0.6651),
(1.0000, 0.9843, 0.9843),
)}
_gist_gray_data = {
'red': gfunc[3],
'green': gfunc[3],
'blue': gfunc[3],
}
_gist_heat_data = {
'red': lambda x: 1.5 * x,
'green': lambda x: 2 * x - 1,
'blue': lambda x: 4 * x - 3,
}
_gist_ncar_data = \
{'red': (
(0.0, 0.0, 0.0000),
(0.3098, 0.0000, 0.0000),
(0.3725, 0.3993, 0.3993),
(0.4235, 0.5003, 0.5003),
(0.5333, 1.0000, 1.0000),
(0.7922, 1.0000, 1.0000),
(0.8471, 0.6218, 0.6218),
(0.8980, 0.9235, 0.9235),
(1.0000, 0.9961, 0.9961),
), 'green': (
(0.0, 0.0, 0.0000),
(0.0510, 0.3722, 0.3722),
(0.1059, 0.0000, 0.0000),
(0.1569, 0.7202, 0.7202),
(0.1608, 0.7537, 0.7537),
(0.1647, 0.7752, 0.7752),
(0.2157, 1.0000, 1.0000),
(0.2588, 0.9804, 0.9804),
(0.2706, 0.9804, 0.9804),
(0.3176, 1.0000, 1.0000),
(0.3686, 0.8081, 0.8081),
(0.4275, 1.0000, 1.0000),
(0.5216, 1.0000, 1.0000),
(0.6314, 0.7292, 0.7292),
(0.6863, 0.2796, 0.2796),
(0.7451, 0.0000, 0.0000),
(0.7922, 0.0000, 0.0000),
(0.8431, 0.1753, 0.1753),
(0.8980, 0.5000, 0.5000),
(1.0000, 0.9725, 0.9725),
), 'blue': (
(0.0, 0.5020, 0.5020),
(0.0510, 0.0222, 0.0222),
(0.1098, 1.0000, 1.0000),
(0.2039, 1.0000, 1.0000),
(0.2627, 0.6145, 0.6145),
(0.3216, 0.0000, 0.0000),
(0.4157, 0.0000, 0.0000),
(0.4745, 0.2342, 0.2342),
(0.5333, 0.0000, 0.0000),
(0.5804, 0.0000, 0.0000),
(0.6314, 0.0549, 0.0549),
(0.6902, 0.0000, 0.0000),
(0.7373, 0.0000, 0.0000),
(0.7922, 0.9738, 0.9738),
(0.8000, 1.0000, 1.0000),
(0.8431, 1.0000, 1.0000),
(0.8980, 0.9341, 0.9341),
(1.0000, 0.9961, 0.9961),
)}
_gist_rainbow_data = (
(0.000, (1.00, 0.00, 0.16)),
(0.030, (1.00, 0.00, 0.00)),
(0.215, (1.00, 1.00, 0.00)),
(0.400, (0.00, 1.00, 0.00)),
(0.586, (0.00, 1.00, 1.00)),
(0.770, (0.00, 0.00, 1.00)),
(0.954, (1.00, 0.00, 1.00)),
(1.000, (1.00, 0.00, 0.75))
)
_gist_stern_data = {
'red': (
(0.000, 0.000, 0.000), (0.0547, 1.000, 1.000),
(0.250, 0.027, 0.250), # (0.2500, 0.250, 0.250),
(1.000, 1.000, 1.000)),
'green': ((0, 0, 0), (1, 1, 1)),
'blue': (
(0.000, 0.000, 0.000), (0.500, 1.000, 1.000),
(0.735, 0.000, 0.000), (1.000, 1.000, 1.000))
}
_gist_yarg_data = {
'red': lambda x: 1 - x,
'green': lambda x: 1 - x,
'blue': lambda x: 1 - x,
}
# This bipolar color map was generated from CoolWarmFloat33.csv of
# "Diverging Color Maps for Scientific Visualization" by Kenneth Moreland.
# <http://www.kennethmoreland.com/color-maps/>
_coolwarm_data = {
'red': [
(0.0, 0.2298057, 0.2298057),
(0.03125, 0.26623388, 0.26623388),
(0.0625, 0.30386891, 0.30386891),
(0.09375, 0.342804478, 0.342804478),
(0.125, 0.38301334, 0.38301334),
(0.15625, 0.424369608, 0.424369608),
(0.1875, 0.46666708, 0.46666708),
(0.21875, 0.509635204, 0.509635204),
(0.25, 0.552953156, 0.552953156),
(0.28125, 0.596262162, 0.596262162),
(0.3125, 0.639176211, 0.639176211),
(0.34375, 0.681291281, 0.681291281),
(0.375, 0.722193294, 0.722193294),
(0.40625, 0.761464949, 0.761464949),
(0.4375, 0.798691636, 0.798691636),
(0.46875, 0.833466556, 0.833466556),
(0.5, 0.865395197, 0.865395197),
(0.53125, 0.897787179, 0.897787179),
(0.5625, 0.924127593, 0.924127593),
(0.59375, 0.944468518, 0.944468518),
(0.625, 0.958852946, 0.958852946),
(0.65625, 0.96732803, 0.96732803),
(0.6875, 0.969954137, 0.969954137),
(0.71875, 0.966811177, 0.966811177),
(0.75, 0.958003065, 0.958003065),
(0.78125, 0.943660866, 0.943660866),
(0.8125, 0.923944917, 0.923944917),
(0.84375, 0.89904617, 0.89904617),
(0.875, 0.869186849, 0.869186849),
(0.90625, 0.834620542, 0.834620542),
(0.9375, 0.795631745, 0.795631745),
(0.96875, 0.752534934, 0.752534934),
(1.0, 0.705673158, 0.705673158)],
'green': [
(0.0, 0.298717966, 0.298717966),
(0.03125, 0.353094838, 0.353094838),
(0.0625, 0.406535296, 0.406535296),
(0.09375, 0.458757618, 0.458757618),
(0.125, 0.50941904, 0.50941904),
(0.15625, 0.558148092, 0.558148092),
(0.1875, 0.604562568, 0.604562568),
(0.21875, 0.648280772, 0.648280772),
(0.25, 0.688929332, 0.688929332),
(0.28125, 0.726149107, 0.726149107),
(0.3125, 0.759599947, 0.759599947),
(0.34375, 0.788964712, 0.788964712),
(0.375, 0.813952739, 0.813952739),
(0.40625, 0.834302879, 0.834302879),
(0.4375, 0.849786142, 0.849786142),
(0.46875, 0.860207984, 0.860207984),
(0.5, 0.86541021, 0.86541021),
(0.53125, 0.848937047, 0.848937047),
(0.5625, 0.827384882, 0.827384882),
(0.59375, 0.800927443, 0.800927443),
(0.625, 0.769767752, 0.769767752),
(0.65625, 0.734132809, 0.734132809),
(0.6875, 0.694266682, 0.694266682),
(0.71875, 0.650421156, 0.650421156),
(0.75, 0.602842431, 0.602842431),
(0.78125, 0.551750968, 0.551750968),
(0.8125, 0.49730856, 0.49730856),
(0.84375, 0.439559467, 0.439559467),
(0.875, 0.378313092, 0.378313092),
(0.90625, 0.312874446, 0.312874446),
(0.9375, 0.24128379, 0.24128379),
(0.96875, 0.157246067, 0.157246067),
(1.0, 0.01555616, 0.01555616)],
'blue': [
(0.0, 0.753683153, 0.753683153),
(0.03125, 0.801466763, 0.801466763),
(0.0625, 0.84495867, 0.84495867),
(0.09375, 0.883725899, 0.883725899),
(0.125, 0.917387822, 0.917387822),
(0.15625, 0.945619588, 0.945619588),
(0.1875, 0.968154911, 0.968154911),
(0.21875, 0.98478814, 0.98478814),
(0.25, 0.995375608, 0.995375608),
(0.28125, 0.999836203, 0.999836203),
(0.3125, 0.998151185, 0.998151185),
(0.34375, 0.990363227, 0.990363227),
(0.375, 0.976574709, 0.976574709),
(0.40625, 0.956945269, 0.956945269),
(0.4375, 0.931688648, 0.931688648),
(0.46875, 0.901068838, 0.901068838),
(0.5, 0.865395561, 0.865395561),
(0.53125, 0.820880546, 0.820880546),
(0.5625, 0.774508472, 0.774508472),
(0.59375, 0.726736146, 0.726736146),
(0.625, 0.678007945, 0.678007945),
(0.65625, 0.628751763, 0.628751763),
(0.6875, 0.579375448, 0.579375448),
(0.71875, 0.530263762, 0.530263762),
(0.75, 0.481775914, 0.481775914),
(0.78125, 0.434243684, 0.434243684),
(0.8125, 0.387970225, 0.387970225),
(0.84375, 0.343229596, 0.343229596),
(0.875, 0.300267182, 0.300267182),
(0.90625, 0.259301199, 0.259301199),
(0.9375, 0.220525627, 0.220525627),
(0.96875, 0.184115123, 0.184115123),
(1.0, 0.150232812, 0.150232812)]
}
# Implementation of Carey Rappaport's CMRmap.
# See `A Color Map for Effective Black-and-White Rendering of Color-Scale
# Images' by Carey Rappaport
# http://www.mathworks.com/matlabcentral/fileexchange/2662-cmrmap-m
_CMRmap_data = {'red': ((0.000, 0.00, 0.00),
(0.125, 0.15, 0.15),
(0.250, 0.30, 0.30),
(0.375, 0.60, 0.60),
(0.500, 1.00, 1.00),
(0.625, 0.90, 0.90),
(0.750, 0.90, 0.90),
(0.875, 0.90, 0.90),
(1.000, 1.00, 1.00)),
'green': ((0.000, 0.00, 0.00),
(0.125, 0.15, 0.15),
(0.250, 0.15, 0.15),
(0.375, 0.20, 0.20),
(0.500, 0.25, 0.25),
(0.625, 0.50, 0.50),
(0.750, 0.75, 0.75),
(0.875, 0.90, 0.90),
(1.000, 1.00, 1.00)),
'blue': ((0.000, 0.00, 0.00),
(0.125, 0.50, 0.50),
(0.250, 0.75, 0.75),
(0.375, 0.50, 0.50),
(0.500, 0.15, 0.15),
(0.625, 0.00, 0.00),
(0.750, 0.10, 0.10),
(0.875, 0.50, 0.50),
(1.000, 1.00, 1.00))}
# An MIT licensed, colorblind-friendly heatmap from Wistia:
# https://github.com/wistia/heatmap-palette
# http://wistia.com/blog/heatmaps-for-colorblindness
#
# >>> import matplotlib.colors as c
# >>> colors = ["#e4ff7a", "#ffe81a", "#ffbd00", "#ffa000", "#fc7f00"]
# >>> cm = c.LinearSegmentedColormap.from_list('wistia', colors)
# >>> _wistia_data = cm._segmentdata
# >>> del _wistia_data['alpha']
#
_wistia_data = {
'red': [(0.0, 0.8941176470588236, 0.8941176470588236),
(0.25, 1.0, 1.0),
(0.5, 1.0, 1.0),
(0.75, 1.0, 1.0),
(1.0, 0.9882352941176471, 0.9882352941176471)],
'green': [(0.0, 1.0, 1.0),
(0.25, 0.9098039215686274, 0.9098039215686274),
(0.5, 0.7411764705882353, 0.7411764705882353),
(0.75, 0.6274509803921569, 0.6274509803921569),
(1.0, 0.4980392156862745, 0.4980392156862745)],
'blue': [(0.0, 0.47843137254901963, 0.47843137254901963),
(0.25, 0.10196078431372549, 0.10196078431372549),
(0.5, 0.0, 0.0),
(0.75, 0.0, 0.0),
(1.0, 0.0, 0.0)],
}
datad = {
'afmhot': _afmhot_data,
'autumn': _autumn_data,
'bone': _bone_data,
'binary': _binary_data,
'bwr': _bwr_data,
'brg': _brg_data,
'CMRmap': _CMRmap_data,
'cool': _cool_data,
'copper': _copper_data,
'cubehelix': _cubehelix_data,
'flag': _flag_data,
'gnuplot': _gnuplot_data,
'gnuplot2': _gnuplot2_data,
'gray': _gray_data,
'hot': _hot_data,
'hsv': _hsv_data,
'jet': _jet_data,
'ocean': _ocean_data,
'pink': _pink_data,
'prism': _prism_data,
'rainbow': _rainbow_data,
'seismic': _seismic_data,
'spring': _spring_data,
'summer': _summer_data,
'terrain': _terrain_data,
'winter': _winter_data,
'nipy_spectral': _nipy_spectral_data,
'spectral': _nipy_spectral_data, # alias for backward compatibility
}
datad['Accent'] = _Accent_data
datad['Blues'] = _Blues_data
datad['BrBG'] = _BrBG_data
datad['BuGn'] = _BuGn_data
datad['BuPu'] = _BuPu_data
datad['Dark2'] = _Dark2_data
datad['GnBu'] = _GnBu_data
datad['Greens'] = _Greens_data
datad['Greys'] = _Greys_data
datad['Oranges'] = _Oranges_data
datad['OrRd'] = _OrRd_data
datad['Paired'] = _Paired_data
datad['Pastel1'] = _Pastel1_data
datad['Pastel2'] = _Pastel2_data
datad['PiYG'] = _PiYG_data
datad['PRGn'] = _PRGn_data
datad['PuBu'] = _PuBu_data
datad['PuBuGn'] = _PuBuGn_data
datad['PuOr'] = _PuOr_data
datad['PuRd'] = _PuRd_data
datad['Purples'] = _Purples_data
datad['RdBu'] = _RdBu_data
datad['RdGy'] = _RdGy_data
datad['RdPu'] = _RdPu_data
datad['RdYlBu'] = _RdYlBu_data
datad['RdYlGn'] = _RdYlGn_data
datad['Reds'] = _Reds_data
datad['Set1'] = _Set1_data
datad['Set2'] = _Set2_data
datad['Set3'] = _Set3_data
datad['Spectral'] = _Spectral_data
datad['YlGn'] = _YlGn_data
datad['YlGnBu'] = _YlGnBu_data
datad['YlOrBr'] = _YlOrBr_data
datad['YlOrRd'] = _YlOrRd_data
datad['gist_earth'] = _gist_earth_data
datad['gist_gray'] = _gist_gray_data
datad['gist_heat'] = _gist_heat_data
datad['gist_ncar'] = _gist_ncar_data
datad['gist_rainbow'] = _gist_rainbow_data
datad['gist_stern'] = _gist_stern_data
datad['gist_yarg'] = _gist_yarg_data
datad['coolwarm'] = _coolwarm_data
datad['Wistia'] = _wistia_data
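The segment-data tables above follow matplotlib's `LinearSegmentedColormap` convention: each channel is a sequence of `(x, y_left, y_right)` anchors that are linearly interpolated over [0, 1]. A minimal sketch of building and sampling such a map (the two-anchor dictionary below is illustrative, not one of the tables above):

```python
import numpy as np
from matplotlib.colors import LinearSegmentedColormap

# Each tuple is (x, y_left, y_right); with y_left == y_right the channel is
# simply linearly interpolated between anchors, so this ramps black -> red.
_demo_data = {
    'red':   [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)],
    'green': [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
    'blue':  [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
}
demo_cmap = LinearSegmentedColormap('demo', _demo_data)

# Sampling at the midpoint gives roughly half-intensity red (RGBA tuple).
r, g, b, a = demo_cmap(0.5)
print(r, g, b, a)
```

The dictionaries in `datad` can be passed to `LinearSegmentedColormap` in exactly the same way; the split `y_left`/`y_right` values exist to allow discontinuities (as in `_flag_data`-style maps).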
| mit |
abhishekgahlot/scikit-learn | examples/tree/plot_tree_regression_multioutput.py | 43 | 1791 | """
===================================================================
Multi-output Decision Tree Regression
===================================================================
An example illustrating multi-output regression with a decision tree.
The :ref:`decision tree <tree>`
is used to predict simultaneously the noisy x and y observations of a circle
given a single underlying feature. As a result, it learns local linear
regressions approximating the circle.
We can see that if the maximum depth of the tree (controlled by the
`max_depth` parameter) is set too high, the decision trees learn overly fine
details of the training data and fit the noise, i.e. they overfit.
"""
print(__doc__)
import numpy as np
import matplotlib.pyplot as plt
from sklearn.tree import DecisionTreeRegressor
# Create a random dataset
rng = np.random.RandomState(1)
X = np.sort(200 * rng.rand(100, 1) - 100, axis=0)
y = np.array([np.pi * np.sin(X).ravel(), np.pi * np.cos(X).ravel()]).T
y[::5, :] += (0.5 - rng.rand(20, 2))
# Fit regression model
regr_1 = DecisionTreeRegressor(max_depth=2)
regr_2 = DecisionTreeRegressor(max_depth=5)
regr_3 = DecisionTreeRegressor(max_depth=8)
regr_1.fit(X, y)
regr_2.fit(X, y)
regr_3.fit(X, y)
# Predict
X_test = np.arange(-100.0, 100.0, 0.01)[:, np.newaxis]
y_1 = regr_1.predict(X_test)
y_2 = regr_2.predict(X_test)
y_3 = regr_3.predict(X_test)
# Plot the results
plt.figure()
plt.scatter(y[:, 0], y[:, 1], c="k", label="data")
plt.scatter(y_1[:, 0], y_1[:, 1], c="g", label="max_depth=2")
plt.scatter(y_2[:, 0], y_2[:, 1], c="r", label="max_depth=5")
plt.scatter(y_3[:, 0], y_3[:, 1], c="b", label="max_depth=8")
plt.xlim([-6, 6])
plt.ylim([-6, 6])
plt.xlabel("data")
plt.ylabel("target")
plt.title("Multi-output Decision Tree Regression")
plt.legend()
plt.show()
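The overfitting described in the docstring can be quantified rather than only eyeballed. A sketch (not part of the original example) that compares the candidate depths via cross-validation on the same synthetic data; the shuffled `KFold` is an assumption made here to avoid extrapolation artifacts from the sorted `X`:

```python
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.tree import DecisionTreeRegressor

# Same synthetic noisy-circle data as in the example above.
rng = np.random.RandomState(1)
X = np.sort(200 * rng.rand(100, 1) - 100, axis=0)
y = np.array([np.pi * np.sin(X).ravel(), np.pi * np.cos(X).ravel()]).T
y[::5, :] += (0.5 - rng.rand(20, 2))

# Shuffled folds so each fold interpolates instead of extrapolating.
cv = KFold(n_splits=5, shuffle=True, random_state=0)
cv_scores = {
    depth: cross_val_score(
        DecisionTreeRegressor(max_depth=depth, random_state=0), X, y, cv=cv
    ).mean()
    for depth in (2, 5, 8)
}
for depth, score in sorted(cv_scores.items()):
    print("max_depth=%d  mean CV R^2: %.3f" % (depth, score))
```

The depth with the best mean cross-validated R^2 is the bias/variance sweet spot, rather than the deepest tree.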
| bsd-3-clause |
toastedcornflakes/scikit-learn | examples/ensemble/plot_adaboost_twoclass.py | 347 | 3268 | """
==================
Two-class AdaBoost
==================
This example fits an AdaBoosted decision stump on a non-linearly separable
classification dataset composed of two "Gaussian quantiles" clusters
(see :func:`sklearn.datasets.make_gaussian_quantiles`) and plots the decision
boundary and decision scores. The distributions of decision scores are shown
separately for samples of class A and B. The predicted class label for each
sample is determined by the sign of the decision score. Samples with decision
scores greater than zero are classified as B, and are otherwise classified
as A. The magnitude of a decision score reflects the degree of confidence in
the predicted class label. Additionally, a new dataset could be constructed
containing a desired purity of class B, for example, by only selecting samples
with a decision score above some value.
"""
print(__doc__)
# Author: Noel Dawe <noel.dawe@gmail.com>
#
# License: BSD 3 clause
import numpy as np
import matplotlib.pyplot as plt
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.datasets import make_gaussian_quantiles
# Construct dataset
X1, y1 = make_gaussian_quantiles(cov=2.,
n_samples=200, n_features=2,
n_classes=2, random_state=1)
X2, y2 = make_gaussian_quantiles(mean=(3, 3), cov=1.5,
n_samples=300, n_features=2,
n_classes=2, random_state=1)
X = np.concatenate((X1, X2))
y = np.concatenate((y1, - y2 + 1))
# Create and fit an AdaBoosted decision tree
bdt = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
algorithm="SAMME",
n_estimators=200)
bdt.fit(X, y)
plot_colors = "br"
plot_step = 0.02
class_names = "AB"
plt.figure(figsize=(10, 5))
# Plot the decision boundaries
plt.subplot(121)
x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, plot_step),
np.arange(y_min, y_max, plot_step))
Z = bdt.predict(np.c_[xx.ravel(), yy.ravel()])
Z = Z.reshape(xx.shape)
cs = plt.contourf(xx, yy, Z, cmap=plt.cm.Paired)
plt.axis("tight")
# Plot the training points
for i, n, c in zip(range(2), class_names, plot_colors):
idx = np.where(y == i)
plt.scatter(X[idx, 0], X[idx, 1],
c=c, cmap=plt.cm.Paired,
label="Class %s" % n)
plt.xlim(x_min, x_max)
plt.ylim(y_min, y_max)
plt.legend(loc='upper right')
plt.xlabel('x')
plt.ylabel('y')
plt.title('Decision Boundary')
# Plot the two-class decision scores
twoclass_output = bdt.decision_function(X)
plot_range = (twoclass_output.min(), twoclass_output.max())
plt.subplot(122)
for i, n, c in zip(range(2), class_names, plot_colors):
plt.hist(twoclass_output[y == i],
bins=10,
range=plot_range,
facecolor=c,
label='Class %s' % n,
alpha=.5)
x1, x2, y1, y2 = plt.axis()
plt.axis((x1, x2, y1, y2 * 1.2))
plt.legend(loc='upper right')
plt.ylabel('Samples')
plt.xlabel('Score')
plt.title('Decision Scores')
plt.tight_layout()
plt.subplots_adjust(wspace=0.35)
plt.show()
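The docstring's remark about constructing a purer class-B subset by thresholding the decision score can be sketched as follows. The 0.2 threshold is an illustrative choice, not part of the original example, and the `algorithm` keyword is omitted so the sketch runs across scikit-learn versions:

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.datasets import make_gaussian_quantiles

# Same two-cluster "Gaussian quantiles" dataset as the example above.
X1, y1 = make_gaussian_quantiles(cov=2., n_samples=200, n_features=2,
                                 n_classes=2, random_state=1)
X2, y2 = make_gaussian_quantiles(mean=(3, 3), cov=1.5, n_samples=300,
                                 n_features=2, n_classes=2, random_state=1)
X = np.concatenate((X1, X2))
y = np.concatenate((y1, -y2 + 1))

bdt = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                         n_estimators=200)
bdt.fit(X, y)

# Raising the threshold above zero keeps fewer samples but (typically)
# increases the fraction of true class-B (y == 1) samples among them.
scores = bdt.decision_function(X)
for t in (0.0, 0.2):
    sel = y[scores > t]
    purity = (sel == 1).mean() if len(sel) else float("nan")
    print("threshold=%.1f  kept=%d  class-B purity=%.3f" % (t, len(sel), purity))
```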
| bsd-3-clause |
tardis-sn/tardis | tardis/model/base.py | 1 | 27746 | import os
import logging
import numpy as np
import pandas as pd
from astropy import units as u
from tardis import constants
from tardis.util.base import quantity_linspace
from tardis.io.parsers.csvy import load_csvy
from tardis.io.model_reader import (
read_density_file,
read_abundances_file,
read_uniform_abundances,
parse_csv_abundances,
)
from tardis.io.config_validator import validate_dict
from tardis.io.config_reader import Configuration
from tardis.io.util import HDFWriterMixin
from tardis.io.decay import IsotopeAbundances
from tardis.model.density import HomologousDensity
from pyne import nucname
logger = logging.getLogger(__name__)
class Radial1DModel(HDFWriterMixin):
"""
An object that holds information about the individual shells.
Parameters
----------
velocity : np.ndarray
An array with n+1 (for n shells) velocities "cut" to the provided
boundaries
.. note:: To access the entire, "uncut", velocity array, use `raw_velocity`
homologous_density : HomologousDensity
abundance : pd.DataFrame
time_explosion : astropy.units.Quantity
Time since explosion
t_inner : astropy.units.Quantity
luminosity_requested : astropy.units.quantity.Quantity
t_radiative : astropy.units.Quantity
Radiative temperature for the shells
dilution_factor : np.ndarray
If None, the dilution_factor will be initialized with the geometric
dilution factor.
v_boundary_inner : astropy.units.Quantity
v_boundary_outer : astropy.units.Quantity
raw_velocity : np.ndarray
The complete array of the velocities, without being cut by
`v_boundary_inner` and `v_boundary_outer`
electron_densities : astropy.units.quantity.Quantity
Attributes
----------
w : numpy.ndarray
Shortcut for `dilution_factor`
t_rad : astropy.units.quantity.Quantity
Shortcut for `t_radiative`
radius : astropy.units.quantity.Quantity
r_inner : astropy.units.quantity.Quantity
r_outer : astropy.units.quantity.Quantity
r_middle : astropy.units.quantity.Quantity
v_inner : astropy.units.quantity.Quantity
v_outer : astropy.units.quantity.Quantity
v_middle : astropy.units.quantity.Quantity
density : astropy.units.quantity.Quantity
volume : astropy.units.quantity.Quantity
no_of_shells : int
The number of shells as formed by `v_boundary_inner` and
`v_boundary_outer`
no_of_raw_shells : int
"""
hdf_properties = [
"t_inner",
"w",
"t_radiative",
"v_inner",
"v_outer",
"homologous_density",
"r_inner",
]
hdf_name = "model"
def __init__(
self,
velocity,
homologous_density,
abundance,
isotope_abundance,
time_explosion,
t_inner,
luminosity_requested=None,
t_radiative=None,
dilution_factor=None,
v_boundary_inner=None,
v_boundary_outer=None,
electron_densities=None,
):
self._v_boundary_inner = None
self._v_boundary_outer = None
self._velocity = None
self.raw_velocity = velocity
self.v_boundary_inner = v_boundary_inner
self.v_boundary_outer = v_boundary_outer
self.homologous_density = homologous_density
self._abundance = abundance
self.time_explosion = time_explosion
self._electron_densities = electron_densities
self.raw_abundance = self._abundance
self.raw_isotope_abundance = isotope_abundance
if t_inner is None:
if luminosity_requested is not None:
self.t_inner = (
(
luminosity_requested
/ (
4
* np.pi
* self.r_inner[0] ** 2
* constants.sigma_sb
)
)
** 0.25
).to("K")
else:
raise ValueError(
"Both t_inner and luminosity_requested cannot " "be None."
)
else:
self.t_inner = t_inner
if t_radiative is None:
lambda_wien_inner = constants.b_wien / self.t_inner
self._t_radiative = constants.b_wien / (
lambda_wien_inner
* (1 + (self.v_middle - self.v_boundary_inner) / constants.c)
)
else:
# self._t_radiative = t_radiative[self.v_boundary_inner_index + 1:self.v_boundary_outer_index]
self._t_radiative = t_radiative
if dilution_factor is None:
self._dilution_factor = 0.5 * (
1
- np.sqrt(
1 - (self.r_inner[0] ** 2 / self.r_middle ** 2).to(1).value
)
)
else:
# self.dilution_factor = dilution_factor[self.v_boundary_inner_index + 1:self.v_boundary_outer_index]
self._dilution_factor = dilution_factor
@property
def w(self):
return self.dilution_factor
@w.setter
def w(self, value):
self.dilution_factor = value
@property
def t_rad(self):
return self.t_radiative
@t_rad.setter
def t_rad(self, value):
self.t_radiative = value
@property
def dilution_factor(self):
if len(self._dilution_factor) == self.no_of_shells:
return self._dilution_factor
# if self.v_boundary_inner in self.raw_velocity:
# v_inner_ind = np.argwhere(self.raw_velocity == self.v_boundary_inner)[0][0]
# else:
# v_inner_ind = np.searchsorted(self.raw_velocity, self.v_boundary_inner) - 1
# if self.v_boundary_outer in self.raw_velocity:
# v_outer_ind = np.argwhere(self.raw_velocity == self.v_boundary_outer)[0][0]
# else:
# v_outer_ind = np.searchsorted(self.raw_velocity, self.v_boundary_outer)
return self._dilution_factor[
self.v_boundary_inner_index + 1 : self.v_boundary_outer_index + 1
]
@dilution_factor.setter
def dilution_factor(self, value):
if len(value) == len(self._dilution_factor):
self._dilution_factor = value
elif len(value) == self.no_of_shells:
# if self.v_boundary_inner in self.raw_velocity:
# v_inner_ind = np.argwhere(self.raw_velocity == self.v_boundary_inner)[0][0]
# else:
# v_inner_ind = np.searchsorted(self.raw_velocity, self.v_boundary_inner) - 1
# if self.v_boundary_outer in self.raw_velocity:
# v_outer_ind = np.argwhere(self.raw_velocity == self.v_boundary_outer)[0][0]
# else:
# v_outer_ind = np.searchsorted(self.raw_velocity, self.v_boundary_outer)
# assert v_outer_ind - v_inner_ind == self.no_of_shells, "trad shape different from number of shells"
self._dilution_factor[
self.v_boundary_inner_index
+ 1 : self.v_boundary_outer_index
+ 1
] = value
else:
raise ValueError(
"Trying to set dilution_factor for unmatching number"
"of shells."
)
@property
def t_radiative(self):
if len(self._t_radiative) == self.no_of_shells:
return self._t_radiative
# if self.v_boundary_inner in self.raw_velocity:
# v_inner_ind = np.argwhere(self.raw_velocity == self.v_boundary_inner)[0][0]
# else:
# v_inner_ind = np.searchsorted(self.raw_velocity, self.v_boundary_inner) - 1
# if self.v_boundary_outer in self.raw_velocity:
# v_outer_ind = np.argwhere(self.raw_velocity == self.v_boundary_outer)[0][0]
# else:
# v_outer_ind = np.searchsorted(self.raw_velocity, self.v_boundary_outer)
return self._t_radiative[
self.v_boundary_inner_index + 1 : self.v_boundary_outer_index + 1
]
@t_radiative.setter
def t_radiative(self, value):
if len(value) == len(self._t_radiative):
self._t_radiative = value
elif len(value) == self.no_of_shells:
# if self.v_boundary_inner in self.raw_velocity:
# v_inner_ind = np.argwhere(self.raw_velocity == self.v_boundary_inner)[0][0]
# else:
# v_inner_ind = np.searchsorted(self.raw_velocity, self.v_boundary_inner) - 1
# if self.v_boundary_outer in self.raw_velocity:
# v_outer_ind = np.argwhere(self.raw_velocity == self.v_boundary_outer)[0][0]
# else:
# v_outer_ind = np.searchsorted(self.raw_velocity, self.v_boundary_outer)
# assert v_outer_ind - v_inner_ind == self.no_of_shells, "trad shape different from number of shells"
self._t_radiative[
self.v_boundary_inner_index
+ 1 : self.v_boundary_outer_index
+ 1
] = value
else:
raise ValueError(
"Trying to set t_radiative for unmatching number" "of shells."
)
@property
def radius(self):
return self.time_explosion * self.velocity
@property
def r_inner(self):
return self.time_explosion * self.v_inner
@property
def r_outer(self):
return self.time_explosion * self.v_outer
@property
def r_middle(self):
return 0.5 * self.r_inner + 0.5 * self.r_outer
@property
def velocity(self):
if self._velocity is None:
self._velocity = self.raw_velocity[
self.v_boundary_inner_index : self.v_boundary_outer_index + 1
]
self._velocity[0] = self.v_boundary_inner
self._velocity[-1] = self.v_boundary_outer
return self._velocity
@property
def v_inner(self):
return self.velocity[:-1]
@property
def v_outer(self):
return self.velocity[1:]
@property
def v_middle(self):
return 0.5 * self.v_inner + 0.5 * self.v_outer
@property
def density(self):
density = (
self.homologous_density.calculate_density_at_time_of_simulation(
self.time_explosion
)
)
return density[
self.v_boundary_inner_index + 1 : self.v_boundary_outer_index + 1
]
@property
def abundance(self):
if not self.raw_isotope_abundance.empty:
self._abundance = self.raw_isotope_abundance.decay(
self.time_explosion
).merge(self.raw_abundance)
abundance = self._abundance.loc[
:, self.v_boundary_inner_index : self.v_boundary_outer_index - 1
]
abundance.columns = range(len(abundance.columns))
return abundance
@property
def volume(self):
return ((4.0 / 3) * np.pi * (self.r_outer ** 3 - self.r_inner ** 3)).cgs
@property
def no_of_shells(self):
return len(self.velocity) - 1
@property
def no_of_raw_shells(self):
return len(self.raw_velocity) - 1
@property
def v_boundary_inner(self):
if self._v_boundary_inner is None:
return self.raw_velocity[0]
if self._v_boundary_inner < 0 * u.km / u.s:
return self.raw_velocity[0]
return self._v_boundary_inner
@v_boundary_inner.setter
def v_boundary_inner(self, value):
if value is not None:
if value > 0 * u.km / u.s:
value = u.Quantity(value, self.v_boundary_inner.unit)
if value > self.v_boundary_outer:
raise ValueError(
"v_boundary_inner must not be higher than "
"v_boundary_outer."
)
if value > self.raw_velocity[-1]:
raise ValueError(
"v_boundary_inner is outside of " "the model range."
)
if value < self.raw_velocity[0]:
raise ValueError(
"v_boundary_inner is lower than the lowest shell in the model."
)
self._v_boundary_inner = value
# Invalidate the cached cut-down velocity array
self._velocity = None
@property
def v_boundary_outer(self):
if self._v_boundary_outer is None:
return self.raw_velocity[-1]
if self._v_boundary_outer < 0 * u.km / u.s:
return self.raw_velocity[-1]
return self._v_boundary_outer
@v_boundary_outer.setter
def v_boundary_outer(self, value):
if value is not None:
if value > 0 * u.km / u.s:
value = u.Quantity(value, self.v_boundary_outer.unit)
if value < self.v_boundary_inner:
raise ValueError(
"v_boundary_outer must not be smaller than "
"v_boundary_inner."
)
if value < self.raw_velocity[0]:
raise ValueError(
"v_boundary_outer is outside of " "the model range."
)
if value > self.raw_velocity[-1]:
raise ValueError(
"v_boundary_outer is larger than the largest shell in the model."
)
self._v_boundary_outer = value
# Invalidate the cached cut-down velocity array
self._velocity = None
@property
def v_boundary_inner_index(self):
if self.v_boundary_inner in self.raw_velocity:
v_inner_ind = np.argwhere(
self.raw_velocity == self.v_boundary_inner
)[0][0]
else:
v_inner_ind = (
np.searchsorted(self.raw_velocity, self.v_boundary_inner) - 1
)
return v_inner_ind
@property
def v_boundary_outer_index(self):
if self.v_boundary_outer in self.raw_velocity:
v_outer_ind = np.argwhere(
self.raw_velocity == self.v_boundary_outer
)[0][0]
else:
v_outer_ind = np.searchsorted(
self.raw_velocity, self.v_boundary_outer
)
return v_outer_ind
# @property
# def v_boundary_inner_index(self):
# if self.v_boundary_inner <= self.raw_velocity[0]:
# return 0
# else:
# idx = max(0,
# self.raw_velocity.searchsorted(self.v_boundary_inner) - 1)
# # check for zero volume of designated first cell
# if np.isclose(self.v_boundary_inner, self.raw_velocity[idx + 1],
# atol=1e-8 * u.km / u.s) and (self.v_boundary_inner <=
# self.raw_velocity[idx + 1]):
# idx += 1
# return idx
#
# @property
# def v_boundary_outer_index(self):
# if self.v_boundary_outer >= self.raw_velocity[-1]:
# return None
# return self.raw_velocity.searchsorted(self.v_boundary_outer) + 1
@classmethod
def from_config(cls, config):
"""
Create a new Radial1DModel instance from a Configuration object.
Parameters
----------
config : tardis.io.config_reader.Configuration
Returns
-------
Radial1DModel
"""
time_explosion = config.supernova.time_explosion.cgs
structure = config.model.structure
electron_densities = None
temperature = None
if structure.type == "specific":
velocity = quantity_linspace(
structure.velocity.start,
structure.velocity.stop,
structure.velocity.num + 1,
).cgs
homologous_density = HomologousDensity.from_config(config)
elif structure.type == "file":
if os.path.isabs(structure.filename):
structure_fname = structure.filename
else:
structure_fname = os.path.join(
config.config_dirname, structure.filename
)
(
time_0,
velocity,
density_0,
electron_densities,
temperature,
) = read_density_file(structure_fname, structure.filetype)
density_0 = density_0.insert(0, 0)
homologous_density = HomologousDensity(density_0, time_0)
else:
raise NotImplementedError
# Note: This is the number of shells *without* taking in mind the
# v boundaries.
no_of_shells = len(velocity) - 1
if temperature is not None:
t_radiative = temperature
elif config.plasma.initial_t_rad > 0 * u.K:
t_radiative = (
np.ones(no_of_shells + 1) * config.plasma.initial_t_rad
)
else:
t_radiative = None
if config.plasma.initial_t_inner < 0.0 * u.K:
luminosity_requested = config.supernova.luminosity_requested
t_inner = None
else:
luminosity_requested = None
t_inner = config.plasma.initial_t_inner
abundances_section = config.model.abundances
isotope_abundance = pd.DataFrame()
if abundances_section.type == "uniform":
abundance, isotope_abundance = read_uniform_abundances(
abundances_section, no_of_shells
)
elif abundances_section.type == "file":
if os.path.isabs(abundances_section.filename):
abundances_fname = abundances_section.filename
else:
abundances_fname = os.path.join(
config.config_dirname, abundances_section.filename
)
index, abundance, isotope_abundance = read_abundances_file(
abundances_fname, abundances_section.filetype
)
abundance = abundance.replace(np.nan, 0.0)
abundance = abundance[abundance.sum(axis=1) > 0]
norm_factor = abundance.sum(axis=0) + isotope_abundance.sum(axis=0)
if np.any(np.abs(norm_factor - 1) > 1e-12):
logger.warning(
"Abundances have not been normalized to 1." " - normalizing"
)
abundance /= norm_factor
isotope_abundance /= norm_factor
isotope_abundance = IsotopeAbundances(isotope_abundance)
return cls(
velocity=velocity,
homologous_density=homologous_density,
abundance=abundance,
isotope_abundance=isotope_abundance,
time_explosion=time_explosion,
t_radiative=t_radiative,
t_inner=t_inner,
luminosity_requested=luminosity_requested,
dilution_factor=None,
v_boundary_inner=structure.get("v_inner_boundary", None),
v_boundary_outer=structure.get("v_outer_boundary", None),
electron_densities=electron_densities,
)
@classmethod
def from_csvy(cls, config):
"""
Create a new Radial1DModel instance from a Configuration object that
points to a CSVY model file.
Parameters
----------
config : tardis.io.config_reader.Configuration
Returns
-------
Radial1DModel
"""
CSVY_SUPPORTED_COLUMNS = {
"velocity",
"density",
"t_rad",
"dilution_factor",
}
if os.path.isabs(config.csvy_model):
csvy_model_fname = config.csvy_model
else:
csvy_model_fname = os.path.join(
config.config_dirname, config.csvy_model
)
csvy_model_config, csvy_model_data = load_csvy(csvy_model_fname)
base_dir = os.path.abspath(os.path.dirname(__file__))
schema_dir = os.path.join(base_dir, "..", "io", "schemas")
csvy_schema_file = os.path.join(schema_dir, "csvy_model.yml")
csvy_model_config = Configuration(
validate_dict(csvy_model_config, schemapath=csvy_schema_file)
)
if hasattr(csvy_model_data, "columns"):
abund_names = set(
[
name
for name in csvy_model_data.columns
if nucname.iselement(name) or nucname.isnuclide(name)
]
)
unsupported_columns = (
set(csvy_model_data.columns)
- abund_names
- CSVY_SUPPORTED_COLUMNS
)
field_names = set(
[field["name"] for field in csvy_model_config.datatype.fields]
)
assert (
set(csvy_model_data.columns) - field_names == set()
), "CSVY columns exist without field descriptions"
assert (
field_names - set(csvy_model_data.columns) == set()
), "CSVY field descriptions exist without corresponding csv data"
if unsupported_columns != set():
logger.warning(
"The following columns are specified in the csvy"
"model file, but are IGNORED by TARDIS: %s"
% (str(unsupported_columns))
)
time_explosion = config.supernova.time_explosion.cgs
electron_densities = None
temperature = None
# if hasattr(csvy_model_config, 'v_inner_boundary'):
# v_boundary_inner = csvy_model_config.v_inner_boundary
# else:
# v_boundary_inner = None
# if hasattr(csvy_model_config, 'v_outer_boundary'):
# v_boundary_outer = csvy_model_config.v_outer_boundary
# else:
# v_boundary_outer = None
if hasattr(config, "model"):
if hasattr(config.model, "v_inner_boundary"):
v_boundary_inner = config.model.v_inner_boundary
else:
v_boundary_inner = None
if hasattr(config.model, "v_outer_boundary"):
v_boundary_outer = config.model.v_outer_boundary
else:
v_boundary_outer = None
else:
v_boundary_inner = None
v_boundary_outer = None
if hasattr(csvy_model_config, "velocity"):
velocity = quantity_linspace(
csvy_model_config.velocity.start,
csvy_model_config.velocity.stop,
csvy_model_config.velocity.num + 1,
).cgs
else:
velocity_field_index = [
field["name"] for field in csvy_model_config.datatype.fields
].index("velocity")
velocity_unit = u.Unit(
csvy_model_config.datatype.fields[velocity_field_index]["unit"]
)
velocity = csvy_model_data["velocity"].values * velocity_unit
velocity = velocity.to("cm/s")
if hasattr(csvy_model_config, "density"):
homologous_density = HomologousDensity.from_csvy(
config, csvy_model_config
)
else:
time_0 = csvy_model_config.model_density_time_0
density_field_index = [
field["name"] for field in csvy_model_config.datatype.fields
].index("density")
density_unit = u.Unit(
csvy_model_config.datatype.fields[density_field_index]["unit"]
)
density_0 = csvy_model_data["density"].values * density_unit
density_0 = density_0.to("g/cm^3")[1:]
density_0 = density_0.insert(0, 0)
homologous_density = HomologousDensity(density_0, time_0)
no_of_shells = len(velocity) - 1
# TODO -- implement t_radiative
# t_radiative = None
if temperature is not None:
t_radiative = temperature
elif hasattr(csvy_model_data, "columns"):
if "t_rad" in csvy_model_data.columns:
t_rad_field_index = [
field["name"] for field in csvy_model_config.datatype.fields
].index("t_rad")
t_rad_unit = u.Unit(
csvy_model_config.datatype.fields[t_rad_field_index]["unit"]
)
t_radiative = (
csvy_model_data["t_rad"].iloc[0:].values * t_rad_unit
)
else:
t_radiative = None
dilution_factor = None
if hasattr(csvy_model_data, "columns"):
if "dilution_factor" in csvy_model_data.columns:
dilution_factor = (
csvy_model_data["dilution_factor"].iloc[0:].to_numpy()
)
elif config.plasma.initial_t_rad > 0 * u.K:
t_radiative = np.ones(no_of_shells) * config.plasma.initial_t_rad
else:
t_radiative = None
if config.plasma.initial_t_inner < 0.0 * u.K:
luminosity_requested = config.supernova.luminosity_requested
t_inner = None
else:
luminosity_requested = None
t_inner = config.plasma.initial_t_inner
if hasattr(csvy_model_config, "abundance"):
abundances_section = csvy_model_config.abundance
abundance, isotope_abundance = read_uniform_abundances(
abundances_section, no_of_shells
)
else:
index, abundance, isotope_abundance = parse_csv_abundances(
csvy_model_data
)
abundance = abundance.loc[:, 1:]
abundance.columns = np.arange(abundance.shape[1])
isotope_abundance = isotope_abundance.loc[:, 1:]
isotope_abundance.columns = np.arange(isotope_abundance.shape[1])
abundance = abundance.replace(np.nan, 0.0)
abundance = abundance[abundance.sum(axis=1) > 0]
isotope_abundance = isotope_abundance.replace(np.nan, 0.0)
isotope_abundance = isotope_abundance[isotope_abundance.sum(axis=1) > 0]
norm_factor = abundance.sum(axis=0) + isotope_abundance.sum(axis=0)
if np.any(np.abs(norm_factor - 1) > 1e-12):
logger.warning(
"Abundances have not been normalized to 1." " - normalizing"
)
abundance /= norm_factor
isotope_abundance /= norm_factor
# isotope_abundance = IsotopeAbundances(isotope_abundance)
isotope_abundance = IsotopeAbundances(
isotope_abundance, time_0=csvy_model_config.model_isotope_time_0
)
# isotope_abundance.time_0 = csvy_model_config.model_isotope_time_0
return cls(
velocity=velocity,
homologous_density=homologous_density,
abundance=abundance,
isotope_abundance=isotope_abundance,
time_explosion=time_explosion,
t_radiative=t_radiative,
t_inner=t_inner,
luminosity_requested=luminosity_requested,
dilution_factor=dilution_factor,
v_boundary_inner=v_boundary_inner,
v_boundary_outer=v_boundary_outer,
electron_densities=electron_densities,
)
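The abundance-normalization step above (summing elemental and isotopic fractions per shell and rescaling when they do not add to 1) can be sketched in isolation with toy pandas frames; the values below are illustrative, not taken from any real model:

```python
import numpy as np
import pandas as pd

# Toy abundances: rows are species, columns are shells.
abundance = pd.DataFrame([[0.6, 0.7], [0.5, 0.2]])
isotope_abundance = pd.DataFrame([[0.1, 0.1]])

# Per-shell totals; dividing a DataFrame by a Series aligns on columns,
# so each shell is rescaled independently.
norm_factor = abundance.sum(axis=0) + isotope_abundance.sum(axis=0)
if np.any(np.abs(norm_factor - 1) > 1e-12):
    abundance /= norm_factor
    isotope_abundance /= norm_factor

total = abundance.sum(axis=0) + isotope_abundance.sum(axis=0)
assert np.allclose(total, 1.0)
```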
| bsd-3-clause |
Titan-C/slaveparticles | examples/spins/plot_z_half_multiorb.py | 1 | 1049 | # -*- coding: utf-8 -*-
"""
======================================================
Drop of quasiparticle weight by increasing interaction
======================================================
The quasiparticle weight of the electronic system drops as the local
interaction is increased. Multi-orbital degenerate systems are studied.
"""
from __future__ import division, absolute_import, print_function
import slaveparticles.utils.plotter as ssplt
import numpy as np
import matplotlib.pyplot as plt
#Degenerate bands
def plot_degbandshalffill():
"""Plot of Quasiparticle weight for degenerate
half-filled bands, showing the Mott transition"""
ulim = [3.45, 5.15, 6.85, 8.55]
bands = range(1, 5)
for band, u_int in zip(bands, ulim):
name = 'Z_half_'+str(band)+'band'
dop = [0.5]
data = ssplt.calc_z(band, dop, np.arange(0, u_int, 0.1),0., name)
plt.plot(data['u_int'], data['zeta'][0, :, 0], label='N={}'.format(str(band)))
ssplt.label_saves('Z_half_multiorb.png')
plot_degbandshalffill()
| gpl-3.0 |
kochhar/cric | cric/inning.py | 1 | 3313 | import math
import numpy as np
import pandas as pd
import pickers as pck
def create_innings_dataframe(number, innings):
"""Given an cricsheet innings convert it into a data frame"""
delivery_ids, outcomes = split_id_outcome(pck.pick_deliveries(innings))
    # hierarchical index by over and delivery. eg: 4.3 => (4, 3)
outcome_index = heirarchical_delivery_index(number, delivery_ids)
# flatten the outcome into columns
outcome_rows = [flat_delivery_outcome(o) for o in outcomes]
inning_df = pd.DataFrame(outcome_rows, index=outcome_index)
if 'rb' in inning_df.columns:
inning_df.insert(6, 'rb_cum', inning_df.groupby('batsman').rb.cumsum())
if 're' in inning_df.columns:
inning_df.insert(7, 're_cum', inning_df.re.cumsum())
if 'rt' in inning_df.columns:
# add a cummulative run counter
inning_df.insert(8, 'rt_cum', inning_df.rt.cumsum())
inning_df['w_cum'] = 0
if 'wpo' in inning_df.columns:
# set the wicket counter = 1 where a person was out
inning_df.loc[inning_df.wpo.notnull(), 'w_cum'] = 1
inning_df['w_cum'] = inning_df.w_cum.cumsum()
return inning_df
def inning_summary(inning_frame, team):
"""Takes a inning data frame and tries to produce a summary of the inning"""
if len(inning_frame):
last_ball = inning_frame.iloc[-1]
rt_cum = last_ball['rt_cum'] if 'rt' in inning_frame.columns else '-'
w_cum = last_ball['w_cum'] if 'w_cum' in inning_frame.columns else '-'
return "%s %d/%0s" % (team, rt_cum, w_cum)
else:
return "%s -" % (team,)
def split_id_outcome(deliveries):
"""Splits an iterable of deliveries into the [delivery_ids], [outcomes]"""
    # each delivery is a map { delivery_id => outcome }
    # d.items() yields the (k, v) pairs of the delivery; each map holds a
    # single entry, so take its first pair (works on Python 2 and 3)
    id_outcome_tuples = [next(iter(d.items())) for d in deliveries]
# id_outcome_tuples is essentially a pair of lists zipped together into a
# list of pairs.
#
# zip is its own inverse. i.e. we can unzip a list of pairs using zip
return zip(*id_outcome_tuples)
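The "zip is its own inverse" remark above can be shown with a tiny made-up pair list (the delivery ids and outcomes here are illustrative):

```python
# A list of (id, outcome) pairs unzips back into two aligned tuples.
pairs = [("0.1", {"runs": 1}), ("0.2", {"runs": 4})]
ids, outcomes = zip(*pairs)
assert ids == ("0.1", "0.2")
assert outcomes == ({"runs": 1}, {"runs": 4})
```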
def heirarchical_delivery_index(inum, delivery_ids):
"""Given a list of delivery ids of the form Over.ball, converts it into a
2-level pandas Index with overs and balls"""
# unzip a list of pairs into a pair of lists
balls, overs = zip(*[math.modf(did) for did in delivery_ids])
balls = [round(b, 1) for b in balls]
# construct the 2d index
index = pd.MultiIndex.from_arrays([np.array([inum]*len(balls)), np.array(overs), np.array(balls)], names=['inning', 'over', 'ball'])
return index
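The `math.modf` trick used above to split an `Over.ball` delivery id into its ball and over parts works like this (4.3 is just a sample id):

```python
import math

# math.modf returns (fractional part, integral part) as floats,
# so 4.3 splits into ball 0.3 and over 4.0.
frac, whole = math.modf(4.3)
assert whole == 4.0
assert round(frac, 1) == 0.3
```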
def flat_delivery_outcome(do):
flat_do = do.copy()
if 'runs' in flat_do:
flat_do['rb'] = flat_do['runs']['batsman']
flat_do['re'] = flat_do['runs']['extras']
flat_do['rt'] = flat_do['runs']['total']
flat_do.pop('runs')
if 'wicket' in flat_do:
flat_do['wk'] = flat_do['wicket']['kind']
flat_do['wpo'] = flat_do['wicket']['player_out']
if 'fielders' in flat_do['wicket']:
flat_do['wf'] = flat_do['wicket']['fielders']
flat_do.pop('wicket')
if 'extras' in flat_do:
flat_do.pop('extras', None)
return flat_do
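The flattening of the nested `runs` map above can be sketched inline; the outcome dict below is a hypothetical cricsheet-style delivery, and the `rb`/`re`/`rt` column names mirror the function above:

```python
# Flatten a nested delivery outcome into scalar columns.
do = {"batsman": "A", "runs": {"batsman": 4, "extras": 1, "total": 5}}
flat = dict(do)
runs = flat.pop("runs")
flat.update(rb=runs["batsman"], re=runs["extras"], rt=runs["total"])
assert flat == {"batsman": "A", "rb": 4, "re": 1, "rt": 5}
```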
| agpl-3.0 |
roxyboy/scikit-learn | sklearn/linear_model/stochastic_gradient.py | 130 | 50966 | # Authors: Peter Prettenhofer <peter.prettenhofer@gmail.com> (main author)
# Mathieu Blondel (partial_fit support)
#
# License: BSD 3 clause
"""Classification and regression using Stochastic Gradient Descent (SGD)."""
import numpy as np
import scipy.sparse as sp
from abc import ABCMeta, abstractmethod
from ..externals.joblib import Parallel, delayed
from .base import LinearClassifierMixin, SparseCoefMixin
from ..base import BaseEstimator, RegressorMixin
from ..feature_selection.from_model import _LearntSelectorMixin
from ..utils import (check_array, check_random_state, check_X_y,
deprecated)
from ..utils.extmath import safe_sparse_dot
from ..utils.multiclass import _check_partial_fit_first_call
from ..utils.validation import check_is_fitted
from ..externals import six
from .sgd_fast import plain_sgd, average_sgd
from ..utils.fixes import astype
from ..utils.seq_dataset import ArrayDataset, CSRDataset
from ..utils import compute_class_weight
from .sgd_fast import Hinge
from .sgd_fast import SquaredHinge
from .sgd_fast import Log
from .sgd_fast import ModifiedHuber
from .sgd_fast import SquaredLoss
from .sgd_fast import Huber
from .sgd_fast import EpsilonInsensitive
from .sgd_fast import SquaredEpsilonInsensitive
LEARNING_RATE_TYPES = {"constant": 1, "optimal": 2, "invscaling": 3,
"pa1": 4, "pa2": 5}
PENALTY_TYPES = {"none": 0, "l2": 2, "l1": 1, "elasticnet": 3}
SPARSE_INTERCEPT_DECAY = 0.01
"""For sparse data intercept updates are scaled by this decay factor to avoid
intercept oscillation."""
DEFAULT_EPSILON = 0.1
"""Default value of ``epsilon`` parameter. """
class BaseSGD(six.with_metaclass(ABCMeta, BaseEstimator, SparseCoefMixin)):
"""Base class for SGD classification and regression."""
def __init__(self, loss, penalty='l2', alpha=0.0001, C=1.0,
l1_ratio=0.15, fit_intercept=True, n_iter=5, shuffle=True,
verbose=0, epsilon=0.1, random_state=None,
learning_rate="optimal", eta0=0.0, power_t=0.5,
warm_start=False, average=False):
self.loss = loss
self.penalty = penalty
self.learning_rate = learning_rate
self.epsilon = epsilon
self.alpha = alpha
self.C = C
self.l1_ratio = l1_ratio
self.fit_intercept = fit_intercept
self.n_iter = n_iter
self.shuffle = shuffle
self.random_state = random_state
self.verbose = verbose
self.eta0 = eta0
self.power_t = power_t
self.warm_start = warm_start
self.average = average
self._validate_params()
self.coef_ = None
if self.average > 0:
self.standard_coef_ = None
self.average_coef_ = None
# iteration count for learning rate schedule
# must not be int (e.g. if ``learning_rate=='optimal'``)
self.t_ = None
def set_params(self, *args, **kwargs):
super(BaseSGD, self).set_params(*args, **kwargs)
self._validate_params()
return self
@abstractmethod
def fit(self, X, y):
"""Fit model."""
def _validate_params(self):
"""Validate input params. """
if not isinstance(self.shuffle, bool):
raise ValueError("shuffle must be either True or False")
if self.n_iter <= 0:
raise ValueError("n_iter must be > zero")
if not (0.0 <= self.l1_ratio <= 1.0):
raise ValueError("l1_ratio must be in [0, 1]")
if self.alpha < 0.0:
raise ValueError("alpha must be >= 0")
if self.learning_rate in ("constant", "invscaling"):
if self.eta0 <= 0.0:
raise ValueError("eta0 must be > 0")
# raises ValueError if not registered
self._get_penalty_type(self.penalty)
self._get_learning_rate_type(self.learning_rate)
if self.loss not in self.loss_functions:
raise ValueError("The loss %s is not supported. " % self.loss)
def _get_loss_function(self, loss):
"""Get concrete ``LossFunction`` object for str ``loss``. """
try:
loss_ = self.loss_functions[loss]
loss_class, args = loss_[0], loss_[1:]
if loss in ('huber', 'epsilon_insensitive',
'squared_epsilon_insensitive'):
args = (self.epsilon, )
return loss_class(*args)
except KeyError:
raise ValueError("The loss %s is not supported. " % loss)
def _get_learning_rate_type(self, learning_rate):
try:
return LEARNING_RATE_TYPES[learning_rate]
except KeyError:
raise ValueError("learning rate %s "
"is not supported. " % learning_rate)
def _get_penalty_type(self, penalty):
penalty = str(penalty).lower()
try:
return PENALTY_TYPES[penalty]
except KeyError:
raise ValueError("Penalty %s is not supported. " % penalty)
def _validate_sample_weight(self, sample_weight, n_samples):
"""Set the sample weight array."""
if sample_weight is None:
# uniform sample weights
sample_weight = np.ones(n_samples, dtype=np.float64, order='C')
else:
# user-provided array
sample_weight = np.asarray(sample_weight, dtype=np.float64,
order="C")
if sample_weight.shape[0] != n_samples:
raise ValueError("Shapes of X and sample_weight do not match.")
return sample_weight
def _allocate_parameter_mem(self, n_classes, n_features, coef_init=None,
intercept_init=None):
"""Allocate mem for parameters; initialize if provided."""
if n_classes > 2:
# allocate coef_ for multi-class
if coef_init is not None:
coef_init = np.asarray(coef_init, order="C")
if coef_init.shape != (n_classes, n_features):
raise ValueError("Provided ``coef_`` does not match dataset. ")
self.coef_ = coef_init
else:
self.coef_ = np.zeros((n_classes, n_features),
dtype=np.float64, order="C")
# allocate intercept_ for multi-class
if intercept_init is not None:
intercept_init = np.asarray(intercept_init, order="C")
if intercept_init.shape != (n_classes, ):
raise ValueError("Provided intercept_init "
"does not match dataset.")
self.intercept_ = intercept_init
else:
self.intercept_ = np.zeros(n_classes, dtype=np.float64,
order="C")
else:
# allocate coef_ for binary problem
if coef_init is not None:
coef_init = np.asarray(coef_init, dtype=np.float64,
order="C")
coef_init = coef_init.ravel()
if coef_init.shape != (n_features,):
raise ValueError("Provided coef_init does not "
"match dataset.")
self.coef_ = coef_init
else:
self.coef_ = np.zeros(n_features,
dtype=np.float64,
order="C")
# allocate intercept_ for binary problem
if intercept_init is not None:
intercept_init = np.asarray(intercept_init, dtype=np.float64)
if intercept_init.shape != (1,) and intercept_init.shape != ():
raise ValueError("Provided intercept_init "
"does not match dataset.")
self.intercept_ = intercept_init.reshape(1,)
else:
self.intercept_ = np.zeros(1, dtype=np.float64, order="C")
# initialize average parameters
if self.average > 0:
self.standard_coef_ = self.coef_
self.standard_intercept_ = self.intercept_
self.average_coef_ = np.zeros(self.coef_.shape,
dtype=np.float64,
order="C")
self.average_intercept_ = np.zeros(self.standard_intercept_.shape,
dtype=np.float64,
order="C")
def _make_dataset(X, y_i, sample_weight):
"""Create ``Dataset`` abstraction for sparse and dense inputs.
This also returns the ``intercept_decay`` which is different
for sparse datasets.
"""
if sp.issparse(X):
dataset = CSRDataset(X.data, X.indptr, X.indices, y_i, sample_weight)
intercept_decay = SPARSE_INTERCEPT_DECAY
else:
dataset = ArrayDataset(X, y_i, sample_weight)
intercept_decay = 1.0
return dataset, intercept_decay
def _prepare_fit_binary(est, y, i):
"""Initialization for fit_binary.
Returns y, coef, intercept.
"""
y_i = np.ones(y.shape, dtype=np.float64, order="C")
y_i[y != est.classes_[i]] = -1.0
average_intercept = 0
average_coef = None
if len(est.classes_) == 2:
if not est.average:
coef = est.coef_.ravel()
intercept = est.intercept_[0]
else:
coef = est.standard_coef_.ravel()
intercept = est.standard_intercept_[0]
average_coef = est.average_coef_.ravel()
average_intercept = est.average_intercept_[0]
else:
if not est.average:
coef = est.coef_[i]
intercept = est.intercept_[i]
else:
coef = est.standard_coef_[i]
intercept = est.standard_intercept_[i]
average_coef = est.average_coef_[i]
average_intercept = est.average_intercept_[i]
return y_i, coef, intercept, average_coef, average_intercept
def fit_binary(est, i, X, y, alpha, C, learning_rate, n_iter,
pos_weight, neg_weight, sample_weight):
"""Fit a single binary classifier.
The i'th class is considered the "positive" class.
"""
# if average is not true, average_coef, and average_intercept will be
# unused
y_i, coef, intercept, average_coef, average_intercept = \
_prepare_fit_binary(est, y, i)
assert y_i.shape[0] == y.shape[0] == sample_weight.shape[0]
dataset, intercept_decay = _make_dataset(X, y_i, sample_weight)
penalty_type = est._get_penalty_type(est.penalty)
learning_rate_type = est._get_learning_rate_type(learning_rate)
# XXX should have random_state_!
random_state = check_random_state(est.random_state)
# numpy mtrand expects a C long which is a signed 32 bit integer under
# Windows
seed = random_state.randint(0, np.iinfo(np.int32).max)
if not est.average:
return plain_sgd(coef, intercept, est.loss_function,
penalty_type, alpha, C, est.l1_ratio,
dataset, n_iter, int(est.fit_intercept),
int(est.verbose), int(est.shuffle), seed,
pos_weight, neg_weight,
learning_rate_type, est.eta0,
est.power_t, est.t_, intercept_decay)
else:
standard_coef, standard_intercept, average_coef, \
average_intercept = average_sgd(coef, intercept, average_coef,
average_intercept,
est.loss_function, penalty_type,
alpha, C, est.l1_ratio, dataset,
n_iter, int(est.fit_intercept),
int(est.verbose), int(est.shuffle),
seed, pos_weight, neg_weight,
learning_rate_type, est.eta0,
est.power_t, est.t_,
intercept_decay,
est.average)
if len(est.classes_) == 2:
est.average_intercept_[0] = average_intercept
else:
est.average_intercept_[i] = average_intercept
return standard_coef, standard_intercept
class BaseSGDClassifier(six.with_metaclass(ABCMeta, BaseSGD,
LinearClassifierMixin)):
loss_functions = {
"hinge": (Hinge, 1.0),
"squared_hinge": (SquaredHinge, 1.0),
"perceptron": (Hinge, 0.0),
"log": (Log, ),
"modified_huber": (ModifiedHuber, ),
"squared_loss": (SquaredLoss, ),
"huber": (Huber, DEFAULT_EPSILON),
"epsilon_insensitive": (EpsilonInsensitive, DEFAULT_EPSILON),
"squared_epsilon_insensitive": (SquaredEpsilonInsensitive,
DEFAULT_EPSILON),
}
@abstractmethod
def __init__(self, loss="hinge", penalty='l2', alpha=0.0001, l1_ratio=0.15,
fit_intercept=True, n_iter=5, shuffle=True, verbose=0,
epsilon=DEFAULT_EPSILON, n_jobs=1, random_state=None,
learning_rate="optimal", eta0=0.0, power_t=0.5,
class_weight=None, warm_start=False, average=False):
super(BaseSGDClassifier, self).__init__(loss=loss, penalty=penalty,
alpha=alpha, l1_ratio=l1_ratio,
fit_intercept=fit_intercept,
n_iter=n_iter, shuffle=shuffle,
verbose=verbose,
epsilon=epsilon,
random_state=random_state,
learning_rate=learning_rate,
eta0=eta0, power_t=power_t,
warm_start=warm_start,
average=average)
self.class_weight = class_weight
self.classes_ = None
self.n_jobs = int(n_jobs)
def _partial_fit(self, X, y, alpha, C,
loss, learning_rate, n_iter,
classes, sample_weight,
coef_init, intercept_init):
X, y = check_X_y(X, y, 'csr', dtype=np.float64, order="C")
n_samples, n_features = X.shape
self._validate_params()
_check_partial_fit_first_call(self, classes)
n_classes = self.classes_.shape[0]
# Allocate datastructures from input arguments
self._expanded_class_weight = compute_class_weight(self.class_weight,
self.classes_, y)
sample_weight = self._validate_sample_weight(sample_weight, n_samples)
if self.coef_ is None or coef_init is not None:
self._allocate_parameter_mem(n_classes, n_features,
coef_init, intercept_init)
elif n_features != self.coef_.shape[-1]:
raise ValueError("Number of features %d does not match previous data %d."
% (n_features, self.coef_.shape[-1]))
self.loss_function = self._get_loss_function(loss)
if self.t_ is None:
self.t_ = 1.0
# delegate to concrete training procedure
if n_classes > 2:
self._fit_multiclass(X, y, alpha=alpha, C=C,
learning_rate=learning_rate,
sample_weight=sample_weight, n_iter=n_iter)
elif n_classes == 2:
self._fit_binary(X, y, alpha=alpha, C=C,
learning_rate=learning_rate,
sample_weight=sample_weight, n_iter=n_iter)
else:
raise ValueError("The number of class labels must be "
"greater than one.")
return self
def _fit(self, X, y, alpha, C, loss, learning_rate, coef_init=None,
intercept_init=None, sample_weight=None):
if hasattr(self, "classes_"):
self.classes_ = None
X, y = check_X_y(X, y, 'csr', dtype=np.float64, order="C")
n_samples, n_features = X.shape
# labels can be encoded as float, int, or string literals
# np.unique sorts in asc order; largest class id is positive class
classes = np.unique(y)
if self.warm_start and self.coef_ is not None:
if coef_init is None:
coef_init = self.coef_
if intercept_init is None:
intercept_init = self.intercept_
else:
self.coef_ = None
self.intercept_ = None
if self.average > 0:
self.standard_coef_ = self.coef_
self.standard_intercept_ = self.intercept_
self.average_coef_ = None
self.average_intercept_ = None
# Clear iteration count for multiple call to fit.
self.t_ = None
self._partial_fit(X, y, alpha, C, loss, learning_rate, self.n_iter,
classes, sample_weight, coef_init, intercept_init)
return self
def _fit_binary(self, X, y, alpha, C, sample_weight,
learning_rate, n_iter):
"""Fit a binary classifier on X and y. """
coef, intercept = fit_binary(self, 1, X, y, alpha, C,
learning_rate, n_iter,
self._expanded_class_weight[1],
self._expanded_class_weight[0],
sample_weight)
self.t_ += n_iter * X.shape[0]
# need to be 2d
if self.average > 0:
if self.average <= self.t_ - 1:
self.coef_ = self.average_coef_.reshape(1, -1)
self.intercept_ = self.average_intercept_
else:
self.coef_ = self.standard_coef_.reshape(1, -1)
self.standard_intercept_ = np.atleast_1d(intercept)
self.intercept_ = self.standard_intercept_
else:
self.coef_ = coef.reshape(1, -1)
# intercept is a float, need to convert it to an array of length 1
self.intercept_ = np.atleast_1d(intercept)
def _fit_multiclass(self, X, y, alpha, C, learning_rate,
sample_weight, n_iter):
"""Fit a multi-class classifier by combining binary classifiers
Each binary classifier predicts one class versus all others. This
strategy is called OVA: One Versus All.
"""
# Use joblib to fit OvA in parallel.
result = Parallel(n_jobs=self.n_jobs, backend="threading",
verbose=self.verbose)(
delayed(fit_binary)(self, i, X, y, alpha, C, learning_rate,
n_iter, self._expanded_class_weight[i], 1.,
sample_weight)
for i in range(len(self.classes_)))
for i, (_, intercept) in enumerate(result):
self.intercept_[i] = intercept
self.t_ += n_iter * X.shape[0]
if self.average > 0:
if self.average <= self.t_ - 1.0:
self.coef_ = self.average_coef_
self.intercept_ = self.average_intercept_
else:
self.coef_ = self.standard_coef_
self.standard_intercept_ = np.atleast_1d(intercept)
self.intercept_ = self.standard_intercept_
def partial_fit(self, X, y, classes=None, sample_weight=None):
"""Fit linear model with Stochastic Gradient Descent.
Parameters
----------
X : {array-like, sparse matrix}, shape (n_samples, n_features)
Subset of the training data
y : numpy array, shape (n_samples,)
Subset of the target values
classes : array, shape (n_classes,)
Classes across all calls to partial_fit.
            Can be obtained via `np.unique(y_all)`, where y_all is the
target vector of the entire dataset.
This argument is required for the first call to partial_fit
and can be omitted in the subsequent calls.
Note that y doesn't need to contain all labels in `classes`.
sample_weight : array-like, shape (n_samples,), optional
Weights applied to individual samples.
If not provided, uniform weights are assumed.
Returns
-------
self : returns an instance of self.
"""
if self.class_weight in ['balanced', 'auto']:
raise ValueError("class_weight '{0}' is not supported for "
"partial_fit. In order to use 'balanced' weights, "
"use compute_class_weight('{0}', classes, y). "
"In place of y you can us a large enough sample "
"of the full training set target to properly "
"estimate the class frequency distributions. "
"Pass the resulting weights as the class_weight "
"parameter.".format(self.class_weight))
return self._partial_fit(X, y, alpha=self.alpha, C=1.0, loss=self.loss,
learning_rate=self.learning_rate, n_iter=1,
classes=classes, sample_weight=sample_weight,
coef_init=None, intercept_init=None)
def fit(self, X, y, coef_init=None, intercept_init=None, sample_weight=None):
"""Fit linear model with Stochastic Gradient Descent.
Parameters
----------
X : {array-like, sparse matrix}, shape (n_samples, n_features)
Training data
y : numpy array, shape (n_samples,)
Target values
coef_init : array, shape (n_classes, n_features)
The initial coefficients to warm-start the optimization.
intercept_init : array, shape (n_classes,)
The initial intercept to warm-start the optimization.
sample_weight : array-like, shape (n_samples,), optional
Weights applied to individual samples.
If not provided, uniform weights are assumed. These weights will
be multiplied with class_weight (passed through the
            constructor) if class_weight is specified
Returns
-------
self : returns an instance of self.
"""
return self._fit(X, y, alpha=self.alpha, C=1.0,
loss=self.loss, learning_rate=self.learning_rate,
coef_init=coef_init, intercept_init=intercept_init,
sample_weight=sample_weight)
class SGDClassifier(BaseSGDClassifier, _LearntSelectorMixin):
"""Linear classifiers (SVM, logistic regression, a.o.) with SGD training.
This estimator implements regularized linear models with stochastic
gradient descent (SGD) learning: the gradient of the loss is estimated
each sample at a time and the model is updated along the way with a
decreasing strength schedule (aka learning rate). SGD allows minibatch
(online/out-of-core) learning, see the partial_fit method.
For best results using the default learning rate schedule, the data should
have zero mean and unit variance.
This implementation works with data represented as dense or sparse arrays
of floating point values for the features. The model it fits can be
controlled with the loss parameter; by default, it fits a linear support
vector machine (SVM).
The regularizer is a penalty added to the loss function that shrinks model
parameters towards the zero vector using either the squared euclidean norm
L2 or the absolute norm L1 or a combination of both (Elastic Net). If the
parameter update crosses the 0.0 value because of the regularizer, the
update is truncated to 0.0 to allow for learning sparse models and achieve
online feature selection.
Read more in the :ref:`User Guide <sgd>`.
Parameters
----------
loss : str, 'hinge', 'log', 'modified_huber', 'squared_hinge',\
'perceptron', or a regression loss: 'squared_loss', 'huber',\
'epsilon_insensitive', or 'squared_epsilon_insensitive'
The loss function to be used. Defaults to 'hinge', which gives a
linear SVM.
The 'log' loss gives logistic regression, a probabilistic classifier.
'modified_huber' is another smooth loss that brings tolerance to
outliers as well as probability estimates.
'squared_hinge' is like hinge but is quadratically penalized.
'perceptron' is the linear loss used by the perceptron algorithm.
The other losses are designed for regression but can be useful in
classification as well; see SGDRegressor for a description.
penalty : str, 'none', 'l2', 'l1', or 'elasticnet'
The penalty (aka regularization term) to be used. Defaults to 'l2'
which is the standard regularizer for linear SVM models. 'l1' and
'elasticnet' might bring sparsity to the model (feature selection)
not achievable with 'l2'.
alpha : float
Constant that multiplies the regularization term. Defaults to 0.0001
l1_ratio : float
The Elastic Net mixing parameter, with 0 <= l1_ratio <= 1.
l1_ratio=0 corresponds to L2 penalty, l1_ratio=1 to L1.
Defaults to 0.15.
fit_intercept : bool
Whether the intercept should be estimated or not. If False, the
data is assumed to be already centered. Defaults to True.
n_iter : int, optional
The number of passes over the training data (aka epochs). The number
of iterations is set to 1 if using partial_fit.
Defaults to 5.
shuffle : bool, optional
Whether or not the training data should be shuffled after each epoch.
Defaults to True.
random_state : int seed, RandomState instance, or None (default)
The seed of the pseudo random number generator to use when
shuffling the data.
verbose : integer, optional
The verbosity level
epsilon : float
Epsilon in the epsilon-insensitive loss functions; only if `loss` is
'huber', 'epsilon_insensitive', or 'squared_epsilon_insensitive'.
For 'huber', determines the threshold at which it becomes less
important to get the prediction exactly right.
For epsilon-insensitive, any differences between the current prediction
and the correct label are ignored if they are less than this threshold.
n_jobs : integer, optional
The number of CPUs to use to do the OVA (One Versus All, for
multi-class problems) computation. -1 means 'all CPUs'. Defaults
to 1.
learning_rate : string, optional
The learning rate schedule:
constant: eta = eta0
optimal: eta = 1.0 / (t + t0) [default]
invscaling: eta = eta0 / pow(t, power_t)
where t0 is chosen by a heuristic proposed by Leon Bottou.
eta0 : double
The initial learning rate for the 'constant' or 'invscaling'
schedules. The default value is 0.0 as eta0 is not used by the
default schedule 'optimal'.
power_t : double
The exponent for inverse scaling learning rate [default 0.5].
class_weight : dict, {class_label: weight} or "balanced" or None, optional
Preset for the class_weight fit parameter.
Weights associated with classes. If not given, all classes
are supposed to have weight one.
The "balanced" mode uses the values of y to automatically adjust
weights inversely proportional to class frequencies in the input data
as ``n_samples / (n_classes * np.bincount(y))``
warm_start : bool, optional
When set to True, reuse the solution of the previous call to fit as
initialization, otherwise, just erase the previous solution.
average : bool or int, optional
When set to True, computes the averaged SGD weights and stores the
result in the ``coef_`` attribute. If set to an int greater than 1,
averaging will begin once the total number of samples seen reaches
average. So average=10 will begin averaging after seeing 10 samples.
Attributes
----------
coef_ : array, shape (1, n_features) if n_classes == 2 else (n_classes,\
n_features)
Weights assigned to the features.
intercept_ : array, shape (1,) if n_classes == 2 else (n_classes,)
Constants in decision function.
Examples
--------
>>> import numpy as np
>>> from sklearn import linear_model
>>> X = np.array([[-1, -1], [-2, -1], [1, 1], [2, 1]])
>>> Y = np.array([1, 1, 2, 2])
>>> clf = linear_model.SGDClassifier()
>>> clf.fit(X, Y)
... #doctest: +NORMALIZE_WHITESPACE
SGDClassifier(alpha=0.0001, average=False, class_weight=None, epsilon=0.1,
eta0=0.0, fit_intercept=True, l1_ratio=0.15,
learning_rate='optimal', loss='hinge', n_iter=5, n_jobs=1,
penalty='l2', power_t=0.5, random_state=None, shuffle=True,
verbose=0, warm_start=False)
>>> print(clf.predict([[-0.8, -1]]))
[1]
See also
--------
LinearSVC, LogisticRegression, Perceptron
"""
def __init__(self, loss="hinge", penalty='l2', alpha=0.0001, l1_ratio=0.15,
fit_intercept=True, n_iter=5, shuffle=True, verbose=0,
epsilon=DEFAULT_EPSILON, n_jobs=1, random_state=None,
learning_rate="optimal", eta0=0.0, power_t=0.5,
class_weight=None, warm_start=False, average=False):
super(SGDClassifier, self).__init__(
loss=loss, penalty=penalty, alpha=alpha, l1_ratio=l1_ratio,
fit_intercept=fit_intercept, n_iter=n_iter, shuffle=shuffle,
verbose=verbose, epsilon=epsilon, n_jobs=n_jobs,
random_state=random_state, learning_rate=learning_rate, eta0=eta0,
power_t=power_t, class_weight=class_weight, warm_start=warm_start,
average=average)
def _check_proba(self):
check_is_fitted(self, "t_")
if self.loss not in ("log", "modified_huber"):
raise AttributeError("probability estimates are not available for"
" loss=%r" % self.loss)
@property
def predict_proba(self):
"""Probability estimates.
This method is only available for log loss and modified Huber loss.
Multiclass probability estimates are derived from binary (one-vs.-rest)
estimates by simple normalization, as recommended by Zadrozny and
Elkan.
Binary probability estimates for loss="modified_huber" are given by
(clip(decision_function(X), -1, 1) + 1) / 2.
Parameters
----------
X : {array-like, sparse matrix}, shape (n_samples, n_features)
Returns
-------
array, shape (n_samples, n_classes)
Returns the probability of the sample for each class in the model,
where classes are ordered as they are in `self.classes_`.
References
----------
Zadrozny and Elkan, "Transforming classifier scores into multiclass
probability estimates", SIGKDD'02,
http://www.research.ibm.com/people/z/zadrozny/kdd2002-Transf.pdf
The justification for the formula in the loss="modified_huber"
case is in the appendix B in:
http://jmlr.csail.mit.edu/papers/volume2/zhang02c/zhang02c.pdf
"""
self._check_proba()
return self._predict_proba
def _predict_proba(self, X):
if self.loss == "log":
return self._predict_proba_lr(X)
elif self.loss == "modified_huber":
binary = (len(self.classes_) == 2)
scores = self.decision_function(X)
if binary:
prob2 = np.ones((scores.shape[0], 2))
prob = prob2[:, 1]
else:
prob = scores
np.clip(scores, -1, 1, prob)
prob += 1.
prob /= 2.
if binary:
prob2[:, 0] -= prob
prob = prob2
else:
# the above might assign zero to all classes, which doesn't
# normalize neatly; work around this to produce uniform
# probabilities
prob_sum = prob.sum(axis=1)
all_zero = (prob_sum == 0)
if np.any(all_zero):
prob[all_zero, :] = 1
prob_sum[all_zero] = len(self.classes_)
# normalize
prob /= prob_sum.reshape((prob.shape[0], -1))
return prob
else:
raise NotImplementedError("predict_(log_)proba only supported when"
" loss='log' or loss='modified_huber' "
"(%r given)" % self.loss)
@property
def predict_log_proba(self):
"""Log of probability estimates.
This method is only available for log loss and modified Huber loss.
When loss="modified_huber", probability estimates may be hard zeros
and ones, so taking the logarithm is not possible.
See ``predict_proba`` for details.
Parameters
----------
X : array-like, shape (n_samples, n_features)
Returns
-------
T : array-like, shape (n_samples, n_classes)
Returns the log-probability of the sample for each class in the
model, where classes are ordered as they are in
`self.classes_`.
"""
self._check_proba()
return self._predict_log_proba
def _predict_log_proba(self, X):
return np.log(self.predict_proba(X))
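The binary probability formula quoted in the `predict_proba` docstring above, `(clip(decision_function(X), -1, 1) + 1) / 2` for `loss="modified_huber"`, can be checked with plain NumPy on some sample decision scores:

```python
import numpy as np

# Positive-class probability for loss="modified_huber":
# scores are clipped to [-1, 1] and mapped affinely onto [0, 1].
scores = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
prob_pos = (np.clip(scores, -1, 1) + 1) / 2
assert np.allclose(prob_pos, [0.0, 0.25, 0.5, 0.75, 1.0])
```

Note that scores outside [-1, 1] saturate to hard 0/1 probabilities, which is why the docstring for `predict_log_proba` warns that taking the logarithm may not be possible.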
class BaseSGDRegressor(BaseSGD, RegressorMixin):
loss_functions = {
"squared_loss": (SquaredLoss, ),
"huber": (Huber, DEFAULT_EPSILON),
"epsilon_insensitive": (EpsilonInsensitive, DEFAULT_EPSILON),
"squared_epsilon_insensitive": (SquaredEpsilonInsensitive,
DEFAULT_EPSILON),
}
@abstractmethod
def __init__(self, loss="squared_loss", penalty="l2", alpha=0.0001,
l1_ratio=0.15, fit_intercept=True, n_iter=5, shuffle=True,
verbose=0, epsilon=DEFAULT_EPSILON, random_state=None,
learning_rate="invscaling", eta0=0.01, power_t=0.25,
warm_start=False, average=False):
super(BaseSGDRegressor, self).__init__(loss=loss, penalty=penalty,
alpha=alpha, l1_ratio=l1_ratio,
fit_intercept=fit_intercept,
n_iter=n_iter, shuffle=shuffle,
verbose=verbose,
epsilon=epsilon,
random_state=random_state,
learning_rate=learning_rate,
eta0=eta0, power_t=power_t,
warm_start=warm_start,
average=average)
def _partial_fit(self, X, y, alpha, C, loss, learning_rate,
n_iter, sample_weight,
coef_init, intercept_init):
X, y = check_X_y(X, y, "csr", copy=False, order='C', dtype=np.float64)
y = astype(y, np.float64, copy=False)
n_samples, n_features = X.shape
self._validate_params()
# Allocate datastructures from input arguments
sample_weight = self._validate_sample_weight(sample_weight, n_samples)
if self.coef_ is None:
self._allocate_parameter_mem(1, n_features,
coef_init, intercept_init)
elif n_features != self.coef_.shape[-1]:
raise ValueError("Number of features %d does not match previous data %d."
% (n_features, self.coef_.shape[-1]))
if self.average > 0 and self.average_coef_ is None:
self.average_coef_ = np.zeros(n_features,
dtype=np.float64,
order="C")
self.average_intercept_ = np.zeros(1,
dtype=np.float64,
order="C")
self._fit_regressor(X, y, alpha, C, loss, learning_rate,
sample_weight, n_iter)
return self
def partial_fit(self, X, y, sample_weight=None):
"""Fit linear model with Stochastic Gradient Descent.
Parameters
----------
X : {array-like, sparse matrix}, shape (n_samples, n_features)
Subset of training data
y : numpy array of shape (n_samples,)
Subset of target values
sample_weight : array-like, shape (n_samples,), optional
Weights applied to individual samples.
If not provided, uniform weights are assumed.
Returns
-------
self : returns an instance of self.
"""
return self._partial_fit(X, y, self.alpha, C=1.0,
loss=self.loss,
learning_rate=self.learning_rate, n_iter=1,
sample_weight=sample_weight,
coef_init=None, intercept_init=None)
def _fit(self, X, y, alpha, C, loss, learning_rate, coef_init=None,
intercept_init=None, sample_weight=None):
if self.warm_start and self.coef_ is not None:
if coef_init is None:
coef_init = self.coef_
if intercept_init is None:
intercept_init = self.intercept_
else:
self.coef_ = None
self.intercept_ = None
if self.average > 0:
self.standard_intercept_ = self.intercept_
self.standard_coef_ = self.coef_
self.average_coef_ = None
self.average_intercept_ = None
# Clear iteration count for multiple call to fit.
self.t_ = None
return self._partial_fit(X, y, alpha, C, loss, learning_rate,
self.n_iter, sample_weight,
coef_init, intercept_init)
def fit(self, X, y, coef_init=None, intercept_init=None,
sample_weight=None):
"""Fit linear model with Stochastic Gradient Descent.
Parameters
----------
X : {array-like, sparse matrix}, shape (n_samples, n_features)
Training data
y : numpy array, shape (n_samples,)
Target values
coef_init : array, shape (n_features,)
The initial coefficients to warm-start the optimization.
intercept_init : array, shape (1,)
The initial intercept to warm-start the optimization.
sample_weight : array-like, shape (n_samples,), optional
Weights applied to individual samples (1. for unweighted).
Returns
-------
self : returns an instance of self.
"""
return self._fit(X, y, alpha=self.alpha, C=1.0,
loss=self.loss, learning_rate=self.learning_rate,
coef_init=coef_init,
intercept_init=intercept_init,
sample_weight=sample_weight)
@deprecated(" and will be removed in 0.19.")
def decision_function(self, X):
"""Predict using the linear model
Parameters
----------
X : {array-like, sparse matrix}, shape (n_samples, n_features)
Returns
-------
array, shape (n_samples,)
Predicted target values per element in X.
"""
return self._decision_function(X)
def _decision_function(self, X):
"""Predict using the linear model
Parameters
----------
X : {array-like, sparse matrix}, shape (n_samples, n_features)
Returns
-------
array, shape (n_samples,)
Predicted target values per element in X.
"""
check_is_fitted(self, ["t_", "coef_", "intercept_"], all_or_any=all)
X = check_array(X, accept_sparse='csr')
scores = safe_sparse_dot(X, self.coef_.T,
dense_output=True) + self.intercept_
return scores.ravel()
def predict(self, X):
"""Predict using the linear model
Parameters
----------
X : {array-like, sparse matrix}, shape (n_samples, n_features)
Returns
-------
array, shape (n_samples,)
Predicted target values per element in X.
"""
return self._decision_function(X)
def _fit_regressor(self, X, y, alpha, C, loss, learning_rate,
sample_weight, n_iter):
dataset, intercept_decay = _make_dataset(X, y, sample_weight)
loss_function = self._get_loss_function(loss)
penalty_type = self._get_penalty_type(self.penalty)
learning_rate_type = self._get_learning_rate_type(learning_rate)
if self.t_ is None:
self.t_ = 1.0
random_state = check_random_state(self.random_state)
# numpy mtrand expects a C long which is a signed 32 bit integer under
# Windows
seed = random_state.randint(0, np.iinfo(np.int32).max)
if self.average > 0:
self.standard_coef_, self.standard_intercept_, \
self.average_coef_, self.average_intercept_ =\
average_sgd(self.standard_coef_,
self.standard_intercept_[0],
self.average_coef_,
self.average_intercept_[0],
loss_function,
penalty_type,
alpha, C,
self.l1_ratio,
dataset,
n_iter,
int(self.fit_intercept),
int(self.verbose),
int(self.shuffle),
seed,
1.0, 1.0,
learning_rate_type,
self.eta0, self.power_t, self.t_,
intercept_decay, self.average)
self.average_intercept_ = np.atleast_1d(self.average_intercept_)
self.standard_intercept_ = np.atleast_1d(self.standard_intercept_)
self.t_ += n_iter * X.shape[0]
if self.average <= self.t_ - 1.0:
self.coef_ = self.average_coef_
self.intercept_ = self.average_intercept_
else:
self.coef_ = self.standard_coef_
self.intercept_ = self.standard_intercept_
else:
self.coef_, self.intercept_ = \
plain_sgd(self.coef_,
self.intercept_[0],
loss_function,
penalty_type,
alpha, C,
self.l1_ratio,
dataset,
n_iter,
int(self.fit_intercept),
int(self.verbose),
int(self.shuffle),
seed,
1.0, 1.0,
learning_rate_type,
self.eta0, self.power_t, self.t_,
intercept_decay)
self.t_ += n_iter * X.shape[0]
self.intercept_ = np.atleast_1d(self.intercept_)
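`plain_sgd` is compiled Cython, but the update it performs for squared loss is simple. A hedged pure-Python sketch of one epoch with an L2 penalty and the 'invscaling' schedule eta = eta0 / t**power_t (a sketch under simplifying assumptions, not the actual implementation):

```python
import numpy as np

def sgd_squared_loss_epoch(X, y, coef, intercept, alpha=1e-4,
                           eta0=0.01, power_t=0.25, t=1.0):
    # One pass over the data; the gradient of 0.5 * (pred - y)**2 w.r.t.
    # the prediction is (pred - y), and the L2 penalty adds alpha * coef.
    for i in range(X.shape[0]):
        eta = eta0 / (t ** power_t)            # 'invscaling' learning rate
        grad = X[i] @ coef + intercept - y[i]
        coef = coef - eta * (grad * X[i] + alpha * coef)
        intercept = intercept - eta * grad
        t += 1.0
    return coef, intercept, t

rng = np.random.RandomState(0)
X = rng.randn(200, 3)
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w
coef, intercept, t = np.zeros(3), 0.0, 1.0
for _ in range(50):
    coef, intercept, t = sgd_squared_loss_epoch(X, y, coef, intercept, t=t)
```

On this noiseless toy problem the recovered weights land very close to `true_w`; the small `alpha` only introduces a negligible shrinkage bias.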
class SGDRegressor(BaseSGDRegressor, _LearntSelectorMixin):
"""Linear model fitted by minimizing a regularized empirical loss with SGD
SGD stands for Stochastic Gradient Descent: the gradient of the loss is
estimated each sample at a time and the model is updated along the way with
a decreasing strength schedule (aka learning rate).
The regularizer is a penalty added to the loss function that shrinks model
parameters towards the zero vector using either the squared euclidean norm
L2 or the absolute norm L1 or a combination of both (Elastic Net). If the
parameter update crosses the 0.0 value because of the regularizer, the
update is truncated to 0.0 to allow for learning sparse models and achieve
online feature selection.
This implementation works with data represented as dense numpy arrays of
floating point values for the features.
Read more in the :ref:`User Guide <sgd>`.
Parameters
----------
loss : str, 'squared_loss', 'huber', 'epsilon_insensitive', \
or 'squared_epsilon_insensitive'
The loss function to be used. Defaults to 'squared_loss' which refers
to the ordinary least squares fit. 'huber' modifies 'squared_loss' to
focus less on getting outliers correct by switching from squared to
linear loss past a distance of epsilon. 'epsilon_insensitive' ignores
errors less than epsilon and is linear past that; this is the loss
function used in SVR. 'squared_epsilon_insensitive' is the same but
becomes squared loss past a tolerance of epsilon.
penalty : str, 'none', 'l2', 'l1', or 'elasticnet'
The penalty (aka regularization term) to be used. Defaults to 'l2'
which is the standard regularizer for linear SVM models. 'l1' and
'elasticnet' might bring sparsity to the model (feature selection)
not achievable with 'l2'.
alpha : float
Constant that multiplies the regularization term. Defaults to 0.0001
l1_ratio : float
The Elastic Net mixing parameter, with 0 <= l1_ratio <= 1.
l1_ratio=0 corresponds to L2 penalty, l1_ratio=1 to L1.
Defaults to 0.15.
fit_intercept : bool
Whether the intercept should be estimated or not. If False, the
data is assumed to be already centered. Defaults to True.
n_iter : int, optional
The number of passes over the training data (aka epochs). The number
of iterations is set to 1 if using partial_fit.
Defaults to 5.
shuffle : bool, optional
Whether or not the training data should be shuffled after each epoch.
Defaults to True.
random_state : int seed, RandomState instance, or None (default)
The seed of the pseudo random number generator to use when
shuffling the data.
verbose : integer, optional
The verbosity level.
epsilon : float
Epsilon in the epsilon-insensitive loss functions; only if `loss` is
'huber', 'epsilon_insensitive', or 'squared_epsilon_insensitive'.
For 'huber', determines the threshold at which it becomes less
important to get the prediction exactly right.
For epsilon-insensitive, any differences between the current prediction
and the correct label are ignored if they are less than this threshold.
learning_rate : string, optional
The learning rate:
constant: eta = eta0
optimal: eta = 1.0/(alpha * t)
invscaling: eta = eta0 / pow(t, power_t) [default]
eta0 : double, optional
The initial learning rate [default 0.01].
power_t : double, optional
The exponent for inverse scaling learning rate [default 0.25].
warm_start : bool, optional
When set to True, reuse the solution of the previous call to fit as
initialization, otherwise, just erase the previous solution.
average : bool or int, optional
When set to True, computes the averaged SGD weights and stores the
result in the ``coef_`` attribute. If set to an int greater than 1,
averaging will begin once the total number of samples seen reaches
        average. So ``average=10`` will begin averaging after seeing 10 samples.
Attributes
----------
coef_ : array, shape (n_features,)
Weights assigned to the features.
intercept_ : array, shape (1,)
The intercept term.
average_coef_ : array, shape (n_features,)
Averaged weights assigned to the features.
average_intercept_ : array, shape (1,)
The averaged intercept term.
Examples
--------
>>> import numpy as np
>>> from sklearn import linear_model
>>> n_samples, n_features = 10, 5
>>> np.random.seed(0)
>>> y = np.random.randn(n_samples)
>>> X = np.random.randn(n_samples, n_features)
>>> clf = linear_model.SGDRegressor()
>>> clf.fit(X, y)
... #doctest: +NORMALIZE_WHITESPACE
SGDRegressor(alpha=0.0001, average=False, epsilon=0.1, eta0=0.01,
fit_intercept=True, l1_ratio=0.15, learning_rate='invscaling',
loss='squared_loss', n_iter=5, penalty='l2', power_t=0.25,
random_state=None, shuffle=True, verbose=0, warm_start=False)
See also
--------
Ridge, ElasticNet, Lasso, SVR
"""
def __init__(self, loss="squared_loss", penalty="l2", alpha=0.0001,
l1_ratio=0.15, fit_intercept=True, n_iter=5, shuffle=True,
verbose=0, epsilon=DEFAULT_EPSILON, random_state=None,
learning_rate="invscaling", eta0=0.01, power_t=0.25,
warm_start=False, average=False):
super(SGDRegressor, self).__init__(loss=loss, penalty=penalty,
alpha=alpha, l1_ratio=l1_ratio,
fit_intercept=fit_intercept,
n_iter=n_iter, shuffle=shuffle,
verbose=verbose,
epsilon=epsilon,
random_state=random_state,
learning_rate=learning_rate,
eta0=eta0, power_t=power_t,
warm_start=warm_start,
average=average)
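The ``average`` parameter documented above corresponds to averaged SGD (Polyak-Ruppert averaging): the optimizer keeps updating a 'standard' weight vector while the reported coefficients are a running mean of the iterates. A hedged numpy sketch of that bookkeeping (a simplified reading of ``average=10``, not the Cython code):

```python
import numpy as np

def averaged_sgd_epoch(X, y, w, w_avg, n_seen, eta=0.01, average_after=10):
    # Squared loss, no penalty; averaging starts once `average_after`
    # samples have been seen, then w_avg tracks the mean of the iterates.
    for i in range(X.shape[0]):
        w = w - eta * (X[i] @ w - y[i]) * X[i]
        n_seen += 1
        if n_seen >= average_after:
            k = n_seen - average_after + 1   # how many iterates are averaged
            w_avg = w_avg + (w - w_avg) / k  # incremental running mean
    return w, w_avg, n_seen

rng = np.random.RandomState(1)
X = rng.randn(200, 2)
true_w = np.array([1.5, -0.5])
y = X @ true_w
w, w_avg, n_seen = np.zeros(2), np.zeros(2), 0
for _ in range(100):
    w, w_avg, n_seen = averaged_sgd_epoch(X, y, w, w_avg, n_seen)
```

The averaged vector `w_avg` lags the raw iterate `w` slightly because it still carries the early steps, which is why averaging is usually only a win on noisy objectives.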
| bsd-3-clause |
antonxy/audiosync | tests/chirp_test.py | 1 | 2542 | import analyse_audio
import numpy as np
import matplotlib.pyplot as plt
def chirp_single_test(length, freq0, freq1, noise_factor=0):
chirp = analyse_audio.generate_chirp(freq0, freq1, length, 48000)
zeros = np.zeros(chirp.size)
signal = np.append(np.append(zeros, chirp), zeros)
if noise_factor != 0:
noise = np.random.random(signal.size) * noise_factor
noised_signal = np.add(signal, noise)
else:
noised_signal = signal
expected = chirp.size * 2
result = analyse_audio.find_sync_signal(noised_signal, 48000, analyse_audio.generate_chirp(freq0, freq1, length, 48000))
return result - expected
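``find_sync_signal`` presumably locates the chirp by cross-correlating the recording against the reference chirp and taking the peak; a hedged numpy sketch of that idea (everything here is illustrative -- the real analyse_audio API is not shown in this file):

```python
import numpy as np

def find_offset_by_xcorr(signal, template):
    # Slide the template over the signal and return the lag with the
    # largest correlation -- the presumed start of the sync chirp.
    corr = np.correlate(signal, template, mode='valid')
    return int(np.argmax(corr))

rng = np.random.RandomState(0)
phase = np.linspace(0.0, 8.0, 480) ** 2         # quadratic phase -> chirp
template = np.sin(2.0 * np.pi * phase)
signal = np.concatenate([np.zeros(1000), template, np.zeros(1000)])
signal = signal + 0.3 * rng.randn(signal.size)  # additive noise
offset = find_offset_by_xcorr(signal, template)
```

A chirp works well as a sync marker precisely because its autocorrelation has a single sharp peak, so the argmax stays near the true offset even under noise.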
def chirp_length_test():
n = 20
noise_level = 1.5
f0 = 3000
f1 = 6000
length_list = []
std_dev_list = []
max_dev_list = []
for length in np.logspace(-4, -1, 20):
samples = np.array([])
for i in range(n):
samples = np.append(samples, chirp_single_test(length, f0, f1, noise_level))
length_list.append(length)
std_dev = np.sqrt(np.sum(np.square(samples)) / samples.size)
std_dev_list.append(std_dev)
max_dev_list.append(np.amax(np.abs(samples)))
print('Std dev {} at length {}'.format(std_dev, length))
plt.plot(length_list, std_dev_list, 'bo-')
plt.plot(length_list, max_dev_list, 'ro-')
plt.title('Trials per value: {}, Noise level: {}, f0: {}, f1: {}'.format(n, noise_level, f0, f1))
plt.ylabel('std dev [samples]')
plt.xlabel('length [s]')
plt.show()
def chirp_noise_test():
n = 50
length = 0.1
f0 = 3000
f1 = 6000
noise_level_list = []
std_dev_list = []
max_dev_list = []
for noise_level in np.linspace(0, 50, 20):
samples = np.array([])
for i in range(n):
samples = np.append(samples, chirp_single_test(length, f0, f1, noise_level))
noise_level_list.append(noise_level)
std_dev = np.sqrt(np.sum(np.square(samples)) / samples.size)
std_dev_list.append(std_dev)
max_dev_list.append(np.amax(np.abs(samples)))
print('Std dev {} at noise level {}'.format(std_dev, noise_level))
plt.plot(noise_level_list, std_dev_list, 'bo-')
plt.plot(noise_level_list, max_dev_list, 'ro-')
	plt.title('Trials per value: {}, Length: {}, f0: {}, f1: {}'.format(n, length, f0, f1))
plt.ylabel('std dev [samples]')
plt.xlabel('noise level')
plt.show()
if __name__ == '__main__':
chirp_noise_test() | mit |
opcon/plutokore | scripts/calculate-luminosity.py | 2 | 1986 | #!/bin/env python3
import os
import sys
if os.path.exists(os.path.expanduser('~/plutokore')):
sys.path.append(os.path.expanduser('~/plutokore'))
else:
sys.path.append(os.path.expanduser('~/uni/plutokore'))
import plutokore as pk
import matplotlib as mpl
mpl.use('PS')
import matplotlib.pyplot as plot
import numpy as np
import argparse
from plutokore import radio
from numba import jit
from astropy.convolution import convolve, Gaussian2DKernel
import astropy.units as u
from astropy.cosmology import Planck15 as cosmo
import pathlib
def calculate_total_luminosity(*, sim_dir, out_dir, output_number):
redshift=0.1
beamsize=5 * u.arcsec
pixel_size = 1.8 * u.arcsec
vmin = -3.0
vmax = 2.0
# load sim config
uv, env, jet = pk.configuration.load_simulation_info(sim_dir + 'config.yaml')
# create our figure
fig, ax = plot.subplots(figsize=(2, 2))
# calculate beam radius
sigma_beam = (beamsize / 2.355)
# calculate kpc per arcsec
kpc_per_arcsec = cosmo.kpc_proper_per_arcmin(redshift).to(u.kpc / u.arcsec)
# load timestep data file
d = pk.simulations.load_timestep_data(output_number, sim_dir)
# calculate luminosity and unraytraced flux
l = radio.get_luminosity(d, uv, redshift, beamsize)
return l.sum()
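The 2.355 divisor used for `sigma_beam` above is the standard FWHM-to-sigma conversion for a Gaussian beam, FWHM = 2*sqrt(2*ln 2)*sigma ≈ 2.3548*sigma; a quick standalone check:

```python
import math

# FWHM = 2 * sqrt(2 * ln 2) * sigma for a Gaussian profile
fwhm_to_sigma = 1.0 / (2.0 * math.sqrt(2.0 * math.log(2.0)))
sigma = 5.0 * fwhm_to_sigma   # a 5 arcsec beam, as in the script above
```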
def main():
parser = argparse.ArgumentParser()
parser.add_argument('simulation_directory', help='Simulation directory', type=str)
parser.add_argument('output_directory', help='Output directory', type=str)
parser.add_argument('output', help='Output number', type=int, nargs = '+')
args = parser.parse_args()
# create output directory if needed
pathlib.Path(args.output_directory).mkdir(parents = True, exist_ok = True)
for i in args.output:
tot = calculate_total_luminosity(sim_dir = args.simulation_directory,
out_dir = args.output_directory,
output_number = i)
print(tot)
if __name__ == '__main__':
main()
| mit |
shikhardb/scikit-learn | examples/exercises/plot_cv_diabetes.py | 231 | 2527 | """
===============================================
Cross-validation on diabetes Dataset Exercise
===============================================
A tutorial exercise which uses cross-validation with linear models.
This exercise is used in the :ref:`cv_estimators_tut` part of the
:ref:`model_selection_tut` section of the :ref:`stat_learn_tut_index`.
"""
from __future__ import print_function
print(__doc__)
import numpy as np
import matplotlib.pyplot as plt
from sklearn import cross_validation, datasets, linear_model
diabetes = datasets.load_diabetes()
X = diabetes.data[:150]
y = diabetes.target[:150]
lasso = linear_model.Lasso()
alphas = np.logspace(-4, -.5, 30)
scores = list()
scores_std = list()
for alpha in alphas:
lasso.alpha = alpha
this_scores = cross_validation.cross_val_score(lasso, X, y, n_jobs=1)
scores.append(np.mean(this_scores))
scores_std.append(np.std(this_scores))
plt.figure(figsize=(4, 3))
plt.semilogx(alphas, scores)
# plot error lines showing +/- std. errors of the scores
plt.semilogx(alphas, np.array(scores) + np.array(scores_std) / np.sqrt(len(X)),
'b--')
plt.semilogx(alphas, np.array(scores) - np.array(scores_std) / np.sqrt(len(X)),
'b--')
plt.ylabel('CV score')
plt.xlabel('alpha')
plt.axhline(np.max(scores), linestyle='--', color='.5')
##############################################################################
# Bonus: how much can you trust the selection of alpha?
# To answer this question we use the LassoCV object that sets its alpha
# parameter automatically from the data by internal cross-validation (i.e. it
# performs cross-validation on the training data it receives).
# We use external cross-validation to see how much the automatically obtained
# alphas differ across different cross-validation folds.
lasso_cv = linear_model.LassoCV(alphas=alphas)
k_fold = cross_validation.KFold(len(X), 3)
print("Answer to the bonus question:",
"how much can you trust the selection of alpha?")
print()
print("Alpha parameters maximising the generalization score on different")
print("subsets of the data:")
for k, (train, test) in enumerate(k_fold):
lasso_cv.fit(X[train], y[train])
print("[fold {0}] alpha: {1:.5f}, score: {2:.5f}".
format(k, lasso_cv.alpha_, lasso_cv.score(X[test], y[test])))
print()
print("Answer: Not very much since we obtained different alphas for different")
print("subsets of the data and moreover, the scores for these alphas differ")
print("quite substantially.")
plt.show()
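The "how much can you trust the selected hyperparameter" check above can be sketched without sklearn: pick the regularization strength by cross-validation and see whether the choice beats a clearly over-regularized alternative. An illustrative ridge (closed-form) version, so no iterative solver is needed:

```python
import numpy as np

def ridge_fit(X, y, alpha):
    # Closed-form ridge solution: (X^T X + alpha I)^-1 X^T y
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

def cv_mse(X, y, alpha, n_folds=3):
    # Plain k-fold cross-validated mean squared error for one alpha.
    folds = np.array_split(np.arange(len(y)), n_folds)
    errs = []
    for test_idx in folds:
        train_idx = np.setdiff1d(np.arange(len(y)), test_idx)
        w = ridge_fit(X[train_idx], y[train_idx], alpha)
        errs.append(np.mean((X[test_idx] @ w - y[test_idx]) ** 2))
    return float(np.mean(errs))

rng = np.random.RandomState(0)
X = rng.randn(120, 5)
y = X @ rng.randn(5) + 0.1 * rng.randn(120)
alphas = np.logspace(-4, 2, 10)
best_alpha = min(alphas, key=lambda a: cv_mse(X, y, a))
```

Repeating this selection on different outer folds and comparing the chosen alphas is exactly the stability check the bonus section performs with `LassoCV`.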
| bsd-3-clause |
detrout/debian-statsmodels | tools/backport_pr.py | 30 | 5263 | #!/usr/bin/env python
"""
Backport pull requests to a particular branch.
Usage: backport_pr.py branch [PR]
e.g.:
python tools/backport_pr.py 0.13.1 123
to backport PR #123 onto branch 0.13.1
or
python tools/backport_pr.py 1.x
to see what PRs are marked for backport that have yet to be applied.
Copied from IPython 9e82bc5
https://github.com/ipython/ipython/blob/master/tools/backport_pr.py
"""
from __future__ import print_function
import os
import re
import sys
from subprocess import Popen, PIPE, check_call, check_output
from urllib import urlopen
from gh_api import (
get_issues_list,
get_pull_request,
get_pull_request_files,
is_pull_request,
get_milestone_id,
)
from pandas import Series
def find_rejects(root='.'):
for dirname, dirs, files in os.walk(root):
for fname in files:
if fname.endswith('.rej'):
yield os.path.join(dirname, fname)
def get_current_branch():
branches = check_output(['git', 'branch'])
for branch in branches.splitlines():
if branch.startswith('*'):
return branch[1:].strip()
def backport_pr(branch, num, project='statsmodels/statsmodels'):
current_branch = get_current_branch()
if branch != current_branch:
check_call(['git', 'checkout', branch])
check_call(['git', 'pull'])
pr = get_pull_request(project, num, auth=True)
files = get_pull_request_files(project, num, auth=True)
patch_url = pr['patch_url']
title = pr['title']
description = pr['body']
fname = "PR%i.patch" % num
if os.path.exists(fname):
print("using patch from {fname}".format(**locals()))
with open(fname) as f:
patch = f.read()
else:
req = urlopen(patch_url)
patch = req.read()
msg = "Backport PR #%i: %s" % (num, title) + '\n\n' + description
check = Popen(['git', 'apply', '--check', '--verbose'], stdin=PIPE)
a,b = check.communicate(patch)
if check.returncode:
print("patch did not apply, saving to {fname}".format(**locals()))
print("edit {fname} until `cat {fname} | git apply --check` succeeds".format(**locals()))
print("then run tools/backport_pr.py {num} again".format(**locals()))
if not os.path.exists(fname):
with open(fname, 'wb') as f:
f.write(patch)
return 1
p = Popen(['git', 'apply'], stdin=PIPE)
a,b = p.communicate(patch)
filenames = [ f['filename'] for f in files ]
check_call(['git', 'add'] + filenames)
check_call(['git', 'commit', '-m', msg])
print("PR #%i applied, with msg:" % num)
print()
print(msg)
print()
if branch != current_branch:
check_call(['git', 'checkout', current_branch])
return 0
backport_re = re.compile(r"[Bb]ackport.*?(\d+)")
def already_backported(branch, since_tag=None):
"""return set of PRs that have been backported already"""
if since_tag is None:
since_tag = check_output(['git','describe', branch, '--abbrev=0']).decode('utf8').strip()
cmd = ['git', 'log', '%s..%s' % (since_tag, branch), '--oneline']
lines = check_output(cmd).decode('utf8')
return set(int(num) for num in backport_re.findall(lines))
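`already_backported` scans one-line commit messages for "Backport PR #123"-style markers with `backport_re`; a quick check of how that regex behaves (the sample log lines are made up):

```python
import re

backport_re = re.compile(r"[Bb]ackport.*?(\d+)")

# Made-up sample of `git log --oneline` output; only the first and third
# lines carry a backport marker.
log = "\n".join([
    "abc1234 Backport PR #455: fix docstring",
    "def5678 Merge pull request #460 from someone/branch",
    "9abcdef backport of 461 onto 0.5.x",
])
found = set(int(num) for num in backport_re.findall(log))
```

Since `.` does not cross newlines by default, each marker is matched within its own log line, and the lazy `.*?` makes the group grab the first run of digits after the word "backport".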
def should_backport(labels=None, milestone=None):
"""return set of PRs marked for backport"""
if labels is None and milestone is None:
raise ValueError("Specify one of labels or milestone.")
elif labels is not None and milestone is not None:
raise ValueError("Specify only one of labels or milestone.")
if labels is not None:
issues = get_issues_list("statsmodels/statsmodels",
labels=labels,
state='closed',
auth=True,
)
else:
milestone_id = get_milestone_id("statsmodels/statsmodels", milestone,
auth=True)
issues = get_issues_list("statsmodels/statsmodels",
milestone=milestone_id,
state='closed',
auth=True,
)
should_backport = []
merged_dates = []
for issue in issues:
if not is_pull_request(issue):
continue
pr = get_pull_request("statsmodels/statsmodels", issue['number'],
auth=True)
if not pr['merged']:
print ("Marked PR closed without merge: %i" % pr['number'])
continue
if pr['number'] not in should_backport:
merged_dates.append(pr['merged_at'])
should_backport.append(pr['number'])
return Series(merged_dates, index=should_backport)
if __name__ == '__main__':
if len(sys.argv) < 2:
print(__doc__)
sys.exit(1)
if len(sys.argv) < 3:
branch = sys.argv[1]
already = already_backported(branch)
#NOTE: change this to the label you've used for marking a backport
should = should_backport(milestone="0.5.1")
print ("The following PRs should be backported:")
to_backport = []
if already:
should = should.ix[set(should.index).difference(already)]
should.sort()
for pr, date in should.iteritems():
print (pr)
sys.exit(0)
sys.exit(backport_pr(sys.argv[1], int(sys.argv[2])))
| bsd-3-clause |
anntzer/scikit-learn | examples/calibration/plot_calibration.py | 15 | 4977 | """
======================================
Probability calibration of classifiers
======================================
When performing classification you often want to predict not only
the class label, but also the associated probability. This probability
gives you some kind of confidence on the prediction. However, not all
classifiers provide well-calibrated probabilities, some being over-confident
while others being under-confident. Thus, a separate calibration of predicted
probabilities is often desirable as a postprocessing. This example illustrates
two different methods for this calibration and evaluates the quality of the
returned probabilities using Brier's score
(see https://en.wikipedia.org/wiki/Brier_score).
Compared are the estimated probability using a Gaussian naive Bayes classifier
without calibration, with a sigmoid calibration, and with a non-parametric
isotonic calibration. One can observe that only the non-parametric model is
able to provide a probability calibration that returns probabilities close
to the expected 0.5 for most of the samples belonging to the middle
cluster with heterogeneous labels. This results in a significantly improved
Brier score.
"""
print(__doc__)
# Author: Mathieu Blondel <mathieu@mblondel.org>
# Alexandre Gramfort <alexandre.gramfort@telecom-paristech.fr>
# Balazs Kegl <balazs.kegl@gmail.com>
# Jan Hendrik Metzen <jhm@informatik.uni-bremen.de>
# License: BSD Style.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import cm
from sklearn.datasets import make_blobs
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import brier_score_loss
from sklearn.calibration import CalibratedClassifierCV
from sklearn.model_selection import train_test_split
n_samples = 50000
n_bins = 3 # use 3 bins for calibration_curve as we have 3 clusters here
# Generate 3 blobs with 2 classes where the second blob contains
# half positive samples and half negative samples. Probability in this
# blob is therefore 0.5.
centers = [(-5, -5), (0, 0), (5, 5)]
X, y = make_blobs(n_samples=n_samples, centers=centers, shuffle=False,
random_state=42)
y[:n_samples // 2] = 0
y[n_samples // 2:] = 1
sample_weight = np.random.RandomState(42).rand(y.shape[0])
# split train, test for calibration
X_train, X_test, y_train, y_test, sw_train, sw_test = \
train_test_split(X, y, sample_weight, test_size=0.9, random_state=42)
# Gaussian Naive-Bayes with no calibration
clf = GaussianNB()
clf.fit(X_train, y_train) # GaussianNB itself does not support sample-weights
prob_pos_clf = clf.predict_proba(X_test)[:, 1]
# Gaussian Naive-Bayes with isotonic calibration
clf_isotonic = CalibratedClassifierCV(clf, cv=2, method='isotonic')
clf_isotonic.fit(X_train, y_train, sample_weight=sw_train)
prob_pos_isotonic = clf_isotonic.predict_proba(X_test)[:, 1]
# Gaussian Naive-Bayes with sigmoid calibration
clf_sigmoid = CalibratedClassifierCV(clf, cv=2, method='sigmoid')
clf_sigmoid.fit(X_train, y_train, sample_weight=sw_train)
prob_pos_sigmoid = clf_sigmoid.predict_proba(X_test)[:, 1]
print("Brier score losses: (the smaller the better)")
clf_score = brier_score_loss(y_test, prob_pos_clf, sample_weight=sw_test)
print("No calibration: %1.3f" % clf_score)
clf_isotonic_score = brier_score_loss(y_test, prob_pos_isotonic,
sample_weight=sw_test)
print("With isotonic calibration: %1.3f" % clf_isotonic_score)
clf_sigmoid_score = brier_score_loss(y_test, prob_pos_sigmoid,
sample_weight=sw_test)
print("With sigmoid calibration: %1.3f" % clf_sigmoid_score)
# #############################################################################
# Plot the data and the predicted probabilities
plt.figure()
y_unique = np.unique(y)
colors = cm.rainbow(np.linspace(0.0, 1.0, y_unique.size))
for this_y, color in zip(y_unique, colors):
this_X = X_train[y_train == this_y]
this_sw = sw_train[y_train == this_y]
plt.scatter(this_X[:, 0], this_X[:, 1], s=this_sw * 50,
c=color[np.newaxis, :],
alpha=0.5, edgecolor='k',
label="Class %s" % this_y)
plt.legend(loc="best")
plt.title("Data")
plt.figure()
order = np.lexsort((prob_pos_clf, ))
plt.plot(prob_pos_clf[order], 'r', label='No calibration (%1.3f)' % clf_score)
plt.plot(prob_pos_isotonic[order], 'g', linewidth=3,
label='Isotonic calibration (%1.3f)' % clf_isotonic_score)
plt.plot(prob_pos_sigmoid[order], 'b', linewidth=3,
label='Sigmoid calibration (%1.3f)' % clf_sigmoid_score)
plt.plot(np.linspace(0, y_test.size, 51)[1::2],
y_test[order].reshape(25, -1).mean(1),
'k', linewidth=3, label=r'Empirical')
plt.ylim([-0.05, 1.05])
plt.xlabel("Instances sorted according to predicted probability "
"(uncalibrated GNB)")
plt.ylabel("P(y=1)")
plt.legend(loc="upper left")
plt.title("Gaussian naive Bayes probabilities")
plt.show()
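The Brier score used throughout this example is just the mean squared difference between the predicted positive-class probability and the 0/1 outcome; a quick hand computation (an illustrative helper, equivalent in the unweighted case to ``brier_score_loss``):

```python
import numpy as np

def brier_score(y_true, prob_pos):
    # Mean squared error of the predicted probability; 0 is perfect,
    # and a constant 0.5 prediction scores 0.25 on balanced labels.
    y_true = np.asarray(y_true, dtype=float)
    prob_pos = np.asarray(prob_pos, dtype=float)
    return float(np.mean((prob_pos - y_true) ** 2))

score = brier_score([0, 1, 1, 0], [0.1, 0.9, 0.8, 0.3])
```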
| bsd-3-clause |
salbrandi/patella | patella/click_commands.py | 1 | 3221 | # -*- coding: utf-8 -*-
"""
Controls command line operations
The only particularly relevant command now is: patella startup <path>
not all commands retain functionality - this will be updated eventually (read: it might not be)
"""
# \/ Third-Party Packages \/
import os
import os.path
import click
import pandas as pd
# \/ Local Packages \/
from . import htmlparser as htmlparser
from . import patellaserver as flaskapp
class filec:
pass
file1 = filec()
file2 = filec()
file1.df = file2.df = pd.DataFrame({'foo': []})
file1.path = file2.path = ''
file1.name = file2.name = ''
@click.group()
def patella():
pass
@click.command()
@click.argument('url')
@click.option('--filename', default='datafile', help='specify the name of the local file that will be downloaded to the current directory')
@click.option('--filetype', default='.csv', help='specify the file type the scraper will look for')
def scrape_url(url, filetype, filename):
parseobj = htmlparser.find_download_links(url, filetype, filename, download=True)
    if parseobj is not None:
        click.echo('ERROR: ' + parseobj['error'])  # Error reporting
@click.command()
@click.argument('file_one')
@click.option('--delimiters', default=',:,', help='Specify file type delimiters in format <DELIM>:<DELIM2>')
def load_data(file_one, delimiters):
file1.path = os.getcwd() + '/' + file_one
if os.path.exists(file1.path):
file1.name = file_one
list_delims = delimiters.split(':')
if len(list_delims) == 2:
file1.df = pd.read_table(file1.path, list_delims[0], header=0)
file2.df = htmlparser.get_fe()
os.environ['LOCAL_FILE_PATH'] = file1.path
click.echo('file successfully loaded into Dataframes')
else:
if not os.path.exists(file1.path):
click.echo('no files found with the name ' + file_one + ' in path ' + file1.path)
@click.command()
@click.argument('column')
@click.argument('filename')
def change_index(filename, column):
if filename == file1:
file1.df.set_index(column)
else:
click.echo('no file found with that name')
@click.command()
@click.argument('column_names')
@click.argument('file')
def change_names(file, column_names):
pass
@click.command()
@click.argument('path')
def startserver(path):
flaskapp.startserver(path)
@click.command()
@click.argument('file')
@click.argument('col')
@click.option('--title', default=' ', help='specify the plot title')
@click.option('--x_title', default=' ', help='specify the X axis title')
@click.option('--y_title', default=' ', help='specify the Y axis title')
def plot(file, col, title, x_title, y_title):
file1.path = os.getcwd() + '/data/' + file
file1.df = pd.read_table(file1.path, ',', header=0)
htmlparser.compare(file1.df, htmlparser.get_fe(), col, title, x_title, y_title)
# A test cli command
@click.command()
@click.argument('foo')
def testme(foo):
pass
# add all the subcommands to the patella group
patella.add_command(scrape_url, name='scrape')
patella.add_command(testme, name='test')
patella.add_command(plot)
patella.add_command(load_data, name='load')
patella.add_command(startserver, name='startup') | mit |
AlexRobson/scikit-learn | sklearn/linear_model/tests/test_omp.py | 272 | 7752 | # Author: Vlad Niculae
# Licence: BSD 3 clause
import numpy as np
from sklearn.utils.testing import assert_raises
from sklearn.utils.testing import assert_true
from sklearn.utils.testing import assert_equal
from sklearn.utils.testing import assert_array_equal
from sklearn.utils.testing import assert_array_almost_equal
from sklearn.utils.testing import assert_warns
from sklearn.utils.testing import ignore_warnings
from sklearn.linear_model import (orthogonal_mp, orthogonal_mp_gram,
OrthogonalMatchingPursuit,
OrthogonalMatchingPursuitCV,
LinearRegression)
from sklearn.utils import check_random_state
from sklearn.datasets import make_sparse_coded_signal
n_samples, n_features, n_nonzero_coefs, n_targets = 20, 30, 5, 3
y, X, gamma = make_sparse_coded_signal(n_targets, n_features, n_samples,
n_nonzero_coefs, random_state=0)
G, Xy = np.dot(X.T, X), np.dot(X.T, y)
# this makes X (n_samples, n_features)
# and y (n_samples, 3)
def test_correct_shapes():
assert_equal(orthogonal_mp(X, y[:, 0], n_nonzero_coefs=5).shape,
(n_features,))
assert_equal(orthogonal_mp(X, y, n_nonzero_coefs=5).shape,
(n_features, 3))
def test_correct_shapes_gram():
assert_equal(orthogonal_mp_gram(G, Xy[:, 0], n_nonzero_coefs=5).shape,
(n_features,))
assert_equal(orthogonal_mp_gram(G, Xy, n_nonzero_coefs=5).shape,
(n_features, 3))
def test_n_nonzero_coefs():
assert_true(np.count_nonzero(orthogonal_mp(X, y[:, 0],
n_nonzero_coefs=5)) <= 5)
assert_true(np.count_nonzero(orthogonal_mp(X, y[:, 0], n_nonzero_coefs=5,
precompute=True)) <= 5)
def test_tol():
tol = 0.5
gamma = orthogonal_mp(X, y[:, 0], tol=tol)
gamma_gram = orthogonal_mp(X, y[:, 0], tol=tol, precompute=True)
assert_true(np.sum((y[:, 0] - np.dot(X, gamma)) ** 2) <= tol)
assert_true(np.sum((y[:, 0] - np.dot(X, gamma_gram)) ** 2) <= tol)
def test_with_without_gram():
assert_array_almost_equal(
orthogonal_mp(X, y, n_nonzero_coefs=5),
orthogonal_mp(X, y, n_nonzero_coefs=5, precompute=True))
def test_with_without_gram_tol():
assert_array_almost_equal(
orthogonal_mp(X, y, tol=1.),
orthogonal_mp(X, y, tol=1., precompute=True))
def test_unreachable_accuracy():
assert_array_almost_equal(
orthogonal_mp(X, y, tol=0),
orthogonal_mp(X, y, n_nonzero_coefs=n_features))
assert_array_almost_equal(
assert_warns(RuntimeWarning, orthogonal_mp, X, y, tol=0,
precompute=True),
orthogonal_mp(X, y, precompute=True,
n_nonzero_coefs=n_features))
def test_bad_input():
assert_raises(ValueError, orthogonal_mp, X, y, tol=-1)
assert_raises(ValueError, orthogonal_mp, X, y, n_nonzero_coefs=-1)
assert_raises(ValueError, orthogonal_mp, X, y,
n_nonzero_coefs=n_features + 1)
assert_raises(ValueError, orthogonal_mp_gram, G, Xy, tol=-1)
assert_raises(ValueError, orthogonal_mp_gram, G, Xy, n_nonzero_coefs=-1)
assert_raises(ValueError, orthogonal_mp_gram, G, Xy,
n_nonzero_coefs=n_features + 1)
def test_perfect_signal_recovery():
idx, = gamma[:, 0].nonzero()
gamma_rec = orthogonal_mp(X, y[:, 0], 5)
gamma_gram = orthogonal_mp_gram(G, Xy[:, 0], 5)
assert_array_equal(idx, np.flatnonzero(gamma_rec))
assert_array_equal(idx, np.flatnonzero(gamma_gram))
assert_array_almost_equal(gamma[:, 0], gamma_rec, decimal=2)
assert_array_almost_equal(gamma[:, 0], gamma_gram, decimal=2)
def test_estimator():
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero_coefs)
omp.fit(X, y[:, 0])
assert_equal(omp.coef_.shape, (n_features,))
assert_equal(omp.intercept_.shape, ())
assert_true(np.count_nonzero(omp.coef_) <= n_nonzero_coefs)
omp.fit(X, y)
assert_equal(omp.coef_.shape, (n_targets, n_features))
assert_equal(omp.intercept_.shape, (n_targets,))
assert_true(np.count_nonzero(omp.coef_) <= n_targets * n_nonzero_coefs)
omp.set_params(fit_intercept=False, normalize=False)
omp.fit(X, y[:, 0])
assert_equal(omp.coef_.shape, (n_features,))
assert_equal(omp.intercept_, 0)
assert_true(np.count_nonzero(omp.coef_) <= n_nonzero_coefs)
omp.fit(X, y)
assert_equal(omp.coef_.shape, (n_targets, n_features))
assert_equal(omp.intercept_, 0)
assert_true(np.count_nonzero(omp.coef_) <= n_targets * n_nonzero_coefs)
def test_identical_regressors():
newX = X.copy()
newX[:, 1] = newX[:, 0]
gamma = np.zeros(n_features)
gamma[0] = gamma[1] = 1.
newy = np.dot(newX, gamma)
assert_warns(RuntimeWarning, orthogonal_mp, newX, newy, 2)
def test_swapped_regressors():
gamma = np.zeros(n_features)
# X[:, 21] should be selected first, then X[:, 0] selected second,
# which will take X[:, 21]'s place in case the algorithm does
# column swapping for optimization (which is the case at the moment)
gamma[21] = 1.0
gamma[0] = 0.5
new_y = np.dot(X, gamma)
new_Xy = np.dot(X.T, new_y)
gamma_hat = orthogonal_mp(X, new_y, 2)
gamma_hat_gram = orthogonal_mp_gram(G, new_Xy, 2)
assert_array_equal(np.flatnonzero(gamma_hat), [0, 21])
assert_array_equal(np.flatnonzero(gamma_hat_gram), [0, 21])
def test_no_atoms():
y_empty = np.zeros_like(y)
Xy_empty = np.dot(X.T, y_empty)
gamma_empty = ignore_warnings(orthogonal_mp)(X, y_empty, 1)
gamma_empty_gram = ignore_warnings(orthogonal_mp_gram)(G, Xy_empty, 1)
assert_equal(np.all(gamma_empty == 0), True)
assert_equal(np.all(gamma_empty_gram == 0), True)
def test_omp_path():
path = orthogonal_mp(X, y, n_nonzero_coefs=5, return_path=True)
last = orthogonal_mp(X, y, n_nonzero_coefs=5, return_path=False)
assert_equal(path.shape, (n_features, n_targets, 5))
assert_array_almost_equal(path[:, :, -1], last)
path = orthogonal_mp_gram(G, Xy, n_nonzero_coefs=5, return_path=True)
last = orthogonal_mp_gram(G, Xy, n_nonzero_coefs=5, return_path=False)
assert_equal(path.shape, (n_features, n_targets, 5))
assert_array_almost_equal(path[:, :, -1], last)
def test_omp_return_path_prop_with_gram():
path = orthogonal_mp(X, y, n_nonzero_coefs=5, return_path=True,
precompute=True)
last = orthogonal_mp(X, y, n_nonzero_coefs=5, return_path=False,
precompute=True)
assert_equal(path.shape, (n_features, n_targets, 5))
assert_array_almost_equal(path[:, :, -1], last)
def test_omp_cv():
y_ = y[:, 0]
gamma_ = gamma[:, 0]
ompcv = OrthogonalMatchingPursuitCV(normalize=True, fit_intercept=False,
max_iter=10, cv=5)
ompcv.fit(X, y_)
assert_equal(ompcv.n_nonzero_coefs_, n_nonzero_coefs)
assert_array_almost_equal(ompcv.coef_, gamma_)
omp = OrthogonalMatchingPursuit(normalize=True, fit_intercept=False,
n_nonzero_coefs=ompcv.n_nonzero_coefs_)
omp.fit(X, y_)
assert_array_almost_equal(ompcv.coef_, omp.coef_)
def test_omp_reaches_least_squares():
# Use small simple data; it's a sanity check but OMP can stop early
rng = check_random_state(0)
n_samples, n_features = (10, 8)
n_targets = 3
X = rng.randn(n_samples, n_features)
Y = rng.randn(n_samples, n_targets)
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_features)
lstsq = LinearRegression()
omp.fit(X, Y)
lstsq.fit(X, Y)
assert_array_almost_equal(omp.coef_, lstsq.coef_)
| bsd-3-clause |
fraka6/trading-with-python | lib/functions.py | 76 | 11627 | # -*- coding: utf-8 -*-
"""
twp support functions
@author: Jev Kuznetsov
Licence: GPL v2
"""
from scipy import polyfit, polyval
import datetime as dt
#from datetime import datetime, date
from pandas import DataFrame, Index, Series
import csv
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
def nans(shape, dtype=float):
''' create a nan numpy array '''
a = np.empty(shape, dtype)
a.fill(np.nan)
return a
def plotCorrelationMatrix(price, thresh = None):
''' plot a correlation matrix as a heatmap image
inputs:
price: prices DataFrame
thresh: correlation threshold to use for checking, default None
'''
symbols = price.columns.tolist()
R = price.pct_change()
correlationMatrix = R.corr()
if thresh is not None:
correlationMatrix = correlationMatrix > thresh
plt.imshow(abs(correlationMatrix.values),interpolation='none')
plt.xticks(range(len(symbols)),symbols)
plt.yticks(range(len(symbols)),symbols)
plt.colorbar()
plt.title('Correlation matrix')
return correlationMatrix
def pca(A):
""" performs principal components analysis
(PCA) on the n-by-p DataFrame A
Rows of A correspond to observations, columns to variables.
Returns :
coeff : principal components, column-wise
transform: A in principal component space
latent : eigenvalues
"""
# computing eigenvalues and eigenvectors of covariance matrix
M = (A - A.mean()).T # subtract the mean (along columns)
[latent,coeff] = np.linalg.eig(np.cov(M)) # attention:not always sorted
idx = np.argsort(latent) # sort eigenvalues
idx = idx[::-1] # in descending order
coeff = coeff[:,idx]
latent = latent[idx]
score = np.dot(coeff.T,A.T) # projection of the data in the new space
transform = DataFrame(index = A.index, data = score.T)
return coeff,transform,latent
def pos2pnl(price,position , ibTransactionCost=False ):
"""
calculate pnl based on price and position
Inputs:
---------
price: series or dataframe of price
position: number of shares at each time. Column names must be same as in price
ibTransactionCost: use bundled Interactive Brokers transaction cost of 0.005$/share
Returns a portfolio DataFrame
"""
delta=position.diff()
port = DataFrame(index=price.index)
if isinstance(price,Series): # no need to sum along 1 for series
port['cash'] = (-delta*price).cumsum()
port['stock'] = (position*price)
else: # dealing with DataFrame here
port['cash'] = (-delta*price).sum(axis=1).cumsum()
port['stock'] = (position*price).sum(axis=1)
if ibTransactionCost:
tc = -0.005*position.diff().abs() # basic transaction cost
tc[(tc>-1) & (tc<0)] = -1 # everything under 1$ will be ceil'd to 1$
if isinstance(price,DataFrame):
tc = tc.sum(axis=1)
port['tc'] = tc.cumsum()
else:
port['tc'] = 0.
port['total'] = port['stock']+port['cash']+port['tc']
return port
def tradeBracket(price,entryBar,maxTradeLength,bracket):
'''
trade a symmetrical bracket on price series, return price delta and exit bar #
Input
------
price : series of price values
entryBar: entry bar number
maxTradeLength : max trade duration in bars
bracket : allowed price deviation
'''
lastBar = min(entryBar+maxTradeLength,len(price)-1)
p = price[entryBar:lastBar]-price[entryBar]
idxOutOfBound = np.nonzero(abs(p)>bracket) # find indices where price comes out of bracket
if idxOutOfBound[0].any(): # found match
priceDelta = p[idxOutOfBound[0][0]]
exitBar = idxOutOfBound[0][0]+entryBar
else: # all in bracket, exiting based on time
priceDelta = p[-1]
exitBar = lastBar
return priceDelta, exitBar
def estimateBeta(priceY,priceX,algo = 'standard'):
'''
estimate stock Y vs stock X beta using iterative linear
regression. Outliers outside 3 sigma boundary are filtered out
Parameters
--------
priceX : price series of x (usually market)
priceY : price series of y (estimate beta of this price)
Returns
--------
beta : stockY beta relative to stock X
'''
X = DataFrame({'x':priceX,'y':priceY})
if algo=='returns':
ret = (X/X.shift(1)-1).dropna().values
#print len(ret)
x = ret[:,0]
y = ret[:,1]
# filter high values
low = np.percentile(x,20)
high = np.percentile(x,80)
iValid = (x>low) & (x<high)
x = x[iValid]
y = y[iValid]
iteration = 1
nrOutliers = 1
while iteration < 10 and nrOutliers > 0 :
(a,b) = polyfit(x,y,1)
yf = polyval([a,b],x)
#plot(x,y,'x',x,yf,'r-')
err = yf-y
idxOutlier = abs(err) > 3*np.std(err)
nrOutliers =sum(idxOutlier)
beta = a
#print 'Iteration: %i beta: %.2f outliers: %i' % (iteration,beta, nrOutliers)
x = x[~idxOutlier]
y = y[~idxOutlier]
iteration += 1
elif algo=='log':
x = np.log(X['x'])
y = np.log(X['y'])
(a,b) = polyfit(x,y,1)
beta = a
elif algo=='standard':
ret =np.log(X).diff().dropna()
beta = ret['x'].cov(ret['y'])/ret['x'].var()
else:
raise TypeError("unknown algorithm type, use 'standard', 'log' or 'returns'")
return beta
def estimateVolatility(ohlc, N=10, algo='YangZhang'):
"""
Volatility estimation
Possible algorithms: ['YangZhang', 'CC']
"""
cc = np.log(ohlc.close/ohlc.close.shift(1))
if algo == 'YangZhang': # Yang-zhang volatility
ho = np.log(ohlc.high/ohlc.open)
lo = np.log(ohlc.low/ohlc.open)
co = np.log(ohlc.close/ohlc.open)
oc = np.log(ohlc.open/ohlc.close.shift(1))
oc_sq = oc**2
cc_sq = cc**2
rs = ho*(ho-co)+lo*(lo-co)
close_vol = pd.rolling_sum(cc_sq, window=N) * (1.0 / (N - 1.0))
open_vol = pd.rolling_sum(oc_sq, window=N) * (1.0 / (N - 1.0))
window_rs = pd.rolling_sum(rs, window=N) * (1.0 / (N - 1.0))
result = (open_vol + 0.164333 * close_vol + 0.835667 * window_rs).apply(np.sqrt) * np.sqrt(252)
result[:N-1] = np.nan
elif algo == 'CC': # standard close-close estimator
result = np.sqrt(252)*np.sqrt(((pd.rolling_sum(cc**2,N))/N))
else:
raise ValueError('Unknown algo type.')
return result*100
def rank(current,past):
''' calculate a relative rank 0..1 for a value against series '''
return (current>past).sum()/float(past.count())
def returns(df):
return (df/df.shift(1)-1)
def logReturns(df):
t = np.log(df)
return t-t.shift(1)
def dateTimeToDate(idx):
''' convert datetime index to date '''
dates = []
for dtm in idx:
dates.append(dtm.date())
return dates
def readBiggerScreener(fName):
''' import data from Bigger Capital screener '''
with open(fName,'rb') as f:
reader = csv.reader(f)
rows = [row for row in reader]
header = rows[0]
data = [[] for i in range(len(header))]
for row in rows[1:]:
for i,elm in enumerate(row):
try:
data[i].append(float(elm))
except Exception:
data[i].append(str(elm))
return DataFrame(dict(zip(header,data)),index=Index(range(len(data[0]))))[header]
def sharpe(pnl):
return np.sqrt(250)*pnl.mean()/pnl.std()
def drawdown(s):
"""
calculate max drawdown and duration
Input:
s, price or cumulative pnl curve $
Returns:
drawdown : vector of drawdwon values
duration : vector of drawdown duration
"""
# convert to array if got pandas series, 10x speedup
if isinstance(s,pd.Series):
idx = s.index
s = s.values
returnSeries = True
else:
returnSeries = False
if s.min() < 0: # offset if signal minimum is less than zero
s = s-s.min()
highwatermark = np.zeros(len(s))
drawdown = np.zeros(len(s))
drawdowndur = np.zeros(len(s))
for t in range(1,len(s)):
highwatermark[t] = max(highwatermark[t-1], s[t])
drawdown[t] = (highwatermark[t]-s[t])
drawdowndur[t]= (0 if drawdown[t] == 0 else drawdowndur[t-1]+1)
if returnSeries:
return pd.Series(index=idx,data=drawdown), pd.Series(index=idx,data=drawdowndur)
else:
return drawdown , drawdowndur
def profitRatio(pnl):
'''
calculate profit ratio as sum(pnl)/drawdown
Input: pnl - daily pnl, Series or DataFrame
'''
def processVector(pnl): # process a single column
s = pnl.fillna(0)
dd = drawdown(s)[0]
p = s.sum()/dd.max()
return p
if isinstance(pnl,Series):
return processVector(pnl)
elif isinstance(pnl,DataFrame):
p = Series(index = pnl.columns)
for col in pnl.columns:
p[col] = processVector(pnl[col])
return p
else:
raise TypeError("Input must be DataFrame or Series, not "+str(type(pnl)))
def candlestick(df,width=0.5, colorup='b', colordown='r'):
''' plot a candlestick chart of a dataframe '''
O = df['open'].values
H = df['high'].values
L = df['low'].values
C = df['close'].values
fig = plt.gcf()
ax = plt.axes()
#ax.hold(True)
X = df.index
#plot high and low
ax.bar(X,height=H-L,bottom=L,width=0.1,color='k')
idxUp = C>O
ax.bar(X[idxUp],height=(C-O)[idxUp],bottom=O[idxUp],width=width,color=colorup)
idxDown = C<=O
ax.bar(X[idxDown],height=(O-C)[idxDown],bottom=C[idxDown],width=width,color=colordown)
try:
fig.autofmt_xdate()
except Exception: # pragma: no cover
pass
ax.grid(True)
#ax.bar(x,height=H-L,bottom=L,width=0.01,color='k')
def datetime2matlab(t):
''' convert datetime timestamp to matlab numeric timestamp '''
mdn = t + dt.timedelta(days = 366)
frac = (t-dt.datetime(t.year,t.month,t.day,0,0,0)).seconds / (24.0 * 60.0 * 60.0)
return mdn.toordinal() + frac
def getDataSources(fName = None):
''' return data sources directories for this machine.
directories are defined in datasources.ini or provided filepath'''
import socket
from ConfigParser import ConfigParser
pcName = socket.gethostname()
p = ConfigParser()
p.optionxform = str
if fName is None:
fName = 'datasources.ini'
p.read(fName)
if pcName not in p.sections():
raise NameError('Host name section %s not found in file %s' %(pcName,fName))
dataSources = {}
for option in p.options(pcName):
dataSources[option] = p.get(pcName,option)
return dataSources
if __name__ == '__main__':
df = DataFrame({'open':[1,2,3],'high':[5,6,7],'low':[-2,-1,0],'close':[2,1,4]})
plt.clf()
candlestick(df) | bsd-3-clause |
brynpickering/calliope | calliope/core/preprocess/lookup.py | 1 | 10678 | """
Copyright (C) 2013-2018 Calliope contributors listed in AUTHORS.
Licensed under the Apache 2.0 License (see LICENSE file).
lookup.py
~~~~~~~~~~~~~~~~~~
Functionality to create DataArrays for looking up string values between loc_techs
and loc_tech_carriers, to avoid string operations during backend operations.
"""
import xarray as xr
import numpy as np
import pandas as pd
from calliope import exceptions
def add_lookup_arrays(data, model_run):
"""
Take partially completed Calliope Model model_data and add lookup DataArrays
to it.
"""
data_dict = dict(
lookup_loc_carriers=lookup_loc_carriers(model_run),
lookup_loc_techs=lookup_loc_techs_non_conversion(model_run)
)
data.merge(xr.Dataset.from_dict(data_dict), inplace=True)
if model_run.sets['loc_techs_conversion']:
lookup_loc_techs_conversion(data, model_run)
if model_run.sets['loc_techs_conversion_plus']:
lookup_loc_techs_conversion_plus(data, model_run)
if model_run.sets['loc_techs_export']:
lookup_loc_techs_export(data)
if model_run.sets['loc_techs_area']:
lookup_loc_techs_area(data)
def lookup_loc_carriers(model_run):
"""
loc_carriers, used in system_wide balance, are linked to loc_tech_carriers
e.g. `X1::power` will be linked to `X1::chp::power` and `X1::battery::power`
in a comma delimited string, e.g. `X1::chp::power,X1::battery::power`
"""
# get the technologies associated with a certain loc_carrier
lookup_loc_carriers_dict = dict(dims=['loc_carriers'])
data = []
for loc_carrier in model_run.sets['loc_carriers']:
loc_tech_carrier = list(set(
i for i in
model_run.sets['loc_tech_carriers_prod'] +
model_run.sets['loc_tech_carriers_con']
if loc_carrier == '{0}::{2}'.format(*i.split("::"))
))
data.append(",".join(loc_tech_carrier))
lookup_loc_carriers_dict['data'] = data
return lookup_loc_carriers_dict
def lookup_loc_techs_non_conversion(model_run):
"""
loc_techs are linked to their loc_tech_carriers, based on their carrier_in or
carrier_out attribute. E.g. `X1::ccgt` will be linked to `X1::ccgt::power`
as carrier_out for the ccgt is `power`.
"""
lookup_loc_techs_dict = dict(dims=['loc_techs_non_conversion'])
data = []
for loc_tech in model_run.sets['loc_techs_non_conversion']:
# For any non-conversion technology, there is only one carrier (either
# produced or consumed)
loc_tech_carrier = list(set(
i for i in
model_run.sets['loc_tech_carriers_prod'] +
model_run.sets['loc_tech_carriers_con']
if loc_tech == i.rsplit("::", 1)[0]
))
if len(loc_tech_carrier) > 1:
raise exceptions.ModelError(
'More than one carrier associated with '
'non-conversion location:technology `{}`'.format(loc_tech)
)
else:
data.append(loc_tech_carrier[0])
lookup_loc_techs_dict['data'] = data
return lookup_loc_techs_dict
def lookup_loc_techs_conversion(dataset, model_run):
"""
Conversion technologies are separated from other non-conversion technologies
as there is more than one carrier associated with a single loc_tech. Here,
the link is made per carrier tier (`out` and `in` are the two primary carrier
tiers)
"""
# Get the string name for a loc_tech which includes the carriers in and out
# associated with that technology (for conversion technologies)
carrier_tiers = model_run.sets['carrier_tiers']
loc_techs_conversion_array = xr.DataArray(
data=np.empty(
(len(model_run.sets['loc_techs_conversion']), len(carrier_tiers)),
dtype=np.object
),
dims=['loc_techs_conversion', 'carrier_tiers'],
coords={
'loc_techs_conversion': list(model_run.sets['loc_techs_conversion']),
'carrier_tiers': list(carrier_tiers)
}
)
for loc_tech in model_run.sets['loc_techs_conversion']:
# For any conversion technology, there are only two carriers
# (one produced and one consumed)
loc_tech_carrier_in = [
i for i in
model_run.sets['loc_tech_carriers_con']
if loc_tech == i.rsplit("::", 1)[0]
]
loc_tech_carrier_out = [
i for i in
model_run.sets['loc_tech_carriers_prod']
if loc_tech == i.rsplit("::", 1)[0]
]
if len(loc_tech_carrier_in) > 1 or len(loc_tech_carrier_out) > 1:
raise exceptions.ModelError(
'More than one carrier in or out associated with '
'conversion location:technology `{}`'.format(loc_tech)
)
else:
loc_techs_conversion_array.loc[
dict(loc_techs_conversion=loc_tech, carrier_tiers=["in", "out"])
] = [loc_tech_carrier_in[0], loc_tech_carrier_out[0]]
dataset.merge(loc_techs_conversion_array
.to_dataset(name="lookup_loc_techs_conversion"), inplace=True)
return None
def lookup_loc_techs_conversion_plus(dataset, model_run):
"""
Conversion plus technologies are separated from other technologies
as there is more than one carrier associated with a single loc_tech. Here,
the link is made per carrier tier (`out`, `in`, `out_2`, `in_2`, `out_3`,
`in_3` are the possible carrier tiers). Multiple carriers may be associated
with a single loc_tech tier, so a comma delimited string will be created.
"""
# Get the string name for a loc_tech which includes all the carriers in
# and out associated with that technology (for conversion_plus technologies)
carrier_tiers = model_run.sets['carrier_tiers']
loc_techs_conversion_plus = model_run.sets['loc_techs_conversion_plus']
loc_techs_conversion_plus_array = (
np.empty((len(loc_techs_conversion_plus), len(carrier_tiers)), dtype=np.object)
)
primary_carrier_data = {'_in': [], '_out': []}
for loc_tech_idx, loc_tech in enumerate(loc_techs_conversion_plus):
_tech = loc_tech.split('::', 1)[1]
for k, v in primary_carrier_data.items():
primary_carrier = (
model_run.techs[_tech].essentials.get('primary_carrier' + k, '')
)
v.append(loc_tech + "::" + primary_carrier)
for carrier_tier_idx, carrier_tier in enumerate(carrier_tiers):
# create a list of carriers for the given technology that fits
# the current carrier_tier.
relevant_carriers = model_run.techs[_tech].essentials.get(
'carrier_' + carrier_tier, None)
if relevant_carriers and isinstance(relevant_carriers, list):
loc_tech_carriers = ','.join([loc_tech + "::" + i
for i in relevant_carriers])
elif relevant_carriers:
loc_tech_carriers = loc_tech + "::" + relevant_carriers
else:
continue
loc_techs_conversion_plus_array[loc_tech_idx, carrier_tier_idx] = loc_tech_carriers
for k, v in primary_carrier_data.items():
primary_carrier_data_array = xr.DataArray.from_dict({
'data': v, 'dims': ['loc_techs_conversion_plus']
})
dataset['lookup_primary_loc_tech_carriers' + k] = primary_carrier_data_array
dataset['lookup_loc_techs_conversion_plus'] = xr.DataArray(
data=loc_techs_conversion_plus_array,
dims=['loc_techs_conversion_plus', 'carrier_tiers'],
coords={
'loc_techs_conversion_plus': loc_techs_conversion_plus,
'carrier_tiers': carrier_tiers
}
)
return None
def lookup_loc_techs_export(dataset):
"""
For a given loc_tech, return loc_tech_carrier where the carrier is the export
carrier of that loc_tech
"""
data_dict = dict(dims=['loc_techs_export'], data=[])
for i in dataset.export_carrier:
data_dict['data'].append(i.loc_techs_export.item() + '::' + i.item())
dataset['lookup_loc_techs_export'] = xr.DataArray.from_dict(data_dict)
return None
def lookup_loc_techs_area(dataset):
"""
For a given loc, return loc_techs where the tech is any technology at that
location which defines a resource_area, if it isn't a demand technology.
If there are multiple loc_techs, the result is a comma delimited string of
loc_techs
"""
data_dict = dict(dims=['locs'], data=[])
for loc in dataset.locs:
relevant_loc_techs = [loc_tech for loc_tech in dataset.loc_techs_area.values
if loc == loc_tech.split('::')[0] and
loc_tech not in dataset.loc_techs_demand.values]
data_dict['data'].append(','.join(relevant_loc_techs))
dataset['lookup_loc_techs_area'] = xr.DataArray.from_dict(data_dict)
return None
def lookup_clusters(dataset):
"""
For any given timestep in a time clustered model, get:
1. the first and last timestep of the cluster,
2. the last timestep of the cluster corresponding to a date in the original timeseries
"""
data_dict_first = dict(dims=['timesteps'], data=[])
data_dict_last = dict(dims=['timesteps'], data=[])
for timestep in dataset.timesteps:
t = pd.to_datetime(timestep.item()).date().strftime('%Y-%m-%d')
timestep_first = dataset.timesteps.loc[t][0]
timestep_last = dataset.timesteps.loc[t][-1]
if timestep == timestep_first:
data_dict_first['data'].append(1)
data_dict_last['data'].append(timestep_last.values)
else:
data_dict_first['data'].append(0)
data_dict_last['data'].append(None)
dataset['lookup_cluster_first_timestep'] = xr.DataArray.from_dict(data_dict_first)
dataset['lookup_cluster_last_timestep'] = xr.DataArray.from_dict(data_dict_last)
if 'datesteps' in dataset.dims:
last_timesteps = dict(dims=['datesteps'], data=[])
cluster_date = dataset.timestep_cluster.to_pandas().resample('1D').mean()
for datestep in dataset.datesteps.to_index():
cluster = dataset.lookup_datestep_cluster.loc[datestep.strftime('%Y-%m-%d')].item()
last_timesteps['data'].append(pd.datetime.combine(
cluster_date[cluster_date == cluster].index[0].date(),
dataset.timesteps.to_index().time[-1]
))
dataset['lookup_datestep_last_cluster_timestep'] = xr.DataArray.from_dict(last_timesteps)
return None
| apache-2.0 |
vadimadr/python-algorithms | setup.py | 1 | 1083 | import sys
from setuptools import find_packages, setup
from setuptools.command.test import test as TestCommand
class PyTest(TestCommand):
user_options = [('pytest-args=', 'a', "Arguments to pass to pytest")]
def initialize_options(self):
TestCommand.initialize_options(self)
self.pytest_args = []
def run_tests(self):
import pytest
errno = pytest.main(self.pytest_args)
sys.exit(errno)
requirements = [
'numpy',
'scipy',
'matplotlib'
]
test_requirements = [
'pytest',
'pytest-cov',
'pytest-mock',
'hypothesis',
]
setup(
name='algorithm',
version='0.1',
packages=find_packages(exclude=['test']),
url='https://github.com/vadimadr/python-algorithms.py',
license='MIT',
author='Vadim Andronov', author_email='vadimadr@gmail.com',
description='Implementation of some common algorithms',
zip_safe=False,
package_data={'': 'LICENSE'},
install_requires=requirements,
tests_require=test_requirements,
test_suite='tests',
cmdclass={'test': PyTest},
)
| mit |
cwu2011/scikit-learn | examples/text/mlcomp_sparse_document_classification.py | 292 | 4498 | """
========================================================
Classification of text documents: using a MLComp dataset
========================================================
This is an example showing how scikit-learn can be used to classify
documents by topics using a bag-of-words approach. This example uses
a scipy.sparse matrix to store the features instead of standard numpy arrays.
The dataset used in this example is the 20 newsgroups dataset and should be
downloaded from the http://mlcomp.org (free registration required):
http://mlcomp.org/datasets/379
Once downloaded unzip the archive somewhere on your filesystem.
For instance in::
% mkdir -p ~/data/mlcomp
% cd ~/data/mlcomp
% unzip /path/to/dataset-379-20news-18828_XXXXX.zip
You should get a folder ``~/data/mlcomp/379`` with a file named ``metadata``
and subfolders ``raw``, ``train`` and ``test`` holding the text documents
organized by newsgroups.
Then set the ``MLCOMP_DATASETS_HOME`` environment variable pointing to
the root folder holding the uncompressed archive::
% export MLCOMP_DATASETS_HOME="~/data/mlcomp"
Then you are ready to run this example using your favorite python shell::
% ipython examples/mlcomp_sparse_document_classification.py
"""
# Author: Olivier Grisel <olivier.grisel@ensta.org>
# License: BSD 3 clause
from __future__ import print_function
from time import time
import sys
import os
import numpy as np
import scipy.sparse as sp
import pylab as pl
from sklearn.datasets import load_mlcomp
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import confusion_matrix
from sklearn.metrics import classification_report
from sklearn.naive_bayes import MultinomialNB
print(__doc__)
if 'MLCOMP_DATASETS_HOME' not in os.environ:
print("MLCOMP_DATASETS_HOME not set; please follow the above instructions")
sys.exit(0)
# Load the training set
print("Loading 20 newsgroups training set... ")
news_train = load_mlcomp('20news-18828', 'train')
print(news_train.DESCR)
print("%d documents" % len(news_train.filenames))
print("%d categories" % len(news_train.target_names))
print("Extracting features from the dataset using a sparse vectorizer")
t0 = time()
vectorizer = TfidfVectorizer(encoding='latin1')
X_train = vectorizer.fit_transform((open(f).read()
for f in news_train.filenames))
print("done in %fs" % (time() - t0))
print("n_samples: %d, n_features: %d" % X_train.shape)
assert sp.issparse(X_train)
y_train = news_train.target
print("Loading 20 newsgroups test set... ")
news_test = load_mlcomp('20news-18828', 'test')
t0 = time()
print("done in %fs" % (time() - t0))
print("Predicting the labels of the test set...")
print("%d documents" % len(news_test.filenames))
print("%d categories" % len(news_test.target_names))
print("Extracting features from the dataset using the same vectorizer")
t0 = time()
X_test = vectorizer.transform((open(f).read() for f in news_test.filenames))
y_test = news_test.target
print("done in %fs" % (time() - t0))
print("n_samples: %d, n_features: %d" % X_test.shape)
###############################################################################
# Benchmark classifiers
def benchmark(clf_class, params, name):
print("parameters:", params)
t0 = time()
clf = clf_class(**params).fit(X_train, y_train)
print("done in %fs" % (time() - t0))
if hasattr(clf, 'coef_'):
print("Percentage of non zeros coef: %f"
% (np.mean(clf.coef_ != 0) * 100))
print("Predicting the outcomes of the testing set")
t0 = time()
pred = clf.predict(X_test)
print("done in %fs" % (time() - t0))
print("Classification report on test set for classifier:")
print(clf)
print()
print(classification_report(y_test, pred,
target_names=news_test.target_names))
cm = confusion_matrix(y_test, pred)
print("Confusion matrix:")
print(cm)
# Show confusion matrix
pl.matshow(cm)
pl.title('Confusion matrix of the %s classifier' % name)
pl.colorbar()
print("Testbenching a linear classifier...")
parameters = {
'loss': 'hinge',
'penalty': 'l2',
'n_iter': 50,
'alpha': 0.00001,
'fit_intercept': True,
}
benchmark(SGDClassifier, parameters, 'SGD')
print("Testbenching a MultinomialNB classifier...")
parameters = {'alpha': 0.01}
benchmark(MultinomialNB, parameters, 'MultinomialNB')
pl.show()
| bsd-3-clause |
jmausolf/Congressional_Record | Python_Scripts/__speech_classifier2.py | 1 | 11291 | ###################################
### ###
### Joshua G. Mausolf ###
### Department of Sociology ###
### Computation Institute ###
### University of Chicago ###
### ###
###################################
import re
import pandas as pd
import numpy as np
import glob
import os
##########################################################
#Preliminary Functions
##########################################################
def group_text(text, group_size):
"""
groups a text into text groups set by group_size
returns a list of grouped strings
"""
word_list = text.split()
group_list = []
for k in range(len(word_list)):
start = k
end = k + group_size
group_slice = word_list[start: end]
# append only groups of proper length/size
if len(group_slice) == group_size:
group_list.append(" ".join(group_slice))
return group_list
def remove_non_ascii_2(text):
import re
return re.sub(r'[^\x00-\x7F]+', "'", text)
def read_speech(speechfile):
speech = str(speechfile)
f = open(speech, 'rU')
raw = f.read().decode('utf8')
raw1 = raw.replace('.', ' ')
sent = remove_non_ascii_2(raw1)
return sent
def get_url(speechfile):
speech = str(speechfile)
f = open(speech, 'rU')
raw = f.read().decode('utf8')
sent = remove_non_ascii_2(raw)
url = sent.split('\n')[1]
return url
def get_group_set(group_size, text):
group_list = group_text(text, group_size)
group_set = set(group_list)
return group_set
def ngram(n, data):
ngram = get_group_set(n, data)
return ngram
##########################################################
#Establish Keyword Categories with Keywords/Phrases
##########################################################
wall_street = ["lobby", "lobbying", "lobbies", "special interest", "special interests", "revolving door", "campaign donor", "campaign donation", "campaign donations", "bidder", "highest bidder", "campaign contributions", "loophole", "loopholes", "tax shelter", "tax evasion", "write their own rules", "own rules", "Wall Street", "bailout", "bailouts"]
corporate_greed = ["cheat", "cheating", "stacked against", "stacked up against", " stacked against", "good benefits", "decent salary", "stack the deck", "deck got stacked against", "exploit", "exploiting", "protect workers", "protecting workers", "protect laborers", "protecting laborers", "protect Americans", "protecting Americans", "protect employee", "protect employees", "protecting employees", "work safe", "working safely", "safe at work", "work conditions", "innocent", "minimum wage", "pollute", "polluting", "regulate", "regulating", "federal oversight", "financial reform", "gambling", "derivative", "derivatives", "sub-prime", "risky investment", "risky investments", "bust unions", "union", "unions", "labor unions", "dirtiest air", "cheapest labor", "wages", "workplace safety", "Consumer Finance Protection Bureau", "consumer protection", "unions", "union label", "union workers", "CEO", "CEO's", "corporation", "corporations"]
inequality = ["wealth", "wealthy", "income equality", "income inequality", "inequality", "privileged", "rich", "1%", "1 percent", "one percent", "99%", "99 percent", "ninety-nine percent", "ninety nine percent", "fair", "unfair", "fairness", "unfairness", "middle-class", "middle class", "working class", "working-class", "lower class", "poor", "poverty", "rich", "upper class", "equity", "inequity", "egalitarian", "disparity", "unequal", "average American", "average Americans", "Wall Street", "Main Street", "main street", "50 million", " Warren Buffet", "Warren Buffett's secretary", "secretary", "class warfare", "class warefare", "warrior for the middle class", "Giving everybody a shot", "giving everybody a shot", "everybody a fair shot", "work your way up", "working your way up", "starting at the bottom", "blood, sweat and tears", "blood sweat and tears", "blood, sweat, and tears", "willing to work hard", "fair and just", "everybody is included", " folks at the top", "folks at the bottom"]
fair_share = ["fair shot", "fair shake", "gets a fair shake", "pay their fair share", "our fair share", "fair share"]
occupy = ["occupy", "occupying", "OWS", "Occupy Wall Street"]
#Top Keywords Listed by OWS Protestors
#Keywords kept if >=5 responses for first, second, and third choices
#These were pooled, and duplicates removed.
#http://occupyresearch.net/orgs/
OWS_survey = ["income inequality", "inequality", "economic conditions", "corruption", "justice", "corporate influence in politics", "corporations", "corporate personhood", "injustice", "social justice", "corporate greed", "anti-capitalism", "greed", "unemployment", "citizens united", "equality", "money in politics", "government corruption", "poverty", "environmental concerns", "democracy", "fairness", "freedom", "change", "inequity", "jobs", "money out of politics", "health care", "financial reform", "solidarity", "war", "movement building", "foreclosures", "frustration", "banks", "politics", "curiosity", "money", "campaign finance reform", "climate change", "education", "disparity", "bailouts", "future", "anger", "hope", "revolution", "humanity", "equity", "children", "police brutality", "rights", "community", "Oligarchy", "0.99", "fascism", "freedom of speech", "food", "civil liberties", "taxes", "peace", "plutocracy", "love", "corporate corruption", "joblessness", "campaign finance", "fraud", "Wall Street", "human rights", "compassion", "accountability", "NDAA", "debt", "tax the rich", "lobbyists", "broken political system", "agreement", "inequality", "corruption", "economy", "justice", "environment", "income inequality", "economic inequality", "healthcare", "capitalism", "corporatism", "economics", "social injustice", "income disparity", "political corruption", "government", "economic justice", "economic disparity", "economic injustice", "civil rights", "wealth disparity", "oppression", "racism", "patriarchy", "sustainability", "homelessness", "corporate power", "workers rights", "student loans", "wall street", "corrupt government", "exploitation", "accountability", "housing", "patriotism", "apathy", "responsibility", "corporations"]
other_terms = ["jobs", "economy", "unemployment"]
terms = wall_street+corporate_greed+inequality+fair_share+occupy+OWS_survey+other_terms
#Remove duplicate terms (the category lists overlap) so the data frame gets unique columns
seen = set()
terms = [t for t in terms if not (t in seen or seen.add(t))]
##########################################################
#Speech Phrase Counter Functions
##########################################################
def speech_phrase_counter(ngram1, ngram2, ngram3, ngram4, terms, sent):
    #Print the count of each matching term, checking longer n-grams first
    #(the original relied on a global `sent`, which is not defined at module level)
    for term in terms:
        for ngram_set in (ngram4, ngram3, ngram2, ngram1):
            if term in ngram_set:
                count = sent.count(term)
                print "Count: ", count, "| ", term
#speech_phrase_counter(ngram1, ngram2, ngram3, ngram4, terms, sent)
def find_time(text):
    #Find the time stamp (e.g. "11:35 A.M. EDT") in the speech text
    #(the original searched a global `sent` instead of the `text` argument)
    try:
        time = re.findall(r'\d{1,2}:\d{1,2}\s[A-Z].[A-Z].+', text)
        return time[0]
    except IndexError:
        try:
            time = re.findall(r'\d{1,2}(?:(?:AM|PM)|(?::\d{1,2})(?:AM|PM)?)', text)
            return time[0]
        except IndexError:
            pass
def return_time(text):
    #Find the time stamp and normalize "P M "/"A M " spellings
    try:
        time0 = re.findall(r'\d{1,2}:\d{1,2}\s[A-Z].[A-Z].+', text)
        time = time0[0].replace('P M ', 'PM').replace('A M ', 'AM')
        return time
    except IndexError:
        try:
            time = re.findall(r'\d{1,2}(?:(?:AM|PM)|(?::\d{1,2})(?:AM|PM)?)', text)
            return time[0]
        except IndexError:
            pass
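For reference, the first pattern matches the colon-separated time stamps that appear near the top of each transcript, and the fallback pattern catches compact forms like "2PM". A quick standalone check (the sample strings are illustrative only, not real transcript text):

```python
import re

sample = "THE WHITE HOUSE\n\n11:35 A.M. EDT\n\nTHE PRESIDENT: Thank you."
colon_pattern = r'\d{1,2}:\d{1,2}\s[A-Z].[A-Z].+'
compact_pattern = r'\d{1,2}(?:(?:AM|PM)|(?::\d{1,2})(?:AM|PM)?)'

# The first regex grabs the whole "11:35 A.M. EDT" line (the trailing
# `.+` runs to the end of the line, which is how the timezone survives).
colon_match = re.findall(colon_pattern, sample)
compact_match = re.findall(compact_pattern, "Remarks at 2PM sharp")
```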
def speech_phrase_counter2(ngram1, ngram2, ngram3, ngram4, terms, df, n, sent):
    #Record the count of each matching term in row n of the data frame
    #(the original relied on a global `sent`, which is not defined at module level)
    for term in terms:
        for ngram_set in (ngram4, ngram3, ngram2, ngram1):
            if term in ngram_set:
                df.ix[n, term] = sent.count(term)
def speech_phrase_counter3(ngram1, ngram2, ngram3, ngram4, terms, df, n, sent):
    #Record the count of each matching term in row n of the data frame
    for term in terms:
        for ngram_set in (ngram4, ngram3, ngram2, ngram1):
            if term in ngram_set:
                df.ix[n, term] = sent.count(term)
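One caveat worth noting about the counters above: `str.count` matches raw substrings, so short terms are also counted inside longer words and phrases that contain them. A minimal illustration (the sample sentence is made up):

```python
sent = "labor unions and a union label; the union endorsed it"
terms = ["union", "unions", "union label"]

# str.count is substring-based: "union" is counted inside "unions"
# and inside "union label" as well, not just as a standalone word.
counts = {term: sent.count(term) for term in terms}
```

So overlapping keyword variants in the category lists (e.g. "union", "unions", "union label") produce partially double-counted totals, which matters when comparing counts across terms.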
##########################################################
#Setup Data Frame
##########################################################
def speech_classifier(folder_name, output_file, addtime=0, addloc=0, addcite=0):
#Setup Initial Data Frame
header = ["DATE", "TIME", "LOCATION", "URL"]+terms
index = np.arange(0)
df = pd.DataFrame(columns=header, index = index)
#Get Files in Folder
#os.chdir("Speech_President")
folder = str(folder_name)
outfile = str(output_file)
os.chdir(folder)
speech_files = glob.glob("*.txt")
for speech in speech_files:
print "Analyzing speech file ", speech, "..."
date = speech.split('-', 1)[1].replace(".txt", "")
n = len(df.index)
#Add Row to Data Frame
df.loc[n] = 0
df.ix[n, "DATE"] = date
sent = read_speech(speech)
#Add Time to Data Frame
if addtime == 1:
time = return_time(sent)
if len(str(time)) > 15:
time = str(time)[0:12]
#print "Exception ", time
else:
pass
df.ix[n, "TIME"] = time
else:
pass
#Add Location
if addloc == 1:
try:
time_ = find_time(sent)
location0 = sent
location1 = location0.replace(time_, '|').split('|', 1)[0]
location2 = location1.replace('\n\n', '|').replace('|\n', '|').replace('| ', '').split('|')
X = len(location2)-2
location3 = location2[X]
location = location3.replace('\n', ', ').replace('\t', '')
except:
location = ''
pass
if len(str(location)) > 25:
location = str(location)[0:35]
print "Exception ", location
else:
pass
df.ix[n, "LOCATION"] = location
else:
pass
#Add Citation/URL
if addcite == 1:
url = get_url(speech)
df.ix[n, "URL"] = url
else:
pass
ngram1 = get_group_set(1, sent)
ngram2 = get_group_set(2, sent)
ngram3 = get_group_set(3, sent)
ngram4 = get_group_set(4, sent)
speech_phrase_counter3(ngram1, ngram2, ngram3, ngram4, terms, df, n, sent)
os.chdir("..")
print df
df.to_csv(outfile, encoding='utf-8')
speech_classifier("Congressional_Records", "Congressional_Records_data.csv")
| apache-2.0 |
hdmetor/scikit-learn | sklearn/tests/test_lda.py | 71 | 5883 |
import numpy as np
from sklearn.utils.testing import assert_array_equal
from sklearn.utils.testing import assert_array_almost_equal
from sklearn.utils.testing import assert_equal
from sklearn.utils.testing import assert_almost_equal
from sklearn.utils.testing import assert_true
from sklearn.utils.testing import assert_raises
from sklearn.utils.testing import assert_raise_message
from sklearn.datasets import make_blobs
from sklearn import lda
# Data is just 6 separable points in the plane
X = np.array([[-2, -1], [-1, -1], [-1, -2], [1, 1], [1, 2], [2, 1]], dtype='f')
y = np.array([1, 1, 1, 2, 2, 2])
y3 = np.array([1, 1, 2, 2, 3, 3])
# Degenerate data with only one feature (still should be separable)
X1 = np.array([[-2, ], [-1, ], [-1, ], [1, ], [1, ], [2, ]], dtype='f')
solver_shrinkage = [('svd', None), ('lsqr', None), ('eigen', None),
('lsqr', 'auto'), ('lsqr', 0), ('lsqr', 0.43),
('eigen', 'auto'), ('eigen', 0), ('eigen', 0.43)]
def test_lda_predict():
# Test LDA classification.
# This checks that LDA implements fit and predict and returns correct values
# for simple toy data.
for test_case in solver_shrinkage:
solver, shrinkage = test_case
clf = lda.LDA(solver=solver, shrinkage=shrinkage)
y_pred = clf.fit(X, y).predict(X)
assert_array_equal(y_pred, y, 'solver %s' % solver)
# Assert that it works with 1D data
y_pred1 = clf.fit(X1, y).predict(X1)
assert_array_equal(y_pred1, y, 'solver %s' % solver)
# Test probability estimates
y_proba_pred1 = clf.predict_proba(X1)
assert_array_equal((y_proba_pred1[:, 1] > 0.5) + 1, y,
'solver %s' % solver)
y_log_proba_pred1 = clf.predict_log_proba(X1)
assert_array_almost_equal(np.exp(y_log_proba_pred1), y_proba_pred1,
8, 'solver %s' % solver)
# Primarily test for commit 2f34950 -- "reuse" of priors
y_pred3 = clf.fit(X, y3).predict(X)
# LDA shouldn't be able to separate those
assert_true(np.any(y_pred3 != y3), 'solver %s' % solver)
# Test invalid shrinkages
clf = lda.LDA(solver="lsqr", shrinkage=-0.2231)
assert_raises(ValueError, clf.fit, X, y)
clf = lda.LDA(solver="eigen", shrinkage="dummy")
assert_raises(ValueError, clf.fit, X, y)
clf = lda.LDA(solver="svd", shrinkage="auto")
assert_raises(NotImplementedError, clf.fit, X, y)
# Test unknown solver
clf = lda.LDA(solver="dummy")
assert_raises(ValueError, clf.fit, X, y)
def test_lda_coefs():
# Test if the coefficients of the solvers are approximately the same.
n_features = 2
n_classes = 2
n_samples = 1000
X, y = make_blobs(n_samples=n_samples, n_features=n_features,
centers=n_classes, random_state=11)
clf_lda_svd = lda.LDA(solver="svd")
clf_lda_lsqr = lda.LDA(solver="lsqr")
clf_lda_eigen = lda.LDA(solver="eigen")
clf_lda_svd.fit(X, y)
clf_lda_lsqr.fit(X, y)
clf_lda_eigen.fit(X, y)
assert_array_almost_equal(clf_lda_svd.coef_, clf_lda_lsqr.coef_, 1)
assert_array_almost_equal(clf_lda_svd.coef_, clf_lda_eigen.coef_, 1)
assert_array_almost_equal(clf_lda_eigen.coef_, clf_lda_lsqr.coef_, 1)
def test_lda_transform():
# Test LDA transform.
clf = lda.LDA(solver="svd", n_components=1)
X_transformed = clf.fit(X, y).transform(X)
assert_equal(X_transformed.shape[1], 1)
clf = lda.LDA(solver="eigen", n_components=1)
X_transformed = clf.fit(X, y).transform(X)
assert_equal(X_transformed.shape[1], 1)
clf = lda.LDA(solver="lsqr", n_components=1)
clf.fit(X, y)
msg = "transform not implemented for 'lsqr'"
assert_raise_message(NotImplementedError, msg, clf.transform, X)
def test_lda_orthogonality():
# arrange four classes with their means in a kite-shaped pattern
# the longer distance should be transformed to the first component, and
# the shorter distance to the second component.
means = np.array([[0, 0, -1], [0, 2, 0], [0, -2, 0], [0, 0, 5]])
# We construct perfectly symmetric distributions, so the LDA can estimate
# precise means.
scatter = np.array([[0.1, 0, 0], [-0.1, 0, 0], [0, 0.1, 0], [0, -0.1, 0],
[0, 0, 0.1], [0, 0, -0.1]])
X = (means[:, np.newaxis, :] + scatter[np.newaxis, :, :]).reshape((-1, 3))
y = np.repeat(np.arange(means.shape[0]), scatter.shape[0])
# Fit LDA and transform the means
clf = lda.LDA(solver="svd").fit(X, y)
means_transformed = clf.transform(means)
d1 = means_transformed[3] - means_transformed[0]
d2 = means_transformed[2] - means_transformed[1]
d1 /= np.sqrt(np.sum(d1 ** 2))
d2 /= np.sqrt(np.sum(d2 ** 2))
# the transformed within-class covariance should be the identity matrix
assert_almost_equal(np.cov(clf.transform(scatter).T), np.eye(2))
# the means of classes 0 and 3 should lie on the first component
assert_almost_equal(np.abs(np.dot(d1[:2], [1, 0])), 1.0)
# the means of classes 1 and 2 should lie on the second component
assert_almost_equal(np.abs(np.dot(d2[:2], [0, 1])), 1.0)
def test_lda_scaling():
# Test if classification works correctly with differently scaled features.
n = 100
rng = np.random.RandomState(1234)
# use uniform distribution of features to make sure there is absolutely no
# overlap between classes.
x1 = rng.uniform(-1, 1, (n, 3)) + [-10, 0, 0]
x2 = rng.uniform(-1, 1, (n, 3)) + [10, 0, 0]
x = np.vstack((x1, x2)) * [1, 100, 10000]
y = [-1] * n + [1] * n
for solver in ('svd', 'lsqr', 'eigen'):
clf = lda.LDA(solver=solver)
# should be able to separate the data perfectly
assert_equal(clf.fit(x, y).score(x, y), 1.0,
'using covariance: %s' % solver)
| bsd-3-clause |
sangorrin/iwatsu-ds-8812-bringo-dso-application | logic.py | 1 | 13726 |
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Copyright (c) 2017 Daniel Sangorrin
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
# Debugging commands:
# $ picocom -c --omap crcrlf -b 115200 -f h /dev/ttyUSB0
# DATE?
import sys
import numpy as np
from PyQt4 import QtGui, QtCore
import matplotlib
from matplotlib.figure import Figure
from matplotlib.backends.backend_qt4agg import (
FigureCanvasQTAgg as FigureCanvas,
NavigationToolbar2QT as NavigationToolbar)
from window import Ui_MainWindow
import serial
import glob
import time
class Main(QtGui.QMainWindow, Ui_MainWindow):
def __init__(self):
super(Main, self).__init__()
self.setupUi(self)
# add a figure to the canvas with a navigation toolbar
self.fig = Figure()
self.ax1f1 = self.fig.add_subplot(111)
self.ax1f1.set_autoscalex_on(False)
self.canvas = FigureCanvas(self.fig)
self.matplot_vlayout.addWidget(self.canvas)
self.toolbar = NavigationToolbar(self.canvas,
self.matplot_widget, coordinates=True)
self.matplot_vlayout.addWidget(self.toolbar)
self.ch1_wave = None
self.ch2_wave = None
self.interval = None
self.points = None
# prepare the serial port combo
serial_list = sorted(glob.glob('/dev/ttyUSB*'), key = lambda x: int(x[11:]))
serial_list += sorted(glob.glob('/dev/ttyACM*'), key = lambda x: int(x[11:]))
serial_list += sorted(glob.glob('/dev/ttyS*'), key = lambda x: int(x[9:]))[:5]
self.serial_combo.addItems(serial_list)
self.serial_port = None
def check_serial_port(self):
if not self.serial_port:
port = str(self.serial_combo.currentText())
try:
self.serial_port = serial.Serial(port,
baudrate=115200,
parity=serial.PARITY_NONE,
stopbits=serial.STOPBITS_ONE,
rtscts=True,
timeout=30.0)
self.serial_combo.setEnabled(False)
except serial.SerialException:
                raise Exception("Couldn't open the serial port " + port)
# \n ASCII Linefeed (LF)
# \r ASCII Carriage Return (CR)
def readlineCR(self):
rv = ""
while True:
ch = self.serial_port.read()
if ch == '\x06':
return 'ack'
rv += ch
if ch=='\n':
return rv
def _sendCommand(self, cmd):
print "command: " + cmd
try:
self.check_serial_port()
except Exception as e:
self.statusBar.clearMessage()
self.statusBar.showMessage(str(e))
return
self.serial_port.write(str(cmd) + '\r\n')
reply = self.readlineCR()
if reply != 'ack':
self.statusBar.clearMessage()
self.statusBar.showMessage('nack')
return
if str(cmd)[-1] == '?':
reply = self.readlineCR()
return reply.strip('\r\n')
def acquireWave(self, channel):
self._sendCommand('%s:TRA ON' % channel)
self._sendCommand('WAVESRC CH%s' % channel[-1])
reply = self._sendCommand('DTWAVE?')
# ascii data comes in 0.1mv format
wave = [float(item)/10000.0 for item in reply.split(',')]
return wave
def measure(self, channel, mode):
self._sendCommand('DIRM A')
self._sendCommand('MSEL %s, %s' % (channel, mode))
time.sleep(1)
return self._sendCommand('MSRA?')
def Acquire(self, show_plot=True):
self.statusBar.clearMessage()
if not (self.ch1_checkbox.isChecked() or self.ch2_checkbox.isChecked()):
self.statusBar.showMessage('Both channels are disabled')
return
else:
self.statusBar.showMessage('Acquiring...')
if self.longmem_checkbox.isChecked():
self._sendCommand('MLEN LONG')
# TODO: it should work with points = 102400 but it takes too long (try using binary data)
self.points = 30000
else:
self._sendCommand('MLEN SHORT')
self.points = 5120
self._sendCommand('DTPOINTS ' + str(self.points))
tdiv = float(self._sendCommand('TDIV?')) # time/div (total: 10 div)
self.interval = (tdiv * 10) / self.points # sample_rate = 1/interval
x = np.arange(0, self.interval * self.points, self.interval)
if show_plot:
self.ax1f1.clear()
self.ax1f1.set_xlim([0, max(x)])
if self.ch1_checkbox.isChecked():
if self.ch1_lpfilter_checkbox.isChecked():
self._sendCommand('C1:BWL ON')
else:
self._sendCommand('C1:BWL OFF')
self.ch1_wave = self.acquireWave('C1')
if show_plot:
self.ax1f1.plot(x, self.ch1_wave)
period = self.measure('CH1', 'PERIOD')
duty = self.measure('CH1', 'DUTY')
vmean = self.measure('CH1', 'VMEAN')
freq = self.measure('CH1', 'FREQ')
vrms = self.measure('CH1', 'VRMS')
vpp = self.measure('CH1', 'P-P')
tr = self.measure('CH1', 'TR')
tf = self.measure('CH1', 'TF')
pos_pw = self.measure('CH1', '+PW')
neg_pw = self.measure('CH1', '-PW')
pos_peak = self.measure('CH1', '+PEAK')
neg_peak = self.measure('CH1', '-PEAK')
self.ch1_measure_textedit.clear()
self.ch1_measure_textedit.appendPlainText('Period: %s' % period)
self.ch1_measure_textedit.appendPlainText('Duty: %s' % duty)
self.ch1_measure_textedit.appendPlainText('Vmean: %s' % vmean)
self.ch1_measure_textedit.appendPlainText('Freq: %s' % freq)
self.ch1_measure_textedit.appendPlainText('Vrms: %s' % vrms)
self.ch1_measure_textedit.appendPlainText('Vpp: %s' % vpp)
self.ch1_measure_textedit.appendPlainText('Rise: %s' % tr)
self.ch1_measure_textedit.appendPlainText('Fall: %s' % tf)
self.ch1_measure_textedit.appendPlainText('+PW: %s' % pos_pw)
self.ch1_measure_textedit.appendPlainText('-PW: %s' % neg_pw)
self.ch1_measure_textedit.appendPlainText('+PEAK: %s' % pos_peak)
self.ch1_measure_textedit.appendPlainText('-PEAK: %s' % neg_peak)
#v_at_t = self._sendCommand('CURM V_AT_T')
#self.ch1_measure_textedit.appendPlainText('Vcursor: %s' % v_at_t)
#hcur1, hcur2 = self._sendCommand('HCUR?').split(',')
#vcur1, vcur2 = self._sendCommand('VCUR?').split(',')
            # Notes on other DSO commands (not used yet):
            #   BWL: low-pass filter on/off
            #   PROBE <mode>, <value>: mode AUTO or MANUAL; value 1, 10, 100, 1000 (1:1, 10:1, ...)
            #   AVGCNT 2..256 (requires AVERAGE mode)
            #   LEVL 10~90: used by FREQ, PERIOD, +PW, -PW, and DUTY
            #   MCND <base>, 11-90, 10-89 (base: T-B or P-P): used by TR and TF
            #   SKLV <ch1_level>, <ch1_edge>, <ch2_level>, <ch2_edge> (edge: RISE or FALL): used by SKEW
if self.ch2_checkbox.isChecked():
if self.ch2_lpfilter_checkbox.isChecked():
self._sendCommand('C2:BWL ON')
else:
self._sendCommand('C2:BWL OFF')
self.ch2_wave = self.acquireWave('C2')
if show_plot:
self.ax1f1.plot(x, self.ch2_wave)
period = self.measure('CH2', 'PERIOD')
duty = self.measure('CH2', 'DUTY')
vmean = self.measure('CH2', 'VMEAN')
freq = self.measure('CH2', 'FREQ')
vrms = self.measure('CH2', 'VRMS')
vpp = self.measure('CH2', 'P-P')
tr = self.measure('CH2', 'TR')
tf = self.measure('CH2', 'TF')
pos_pw = self.measure('CH2', '+PW')
neg_pw = self.measure('CH2', '-PW')
pos_peak = self.measure('CH2', '+PEAK')
neg_peak = self.measure('CH2', '-PEAK')
self.ch2_measure_textedit.clear()
self.ch2_measure_textedit.appendPlainText('Period: %s' % period)
self.ch2_measure_textedit.appendPlainText('Duty: %s' % duty)
self.ch2_measure_textedit.appendPlainText('Vmean: %s' % vmean)
self.ch2_measure_textedit.appendPlainText('Freq: %s' % freq)
self.ch2_measure_textedit.appendPlainText('Vrms: %s' % vrms)
self.ch2_measure_textedit.appendPlainText('Vpp: %s' % vpp)
self.ch2_measure_textedit.appendPlainText('Rise: %s' % tr)
self.ch2_measure_textedit.appendPlainText('Fall: %s' % tf)
self.ch2_measure_textedit.appendPlainText('+PW: %s' % pos_pw)
self.ch2_measure_textedit.appendPlainText('-PW: %s' % neg_pw)
self.ch2_measure_textedit.appendPlainText('+PEAK: %s' % pos_peak)
self.ch2_measure_textedit.appendPlainText('-PEAK: %s' % neg_peak)
if self.ch1_checkbox.isChecked() and self.ch2_checkbox.isChecked():
skew = self.measure('CH1', 'SKEW')
self.ch1_measure_textedit.appendPlainText('SKEW: %s' % skew)
# Note: should be equal
skew = self.measure('CH2', 'SKEW')
self.ch2_measure_textedit.appendPlainText('SKEW: %s' % skew)
if show_plot:
self.canvas.draw()
self.statusBar.clearMessage()
self.statusBar.showMessage('Acquiring... FINISHED')
def fft(self, wave):
Fk = np.fft.fft(wave)/self.points # Fourier coefficients (divided by n)
nu = np.fft.fftfreq(self.points, self.interval) # Natural frequencies
Fk = np.fft.fftshift(Fk) # Shift zero freq to center
nu = np.fft.fftshift(nu) # Shift zero freq to center
return Fk, nu
def rfft(self, wave):
Fk = np.fft.rfft(wave)/self.points
nu = np.fft.rfftfreq(self.points, self.interval)
return Fk, nu
def calculateFFT(self):
self.statusBar.clearMessage()
if not (self.ch1_checkbox.isChecked() or self.ch2_checkbox.isChecked()):
self.statusBar.showMessage('Both channels are disabled')
return
else:
self.statusBar.showMessage('Calculating FFT...')
if any([not self.ch1_wave, not self.ch2_wave, not self.interval, not self.points]):
self.Acquire(show_plot=False)
self.ax1f1.clear()
#self.ax1f1.set_xscale('log')
#self.ax1f1.set_xticks([100, 1000, 10000, 100000, 1000000, 10000000])
#self.ax1f1.get_xaxis().set_major_formatter(matplotlib.ticker.ScalarFormatter())
if self.ch1_checkbox.isChecked():
Fk, nu = self.rfft(self.ch1_wave)
self.ax1f1.plot(nu, 20*np.log10(np.absolute(Fk))) # Plot spectral power
if self.ch2_checkbox.isChecked():
Fk, nu = self.rfft(self.ch2_wave)
self.ax1f1.plot(nu, 20*np.log10(np.absolute(Fk))) # Plot spectral power
self.canvas.draw()
self.statusBar.clearMessage()
self.statusBar.showMessage('Calculating FFT... FINISHED')
def aset(self):
reply = self._sendCommand('ASET')
self.statusBar.clearMessage()
self.statusBar.showMessage(reply)
def ch_toggled(self):
if self.ch1_checkbox.isChecked():
print 'ch1 enabled'
reply = self._sendCommand('C1:TRA ON')
else:
print 'ch1 disabled'
reply = self._sendCommand('C1:TRA OFF')
if self.ch2_checkbox.isChecked():
print 'ch2 enabled'
reply = self._sendCommand('C2:TRA ON')
else:
print 'ch2 disabled'
reply = self._sendCommand('C2:TRA OFF')
def ch_coupling_changed(self):
print 'ch1 coupling: ' + str(self.ch1_coupling_combo.currentText())
self._sendCommand('C1:CPL ' + str(self.ch1_coupling_combo.currentText()))
print 'ch2 coupling: ' + str(self.ch2_coupling_combo.currentText())
self._sendCommand('C2:CPL ' + str(self.ch2_coupling_combo.currentText()))
def persist_toggled(self):
if self.persist_checkbox.isChecked():
self._sendCommand('PERS ON')
else:
self._sendCommand('PERS OFF')
def equiv_toggled(self):
if self.equiv_checkbox.isChecked():
self._sendCommand('EQU ON')
else:
self._sendCommand('EQU OFF')
if __name__ == '__main__':
app = QtGui.QApplication(sys.argv)
main = Main()
main.show()
sys.exit(app.exec_())
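The normalization used in the `rfft` method above (dividing the coefficients by the number of points) makes spectral magnitudes comparable across record lengths. A standalone numpy check on a pure sine, independent of the GUI class (sample rate and tone frequency chosen here so the tone lands exactly on an FFT bin):

```python
import numpy as np

n, fs = 1000, 1000.0                 # samples, sample rate (Hz)
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 50.0 * t)     # 50 Hz sine, unit amplitude

Fk = np.fft.rfft(x) / n              # same scaling as the rfft() method
nu = np.fft.rfftfreq(n, 1.0 / fs)    # one-sided frequency axis

peak_freq = nu[np.argmax(np.abs(Fk))]  # the 50 Hz tone
peak_mag = np.abs(Fk).max()            # 0.5: half the amplitude (one-sided spectrum)
```

The factor of one half is because a real sine's energy is split between the positive- and negative-frequency bins; the one-sided `rfft` shows only the positive half.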
| mit |
rajat1994/scikit-learn | benchmarks/bench_plot_parallel_pairwise.py | 297 | 1247 |
# Author: Mathieu Blondel <mathieu@mblondel.org>
# License: BSD 3 clause
import time
import pylab as pl
from sklearn.utils import check_random_state
from sklearn.metrics.pairwise import pairwise_distances
from sklearn.metrics.pairwise import pairwise_kernels
def plot(func):
random_state = check_random_state(0)
one_core = []
multi_core = []
sample_sizes = range(1000, 6000, 1000)
for n_samples in sample_sizes:
X = random_state.rand(n_samples, 300)
start = time.time()
func(X, n_jobs=1)
one_core.append(time.time() - start)
start = time.time()
func(X, n_jobs=-1)
multi_core.append(time.time() - start)
pl.figure('scikit-learn parallel %s benchmark results' % func.__name__)
pl.plot(sample_sizes, one_core, label="one core")
pl.plot(sample_sizes, multi_core, label="multi core")
pl.xlabel('n_samples')
pl.ylabel('Time (s)')
pl.title('Parallel %s' % func.__name__)
pl.legend()
def euclidean_distances(X, n_jobs):
return pairwise_distances(X, metric="euclidean", n_jobs=n_jobs)
def rbf_kernels(X, n_jobs):
return pairwise_kernels(X, metric="rbf", n_jobs=n_jobs, gamma=0.1)
plot(euclidean_distances)
plot(rbf_kernels)
pl.show()
| bsd-3-clause |
sonnyhu/scikit-learn | sklearn/gaussian_process/tests/test_kernels.py | 6 | 11602 |
"""Testing for kernels for Gaussian processes."""
# Author: Jan Hendrik Metzen <jhm@informatik.uni-bremen.de>
# License: BSD 3 clause
from collections import Hashable
from sklearn.externals.funcsigs import signature
import numpy as np
from sklearn.gaussian_process.kernels import _approx_fprime
from sklearn.metrics.pairwise \
import PAIRWISE_KERNEL_FUNCTIONS, euclidean_distances, pairwise_kernels
from sklearn.gaussian_process.kernels \
import (RBF, Matern, RationalQuadratic, ExpSineSquared, DotProduct,
ConstantKernel, WhiteKernel, PairwiseKernel, KernelOperator,
Exponentiation)
from sklearn.base import clone
from sklearn.utils.testing import (assert_equal, assert_almost_equal,
assert_not_equal, assert_array_equal,
assert_array_almost_equal)
X = np.random.RandomState(0).normal(0, 1, (5, 2))
Y = np.random.RandomState(0).normal(0, 1, (6, 2))
kernel_white = RBF(length_scale=2.0) + WhiteKernel(noise_level=3.0)
kernels = [RBF(length_scale=2.0), RBF(length_scale_bounds=(0.5, 2.0)),
ConstantKernel(constant_value=10.0),
2.0 * RBF(length_scale=0.33, length_scale_bounds="fixed"),
2.0 * RBF(length_scale=0.5), kernel_white,
2.0 * RBF(length_scale=[0.5, 2.0]),
2.0 * Matern(length_scale=0.33, length_scale_bounds="fixed"),
2.0 * Matern(length_scale=0.5, nu=0.5),
2.0 * Matern(length_scale=1.5, nu=1.5),
2.0 * Matern(length_scale=2.5, nu=2.5),
2.0 * Matern(length_scale=[0.5, 2.0], nu=0.5),
3.0 * Matern(length_scale=[2.0, 0.5], nu=1.5),
4.0 * Matern(length_scale=[0.5, 0.5], nu=2.5),
RationalQuadratic(length_scale=0.5, alpha=1.5),
ExpSineSquared(length_scale=0.5, periodicity=1.5),
DotProduct(sigma_0=2.0), DotProduct(sigma_0=2.0) ** 2]
for metric in PAIRWISE_KERNEL_FUNCTIONS:
if metric in ["additive_chi2", "chi2"]:
continue
kernels.append(PairwiseKernel(gamma=1.0, metric=metric))
def test_kernel_gradient():
""" Compare analytic and numeric gradient of kernels. """
for kernel in kernels:
K, K_gradient = kernel(X, eval_gradient=True)
assert_equal(K_gradient.shape[0], X.shape[0])
assert_equal(K_gradient.shape[1], X.shape[0])
assert_equal(K_gradient.shape[2], kernel.theta.shape[0])
def eval_kernel_for_theta(theta):
kernel_clone = kernel.clone_with_theta(theta)
K = kernel_clone(X, eval_gradient=False)
return K
K_gradient_approx = \
_approx_fprime(kernel.theta, eval_kernel_for_theta, 1e-10)
assert_almost_equal(K_gradient, K_gradient_approx, 4)
def test_kernel_theta():
""" Check that parameter vector theta of kernel is set correctly. """
for kernel in kernels:
if isinstance(kernel, KernelOperator) \
or isinstance(kernel, Exponentiation): # skip non-basic kernels
continue
theta = kernel.theta
_, K_gradient = kernel(X, eval_gradient=True)
# Determine kernel parameters that contribute to theta
init_sign = signature(kernel.__class__.__init__).parameters.values()
args = [p.name for p in init_sign if p.name != 'self']
        # Note: str.rstrip strips a *set of characters*, not a suffix, so
        # slicing off the "_bounds" suffix explicitly is the safe form here.
        theta_vars = map(lambda s: s[:-len("_bounds")],
                         filter(lambda s: s.endswith("_bounds"), args))
assert_equal(
set(hyperparameter.name
for hyperparameter in kernel.hyperparameters),
set(theta_vars))
# Check that values returned in theta are consistent with
# hyperparameter values (being their logarithms)
for i, hyperparameter in enumerate(kernel.hyperparameters):
assert_equal(theta[i],
np.log(getattr(kernel, hyperparameter.name)))
# Fixed kernel parameters must be excluded from theta and gradient.
for i, hyperparameter in enumerate(kernel.hyperparameters):
# create copy with certain hyperparameter fixed
params = kernel.get_params()
params[hyperparameter.name + "_bounds"] = "fixed"
kernel_class = kernel.__class__
new_kernel = kernel_class(**params)
# Check that theta and K_gradient are identical with the fixed
# dimension left out
_, K_gradient_new = new_kernel(X, eval_gradient=True)
assert_equal(theta.shape[0], new_kernel.theta.shape[0] + 1)
assert_equal(K_gradient.shape[2], K_gradient_new.shape[2] + 1)
if i > 0:
assert_equal(theta[:i], new_kernel.theta[:i])
assert_array_equal(K_gradient[..., :i],
K_gradient_new[..., :i])
if i + 1 < len(kernel.hyperparameters):
assert_equal(theta[i+1:], new_kernel.theta[i:])
assert_array_equal(K_gradient[..., i+1:],
K_gradient_new[..., i:])
# Check that values of theta are modified correctly
for i, hyperparameter in enumerate(kernel.hyperparameters):
theta[i] = np.log(42)
kernel.theta = theta
assert_almost_equal(getattr(kernel, hyperparameter.name), 42)
setattr(kernel, hyperparameter.name, 43)
assert_almost_equal(kernel.theta[i], np.log(43))
def test_auto_vs_cross():
""" Auto-correlation and cross-correlation should be consistent. """
for kernel in kernels:
if kernel == kernel_white:
            continue  # Identity is not satisfied on diagonal
        K_auto = kernel(X)
        K_cross = kernel(X, X)
        assert_almost_equal(K_auto, K_cross, 5)


def test_kernel_diag():
    """ Test that diag method of kernel returns consistent results. """
    for kernel in kernels:
        K_call_diag = np.diag(kernel(X))
        K_diag = kernel.diag(X)
        assert_almost_equal(K_call_diag, K_diag, 5)


def test_kernel_operator_commutative():
    """ Adding kernels and multiplying kernels should be commutative. """
    # Check addition
    assert_almost_equal((RBF(2.0) + 1.0)(X),
                        (1.0 + RBF(2.0))(X))
    # Check multiplication
    assert_almost_equal((3.0 * RBF(2.0))(X),
                        (RBF(2.0) * 3.0)(X))


def test_kernel_anisotropic():
    """ Anisotropic kernel should be consistent with isotropic kernels."""
    kernel = 3.0 * RBF([0.5, 2.0])
    K = kernel(X)
    X1 = np.array(X)
    X1[:, 0] *= 4
    K1 = 3.0 * RBF(2.0)(X1)
    assert_almost_equal(K, K1)
    X2 = np.array(X)
    X2[:, 1] /= 4
    K2 = 3.0 * RBF(0.5)(X2)
    assert_almost_equal(K, K2)
    # Check getting and setting via theta
    kernel.theta = kernel.theta + np.log(2)
    assert_array_equal(kernel.theta, np.log([6.0, 1.0, 4.0]))
    assert_array_equal(kernel.k2.length_scale, [1.0, 4.0])


def test_kernel_stationary():
    """ Test stationarity of kernels."""
    for kernel in kernels:
        if not kernel.is_stationary():
            continue
        K = kernel(X, X + 1)
        assert_almost_equal(K[0, 0], np.diag(K))


def test_kernel_clone():
    """ Test that sklearn's clone works correctly on kernels. """
    for kernel in kernels:
        kernel_cloned = clone(kernel)
        assert_equal(kernel, kernel_cloned)
        assert_not_equal(id(kernel), id(kernel_cloned))
        for attr in kernel.__dict__.keys():
            attr_value = getattr(kernel, attr)
            attr_value_cloned = getattr(kernel_cloned, attr)
            if attr.startswith("hyperparameter_"):
                assert_equal(attr_value.name, attr_value_cloned.name)
                assert_equal(attr_value.value_type,
                             attr_value_cloned.value_type)
                assert_array_equal(attr_value.bounds,
                                   attr_value_cloned.bounds)
                assert_equal(attr_value.n_elements,
                             attr_value_cloned.n_elements)
            elif np.iterable(attr_value):
                for i in range(len(attr_value)):
                    if np.iterable(attr_value[i]):
                        assert_array_equal(attr_value[i],
                                           attr_value_cloned[i])
                    else:
                        assert_equal(attr_value[i], attr_value_cloned[i])
            else:
                assert_equal(attr_value, attr_value_cloned)
            if not isinstance(attr_value, Hashable):
                # modifiable attributes must not be identical
                assert_not_equal(id(attr_value), id(attr_value_cloned))
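The core of the clone check, isolated on a single illustrative kernel: `clone` produces a parameter-equal but distinct object.

```python
from sklearn.base import clone
from sklearn.gaussian_process.kernels import RBF

k = RBF(length_scale=1.0)
k_cloned = clone(k)
assert k == k_cloned      # kernels compare equal by their parameters
assert k is not k_cloned  # but the clone is a distinct object
```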
def test_matern_kernel():
    """ Test consistency of Matern kernel for special values of nu. """
    K = Matern(nu=1.5, length_scale=1.0)(X)
    # the diagonal elements of a Matern kernel are 1
    assert_array_almost_equal(np.diag(K), np.ones(X.shape[0]))
    # the Matern kernel for nu==0.5 is equal to the absolute exponential kernel
    K_absexp = np.exp(-euclidean_distances(X, X, squared=False))
    K = Matern(nu=0.5, length_scale=1.0)(X)
    assert_array_almost_equal(K, K_absexp)
    # test that special cases of the Matern kernel (nu in [0.5, 1.5, 2.5])
    # give nearly identical results to the general case for nu in
    # [0.5 + tiny, 1.5 + tiny, 2.5 + tiny]
    tiny = 1e-10
    for nu in [0.5, 1.5, 2.5]:
        K1 = Matern(nu=nu, length_scale=1.0)(X)
        K2 = Matern(nu=nu + tiny, length_scale=1.0)(X)
        assert_array_almost_equal(K1, K2)
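The nu=0.5 special case can be verified in isolation (on illustrative data): the Matern kernel then reduces to `exp(-||x - y||)`, the absolute exponential kernel.

```python
import numpy as np
from sklearn.gaussian_process.kernels import Matern
from sklearn.metrics.pairwise import euclidean_distances

rng = np.random.RandomState(0)
X = rng.normal(size=(6, 3))
K = Matern(nu=0.5, length_scale=1.0)(X)
K_absexp = np.exp(-euclidean_distances(X, squared=False))  # exp(-||x - y||)
assert np.allclose(K, K_absexp)
```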
def test_kernel_versus_pairwise():
    """Check that GP kernels can also be used as pairwise kernels."""
    for kernel in kernels:
        # Test auto-kernel
        if kernel != kernel_white:
            # For WhiteKernel: k(X) != k(X, X). This is assumed by
            # pairwise_kernels
            K1 = kernel(X)
            K2 = pairwise_kernels(X, metric=kernel)
            assert_array_almost_equal(K1, K2)
        # Test cross-kernel
        K1 = kernel(X, Y)
        K2 = pairwise_kernels(X, Y, metric=kernel)
        assert_array_almost_equal(K1, K2)
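The same interchangeability can be shown directly (with illustrative inputs): a GP kernel object is a valid callable `metric` for `pairwise_kernels`.

```python
import numpy as np
from sklearn.gaussian_process.kernels import RBF
from sklearn.metrics.pairwise import pairwise_kernels

rng = np.random.RandomState(0)
X = rng.normal(size=(4, 2))
Y = rng.normal(size=(3, 2))
kernel = RBF(1.0)
# the kernel object doubles as a callable metric for pairwise_kernels
assert np.allclose(kernel(X, Y), pairwise_kernels(X, Y, metric=kernel))
```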
def test_set_get_params():
    """Check that set_params()/get_params() is consistent with kernel.theta."""
    for kernel in kernels:
        # Test get_params()
        index = 0
        params = kernel.get_params()
        for hyperparameter in kernel.hyperparameters:
            if hyperparameter.bounds == "fixed":
                continue
            size = hyperparameter.n_elements
            if size > 1:  # anisotropic kernels
                assert_almost_equal(np.exp(kernel.theta[index:index + size]),
                                    params[hyperparameter.name])
                index += size
            else:
                assert_almost_equal(np.exp(kernel.theta[index]),
                                    params[hyperparameter.name])
                index += 1
        # Test set_params()
        index = 0
        value = 10  # arbitrary value
        for hyperparameter in kernel.hyperparameters:
            if hyperparameter.bounds == "fixed":
                continue
            size = hyperparameter.n_elements
            if size > 1:  # anisotropic kernels
                kernel.set_params(**{hyperparameter.name: [value] * size})
                assert_almost_equal(np.exp(kernel.theta[index:index + size]),
                                    [value] * size)
                index += size
            else:
                kernel.set_params(**{hyperparameter.name: value})
                assert_almost_equal(np.exp(kernel.theta[index]), value)
                index += 1
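The theta/params correspondence exercised above can be sketched on a single composite kernel (names like `k1__constant_value` follow scikit-learn's nested-parameter convention):

```python
import numpy as np
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

kernel = ConstantKernel(2.0) * RBF(1.5)
params = kernel.get_params()
# nested hyperparameters are addressed as "<child>__<name>"
assert np.isclose(params["k1__constant_value"], 2.0)
assert np.isclose(params["k2__length_scale"], 1.5)
kernel.set_params(k2__length_scale=3.0)
# theta holds [log(constant_value), log(length_scale)]
assert np.isclose(np.exp(kernel.theta[1]), 3.0)
```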
| bsd-3-clause |
dsg2806/acti.monash | timetest.py | 2 | 771542 | import matplotlib.pyplot as plt
import pandas as pd
import matplotlib.dates as mdates
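Before the literal `times` list below is plotted, the day-first strings such as `'5/07/2011 13:57:00'` need to become real datetimes. A minimal sketch of such a parsing step; `parse_minute_times` is a hypothetical helper name, not part of timetest.py.

```python
import pandas as pd

def parse_minute_times(strings):
    # timestamps in this file are day-first: '5/07/2011' means 5 July 2011
    return pd.to_datetime(strings, format="%d/%m/%Y %H:%M:%S")

idx = parse_minute_times(['5/07/2011 13:57:00', '5/07/2011 13:58:00'])
```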
times = ['5/07/2011 13:57:00', '5/07/2011 13:58:00', '5/07/2011 13:59:00', '5/07/2011 14:00:00', '5/07/2011 14:01:00', '5/07/2011 14:02:00', '5/07/2011 14:03:00', '5/07/2011 14:04:00', '5/07/2011 14:05:00', '5/07/2011 14:06:00', '5/07/2011 14:07:00', '5/07/2011 14:08:00', '5/07/2011 14:09:00', '5/07/2011 14:10:00', '5/07/2011 14:11:00', '5/07/2011 14:12:00', '5/07/2011 14:13:00', '5/07/2011 14:14:00', '5/07/2011 14:15:00', '5/07/2011 14:16:00', '5/07/2011 14:17:00', '5/07/2011 14:18:00', '5/07/2011 14:19:00', '5/07/2011 14:20:00', '5/07/2011 14:21:00', '5/07/2011 14:22:00', '5/07/2011 14:23:00', '5/07/2011 14:24:00', '5/07/2011 14:25:00', '5/07/2011 14:26:00', '5/07/2011 14:27:00', '5/07/2011 14:28:00', '5/07/2011 14:29:00', '5/07/2011 14:30:00', '5/07/2011 14:31:00', '5/07/2011 14:32:00', '5/07/2011 14:33:00', '5/07/2011 14:34:00', '5/07/2011 14:35:00', '5/07/2011 14:36:00', '5/07/2011 14:37:00', '5/07/2011 14:38:00', '5/07/2011 14:39:00', '5/07/2011 14:40:00', '5/07/2011 14:41:00', '5/07/2011 14:42:00', '5/07/2011 14:43:00', '5/07/2011 14:44:00', '5/07/2011 14:45:00', '5/07/2011 14:46:00', '5/07/2011 14:47:00', '5/07/2011 14:48:00', '5/07/2011 14:49:00', '5/07/2011 14:50:00', '5/07/2011 14:51:00', '5/07/2011 14:52:00', '5/07/2011 14:53:00', '5/07/2011 14:54:00', '5/07/2011 14:55:00', '5/07/2011 14:56:00', '5/07/2011 14:57:00', '5/07/2011 14:58:00', '5/07/2011 14:59:00', '5/07/2011 15:00:00', '5/07/2011 15:01:00', '5/07/2011 15:02:00', '5/07/2011 15:03:00', '5/07/2011 15:04:00', '5/07/2011 15:05:00', '5/07/2011 15:06:00', '5/07/2011 15:07:00', '5/07/2011 15:08:00', '5/07/2011 15:09:00', '5/07/2011 15:10:00', '5/07/2011 15:11:00', '5/07/2011 15:12:00', '5/07/2011 15:13:00', '5/07/2011 15:14:00', '5/07/2011 15:15:00', '5/07/2011 15:16:00', '5/07/2011 15:17:00', '5/07/2011 15:18:00', '5/07/2011 15:19:00', '5/07/2011 15:20:00', '5/07/2011 15:21:00', '5/07/2011 15:22:00', '5/07/2011 15:23:00', '5/07/2011 15:24:00', '5/07/2011 15:25:00', '5/07/2011 15:26:00', '5/07/2011 
15:27:00', '5/07/2011 15:28:00', '5/07/2011 15:29:00', '5/07/2011 15:30:00', '5/07/2011 15:31:00', '5/07/2011 15:32:00', '5/07/2011 15:33:00', '5/07/2011 15:34:00', '5/07/2011 15:35:00', '5/07/2011 15:36:00', '5/07/2011 15:37:00', '5/07/2011 15:38:00', '5/07/2011 15:39:00', '5/07/2011 15:40:00', '5/07/2011 15:41:00', '5/07/2011 15:42:00', '5/07/2011 15:43:00', '5/07/2011 15:44:00', '5/07/2011 15:45:00', '5/07/2011 15:46:00', '5/07/2011 15:47:00', '5/07/2011 15:48:00', '5/07/2011 15:49:00', '5/07/2011 15:50:00', '5/07/2011 15:51:00', '5/07/2011 15:52:00', '5/07/2011 15:53:00', '5/07/2011 15:54:00', '5/07/2011 15:55:00', '5/07/2011 15:56:00', '5/07/2011 15:57:00', '5/07/2011 15:58:00', '5/07/2011 15:59:00', '5/07/2011 16:00:00', '5/07/2011 16:01:00', '5/07/2011 16:02:00', '5/07/2011 16:03:00', '5/07/2011 16:04:00', '5/07/2011 16:05:00', '5/07/2011 16:06:00', '5/07/2011 16:07:00', '5/07/2011 16:08:00', '5/07/2011 16:09:00', '5/07/2011 16:10:00', '5/07/2011 16:11:00', '5/07/2011 16:12:00', '5/07/2011 16:13:00', '5/07/2011 16:14:00', '5/07/2011 16:15:00', '5/07/2011 16:16:00', '5/07/2011 16:17:00', '5/07/2011 16:18:00', '5/07/2011 16:19:00', '5/07/2011 16:20:00', '5/07/2011 16:21:00', '5/07/2011 16:22:00', '5/07/2011 16:23:00', '5/07/2011 16:24:00', '5/07/2011 16:25:00', '5/07/2011 16:26:00', '5/07/2011 16:27:00', '5/07/2011 16:28:00', '5/07/2011 16:29:00', '5/07/2011 16:30:00', '5/07/2011 16:31:00', '5/07/2011 16:32:00', '5/07/2011 16:33:00', '5/07/2011 16:34:00', '5/07/2011 16:35:00', '5/07/2011 16:36:00', '5/07/2011 16:37:00', '5/07/2011 16:38:00', '5/07/2011 16:39:00', '5/07/2011 16:40:00', '5/07/2011 16:41:00', '5/07/2011 16:42:00', '5/07/2011 16:43:00', '5/07/2011 16:44:00', '5/07/2011 16:45:00', '5/07/2011 16:46:00', '5/07/2011 16:47:00', '5/07/2011 16:48:00', '5/07/2011 16:49:00', '5/07/2011 16:50:00', '5/07/2011 16:51:00', '5/07/2011 16:52:00', '5/07/2011 16:53:00', '5/07/2011 16:54:00', '5/07/2011 16:55:00', '5/07/2011 16:56:00', '5/07/2011 16:57:00', 
'5/07/2011 16:58:00', '5/07/2011 16:59:00', '5/07/2011 17:00:00', '5/07/2011 17:01:00', '5/07/2011 17:02:00', '5/07/2011 17:03:00', '5/07/2011 17:04:00', '5/07/2011 17:05:00', '5/07/2011 17:06:00', '5/07/2011 17:07:00', '5/07/2011 17:08:00', '5/07/2011 17:09:00', '5/07/2011 17:10:00', '5/07/2011 17:11:00', '5/07/2011 17:12:00', '5/07/2011 17:13:00', '5/07/2011 17:14:00', '5/07/2011 17:15:00', '5/07/2011 17:16:00', '5/07/2011 17:17:00', '5/07/2011 17:18:00', '5/07/2011 17:19:00', '5/07/2011 17:20:00', '5/07/2011 17:21:00', '5/07/2011 17:22:00', '5/07/2011 17:23:00', '5/07/2011 17:24:00', '5/07/2011 17:25:00', '5/07/2011 17:26:00', '5/07/2011 17:27:00', '5/07/2011 17:28:00', '5/07/2011 17:29:00', '5/07/2011 17:30:00', '5/07/2011 17:31:00', '5/07/2011 17:32:00', '5/07/2011 17:33:00', '5/07/2011 17:34:00', '5/07/2011 17:35:00', '5/07/2011 17:36:00', '5/07/2011 17:37:00', '5/07/2011 17:38:00', '5/07/2011 17:39:00', '5/07/2011 17:40:00', '5/07/2011 17:41:00', '5/07/2011 17:42:00', '5/07/2011 17:43:00', '5/07/2011 17:44:00', '5/07/2011 17:45:00', '5/07/2011 17:46:00', '5/07/2011 17:47:00', '5/07/2011 17:48:00', '5/07/2011 17:49:00', '5/07/2011 17:50:00', '5/07/2011 17:51:00', '5/07/2011 17:52:00', '5/07/2011 17:53:00', '5/07/2011 17:54:00', '5/07/2011 17:55:00', '5/07/2011 17:56:00', '5/07/2011 17:57:00', '5/07/2011 17:58:00', '5/07/2011 17:59:00', '5/07/2011 18:00:00', '5/07/2011 18:01:00', '5/07/2011 18:02:00', '5/07/2011 18:03:00', '5/07/2011 18:04:00', '5/07/2011 18:05:00', '5/07/2011 18:06:00', '5/07/2011 18:07:00', '5/07/2011 18:08:00', '5/07/2011 18:09:00', '5/07/2011 18:10:00', '5/07/2011 18:11:00', '5/07/2011 18:12:00', '5/07/2011 18:13:00', '5/07/2011 18:14:00', '5/07/2011 18:15:00', '5/07/2011 18:16:00', '5/07/2011 18:17:00', '5/07/2011 18:18:00', '5/07/2011 18:19:00', '5/07/2011 18:20:00', '5/07/2011 18:21:00', '5/07/2011 18:22:00', '5/07/2011 18:23:00', '5/07/2011 18:24:00', '5/07/2011 18:25:00', '5/07/2011 18:26:00', '5/07/2011 18:27:00', '5/07/2011 
18:28:00', '5/07/2011 18:29:00', '5/07/2011 18:30:00', '5/07/2011 18:31:00', '5/07/2011 18:32:00', '5/07/2011 18:33:00', '5/07/2011 18:34:00', '5/07/2011 18:35:00', '5/07/2011 18:36:00', '5/07/2011 18:37:00', '5/07/2011 18:38:00', '5/07/2011 18:39:00', '5/07/2011 18:40:00', '5/07/2011 18:41:00', '5/07/2011 18:42:00', '5/07/2011 18:43:00', '5/07/2011 18:44:00', '5/07/2011 18:45:00', '5/07/2011 18:46:00', '5/07/2011 18:47:00', '5/07/2011 18:48:00', '5/07/2011 18:49:00', '5/07/2011 18:50:00', '5/07/2011 18:51:00', '5/07/2011 18:52:00', '5/07/2011 18:53:00', '5/07/2011 18:54:00', '5/07/2011 18:55:00', '5/07/2011 18:56:00', '5/07/2011 18:57:00', '5/07/2011 18:58:00', '5/07/2011 18:59:00', '5/07/2011 19:00:00', '5/07/2011 19:01:00', '5/07/2011 19:02:00', '5/07/2011 19:03:00', '5/07/2011 19:04:00', '5/07/2011 19:05:00', '5/07/2011 19:06:00', '5/07/2011 19:07:00', '5/07/2011 19:08:00', '5/07/2011 19:09:00', '5/07/2011 19:10:00', '5/07/2011 19:11:00', '5/07/2011 19:12:00', '5/07/2011 19:13:00', '5/07/2011 19:14:00', '5/07/2011 19:15:00', '5/07/2011 19:16:00', '5/07/2011 19:17:00', '5/07/2011 19:18:00', '5/07/2011 19:19:00', '5/07/2011 19:20:00', '5/07/2011 19:21:00', '5/07/2011 19:22:00', '5/07/2011 19:23:00', '5/07/2011 19:24:00', '5/07/2011 19:25:00', '5/07/2011 19:26:00', '5/07/2011 19:27:00', '5/07/2011 19:28:00', '5/07/2011 19:29:00', '5/07/2011 19:30:00', '5/07/2011 19:31:00', '5/07/2011 19:32:00', '5/07/2011 19:33:00', '5/07/2011 19:34:00', '5/07/2011 19:35:00', '5/07/2011 19:36:00', '5/07/2011 19:37:00', '5/07/2011 19:38:00', '5/07/2011 19:39:00', '5/07/2011 19:40:00', '5/07/2011 19:41:00', '5/07/2011 19:42:00', '5/07/2011 19:43:00', '5/07/2011 19:44:00', '5/07/2011 19:45:00', '5/07/2011 19:46:00', '5/07/2011 19:47:00', '5/07/2011 19:48:00', '5/07/2011 19:49:00', '5/07/2011 19:50:00', '5/07/2011 19:51:00', '5/07/2011 19:52:00', '5/07/2011 19:53:00', '5/07/2011 19:54:00', '5/07/2011 19:55:00', '5/07/2011 19:56:00', '5/07/2011 19:57:00', '5/07/2011 19:58:00', 
'5/07/2011 19:59:00', '5/07/2011 20:00:00', '5/07/2011 20:01:00', '5/07/2011 20:02:00', '5/07/2011 20:03:00', '5/07/2011 20:04:00', '5/07/2011 20:05:00', '5/07/2011 20:06:00', '5/07/2011 20:07:00', '5/07/2011 20:08:00', '5/07/2011 20:09:00', '5/07/2011 20:10:00', '5/07/2011 20:11:00', '5/07/2011 20:12:00', '5/07/2011 20:13:00', '5/07/2011 20:14:00', '5/07/2011 20:15:00', '5/07/2011 20:16:00', '5/07/2011 20:17:00', '5/07/2011 20:18:00', '5/07/2011 20:19:00', '5/07/2011 20:20:00', '5/07/2011 20:21:00', '5/07/2011 20:22:00', '5/07/2011 20:23:00', '5/07/2011 20:24:00', '5/07/2011 20:25:00', '5/07/2011 20:26:00', '5/07/2011 20:27:00', '5/07/2011 20:28:00', '5/07/2011 20:29:00', '5/07/2011 20:30:00', '5/07/2011 20:31:00', '5/07/2011 20:32:00', '5/07/2011 20:33:00', '5/07/2011 20:34:00', '5/07/2011 20:35:00', '5/07/2011 20:36:00', '5/07/2011 20:37:00', '5/07/2011 20:38:00', '5/07/2011 20:39:00', '5/07/2011 20:40:00', '5/07/2011 20:41:00', '5/07/2011 20:42:00', '5/07/2011 20:43:00', '5/07/2011 20:44:00', '5/07/2011 20:45:00', '5/07/2011 20:46:00', '5/07/2011 20:47:00', '5/07/2011 20:48:00', '5/07/2011 20:49:00', '5/07/2011 20:50:00', '5/07/2011 20:51:00', '5/07/2011 20:52:00', '5/07/2011 20:53:00', '5/07/2011 20:54:00', '5/07/2011 20:55:00', '5/07/2011 20:56:00', '5/07/2011 20:57:00', '5/07/2011 20:58:00', '5/07/2011 20:59:00', '5/07/2011 21:00:00', '5/07/2011 21:01:00', '5/07/2011 21:02:00', '5/07/2011 21:03:00', '5/07/2011 21:04:00', '5/07/2011 21:05:00', '5/07/2011 21:06:00', '5/07/2011 21:07:00', '5/07/2011 21:08:00', '5/07/2011 21:09:00', '5/07/2011 21:10:00', '5/07/2011 21:11:00', '5/07/2011 21:12:00', '5/07/2011 21:13:00', '5/07/2011 21:14:00', '5/07/2011 21:15:00', '5/07/2011 21:16:00', '5/07/2011 21:17:00', '5/07/2011 21:18:00', '5/07/2011 21:19:00', '5/07/2011 21:20:00', '5/07/2011 21:21:00', '5/07/2011 21:22:00', '5/07/2011 21:23:00', '5/07/2011 21:24:00', '5/07/2011 21:25:00', '5/07/2011 21:26:00', '5/07/2011 21:27:00', '5/07/2011 21:28:00', '5/07/2011 
21:29:00', '5/07/2011 21:30:00', '5/07/2011 21:31:00', '5/07/2011 21:32:00', '5/07/2011 21:33:00', '5/07/2011 21:34:00', '5/07/2011 21:35:00', '5/07/2011 21:36:00', '5/07/2011 21:37:00', '5/07/2011 21:38:00', '5/07/2011 21:39:00', '5/07/2011 21:40:00', '5/07/2011 21:41:00', '5/07/2011 21:42:00', '5/07/2011 21:43:00', '5/07/2011 21:44:00', '5/07/2011 21:45:00', '5/07/2011 21:46:00', '5/07/2011 21:47:00', '5/07/2011 21:48:00', '5/07/2011 21:49:00', '5/07/2011 21:50:00', '5/07/2011 21:51:00', '5/07/2011 21:52:00', '5/07/2011 21:53:00', '5/07/2011 21:54:00', '5/07/2011 21:55:00', '5/07/2011 21:56:00', '5/07/2011 21:57:00', '5/07/2011 21:58:00', '5/07/2011 21:59:00', '5/07/2011 22:00:00', '5/07/2011 22:01:00', '5/07/2011 22:02:00', '5/07/2011 22:03:00', '5/07/2011 22:04:00', '5/07/2011 22:05:00', '5/07/2011 22:06:00', '5/07/2011 22:07:00', '5/07/2011 22:08:00', '5/07/2011 22:09:00', '5/07/2011 22:10:00', '5/07/2011 22:11:00', '5/07/2011 22:12:00', '5/07/2011 22:13:00', '5/07/2011 22:14:00', '5/07/2011 22:15:00', '5/07/2011 22:16:00', '5/07/2011 22:17:00', '5/07/2011 22:18:00', '5/07/2011 22:19:00', '5/07/2011 22:20:00', '5/07/2011 22:21:00', '5/07/2011 22:22:00', '5/07/2011 22:23:00', '5/07/2011 22:24:00', '5/07/2011 22:25:00', '5/07/2011 22:26:00', '5/07/2011 22:27:00', '5/07/2011 22:28:00', '5/07/2011 22:29:00', '5/07/2011 22:30:00', '5/07/2011 22:31:00', '5/07/2011 22:32:00', '5/07/2011 22:33:00', '5/07/2011 22:34:00', '5/07/2011 22:35:00', '5/07/2011 22:36:00', '5/07/2011 22:37:00', '5/07/2011 22:38:00', '5/07/2011 22:39:00', '5/07/2011 22:40:00', '5/07/2011 22:41:00', '5/07/2011 22:42:00', '5/07/2011 22:43:00', '5/07/2011 22:44:00', '5/07/2011 22:45:00', '5/07/2011 22:46:00', '5/07/2011 22:47:00', '5/07/2011 22:48:00', '5/07/2011 22:49:00', '5/07/2011 22:50:00', '5/07/2011 22:51:00', '5/07/2011 22:52:00', '5/07/2011 22:53:00', '5/07/2011 22:54:00', '5/07/2011 22:55:00', '5/07/2011 22:56:00', '5/07/2011 22:57:00', '5/07/2011 22:58:00', '5/07/2011 22:59:00', 
'5/07/2011 23:00:00', '5/07/2011 23:01:00', '5/07/2011 23:02:00', '5/07/2011 23:03:00', '5/07/2011 23:04:00', '5/07/2011 23:05:00', '5/07/2011 23:06:00', '5/07/2011 23:07:00', '5/07/2011 23:08:00', '5/07/2011 23:09:00', '5/07/2011 23:10:00', '5/07/2011 23:11:00', '5/07/2011 23:12:00', '5/07/2011 23:13:00', '5/07/2011 23:14:00', '5/07/2011 23:15:00', '5/07/2011 23:16:00', '5/07/2011 23:17:00', '5/07/2011 23:18:00', '5/07/2011 23:19:00', '5/07/2011 23:20:00', '5/07/2011 23:21:00', '5/07/2011 23:22:00', '5/07/2011 23:23:00', '5/07/2011 23:24:00', '5/07/2011 23:25:00', '5/07/2011 23:26:00', '5/07/2011 23:27:00', '5/07/2011 23:28:00', '5/07/2011 23:29:00', '5/07/2011 23:30:00', '5/07/2011 23:31:00', '5/07/2011 23:32:00', '5/07/2011 23:33:00', '5/07/2011 23:34:00', '5/07/2011 23:35:00', '5/07/2011 23:36:00', '5/07/2011 23:37:00', '5/07/2011 23:38:00', '5/07/2011 23:39:00', '5/07/2011 23:40:00', '5/07/2011 23:41:00', '5/07/2011 23:42:00', '5/07/2011 23:43:00', '5/07/2011 23:44:00', '5/07/2011 23:45:00', '5/07/2011 23:46:00', '5/07/2011 23:47:00', '5/07/2011 23:48:00', '5/07/2011 23:49:00', '5/07/2011 23:50:00', '5/07/2011 23:51:00', '5/07/2011 23:52:00', '5/07/2011 23:53:00', '5/07/2011 23:54:00', '5/07/2011 23:55:00', '5/07/2011 23:56:00', '5/07/2011 23:57:00', '5/07/2011 23:58:00', '5/07/2011 23:59:00', '6/07/2011 0:00:00', '6/07/2011 0:01:00', '6/07/2011 0:02:00', '6/07/2011 0:03:00', '6/07/2011 0:04:00', '6/07/2011 0:05:00', '6/07/2011 0:06:00', '6/07/2011 0:07:00', '6/07/2011 0:08:00', '6/07/2011 0:09:00', '6/07/2011 0:10:00', '6/07/2011 0:11:00', '6/07/2011 0:12:00', '6/07/2011 0:13:00', '6/07/2011 0:14:00', '6/07/2011 0:15:00', '6/07/2011 0:16:00', '6/07/2011 0:17:00', '6/07/2011 0:18:00', '6/07/2011 0:19:00', '6/07/2011 0:20:00', '6/07/2011 0:21:00', '6/07/2011 0:22:00', '6/07/2011 0:23:00', '6/07/2011 0:24:00', '6/07/2011 0:25:00', '6/07/2011 0:26:00', '6/07/2011 0:27:00', '6/07/2011 0:28:00', '6/07/2011 0:29:00', '6/07/2011 0:30:00', '6/07/2011 0:31:00', 
'6/07/2011 0:32:00', '6/07/2011 0:33:00', '6/07/2011 0:34:00', '6/07/2011 0:35:00', '6/07/2011 0:36:00', '6/07/2011 0:37:00', '6/07/2011 0:38:00', '6/07/2011 0:39:00', '6/07/2011 0:40:00', '6/07/2011 0:41:00', '6/07/2011 0:42:00', '6/07/2011 0:43:00', '6/07/2011 0:44:00', '6/07/2011 0:45:00', '6/07/2011 0:46:00', '6/07/2011 0:47:00', '6/07/2011 0:48:00', '6/07/2011 0:49:00', '6/07/2011 0:50:00', '6/07/2011 0:51:00', '6/07/2011 0:52:00', '6/07/2011 0:53:00', '6/07/2011 0:54:00', '6/07/2011 0:55:00', '6/07/2011 0:56:00', '6/07/2011 0:57:00', '6/07/2011 0:58:00', '6/07/2011 0:59:00', '6/07/2011 1:00:00', '6/07/2011 1:01:00', '6/07/2011 1:02:00', '6/07/2011 1:03:00', '6/07/2011 1:04:00', '6/07/2011 1:05:00', '6/07/2011 1:06:00', '6/07/2011 1:07:00', '6/07/2011 1:08:00', '6/07/2011 1:09:00', '6/07/2011 1:10:00', '6/07/2011 1:11:00', '6/07/2011 1:12:00', '6/07/2011 1:13:00', '6/07/2011 1:14:00', '6/07/2011 1:15:00', '6/07/2011 1:16:00', '6/07/2011 1:17:00', '6/07/2011 1:18:00', '6/07/2011 1:19:00', '6/07/2011 1:20:00', '6/07/2011 1:21:00', '6/07/2011 1:22:00', '6/07/2011 1:23:00', '6/07/2011 1:24:00', '6/07/2011 1:25:00', '6/07/2011 1:26:00', '6/07/2011 1:27:00', '6/07/2011 1:28:00', '6/07/2011 1:29:00', '6/07/2011 1:30:00', '6/07/2011 1:31:00', '6/07/2011 1:32:00', '6/07/2011 1:33:00', '6/07/2011 1:34:00', '6/07/2011 1:35:00', '6/07/2011 1:36:00', '6/07/2011 1:37:00', '6/07/2011 1:38:00', '6/07/2011 1:39:00', '6/07/2011 1:40:00', '6/07/2011 1:41:00', '6/07/2011 1:42:00', '6/07/2011 1:43:00', '6/07/2011 1:44:00', '6/07/2011 1:45:00', '6/07/2011 1:46:00', '6/07/2011 1:47:00', '6/07/2011 1:48:00', '6/07/2011 1:49:00', '6/07/2011 1:50:00', '6/07/2011 1:51:00', '6/07/2011 1:52:00', '6/07/2011 1:53:00', '6/07/2011 1:54:00', '6/07/2011 1:55:00', '6/07/2011 1:56:00', '6/07/2011 1:57:00', '6/07/2011 1:58:00', '6/07/2011 1:59:00', '6/07/2011 2:00:00', '6/07/2011 2:01:00', '6/07/2011 2:02:00', '6/07/2011 2:03:00', '6/07/2011 2:04:00', '6/07/2011 2:05:00', '6/07/2011 2:06:00', 
'6/07/2011 2:07:00', '6/07/2011 2:08:00', '6/07/2011 2:09:00', '6/07/2011 2:10:00', '6/07/2011 2:11:00', '6/07/2011 2:12:00', '6/07/2011 2:13:00', '6/07/2011 2:14:00', '6/07/2011 2:15:00', '6/07/2011 2:16:00', '6/07/2011 2:17:00', '6/07/2011 2:18:00', '6/07/2011 2:19:00', '6/07/2011 2:20:00', '6/07/2011 2:21:00', '6/07/2011 2:22:00', '6/07/2011 2:23:00', '6/07/2011 2:24:00', '6/07/2011 2:25:00', '6/07/2011 2:26:00', '6/07/2011 2:27:00', '6/07/2011 2:28:00', '6/07/2011 2:29:00', '6/07/2011 2:30:00', '6/07/2011 2:31:00', '6/07/2011 2:32:00', '6/07/2011 2:33:00', '6/07/2011 2:34:00', '6/07/2011 2:35:00', '6/07/2011 2:36:00', '6/07/2011 2:37:00', '6/07/2011 2:38:00', '6/07/2011 2:39:00', '6/07/2011 2:40:00', '6/07/2011 2:41:00', '6/07/2011 2:42:00', '6/07/2011 2:43:00', '6/07/2011 2:44:00', '6/07/2011 2:45:00', '6/07/2011 2:46:00', '6/07/2011 2:47:00', '6/07/2011 2:48:00', '6/07/2011 2:49:00', '6/07/2011 2:50:00', '6/07/2011 2:51:00', '6/07/2011 2:52:00', '6/07/2011 2:53:00', '6/07/2011 2:54:00', '6/07/2011 2:55:00', '6/07/2011 2:56:00', '6/07/2011 2:57:00', '6/07/2011 2:58:00', '6/07/2011 2:59:00', '6/07/2011 3:00:00', '6/07/2011 3:01:00', '6/07/2011 3:02:00', '6/07/2011 3:03:00', '6/07/2011 3:04:00', '6/07/2011 3:05:00', '6/07/2011 3:06:00', '6/07/2011 3:07:00', '6/07/2011 3:08:00', '6/07/2011 3:09:00', '6/07/2011 3:10:00', '6/07/2011 3:11:00', '6/07/2011 3:12:00', '6/07/2011 3:13:00', '6/07/2011 3:14:00', '6/07/2011 3:15:00', '6/07/2011 3:16:00', '6/07/2011 3:17:00', '6/07/2011 3:18:00', '6/07/2011 3:19:00', '6/07/2011 3:20:00', '6/07/2011 3:21:00', '6/07/2011 3:22:00', '6/07/2011 3:23:00', '6/07/2011 3:24:00', '6/07/2011 3:25:00', '6/07/2011 3:26:00', '6/07/2011 3:27:00', '6/07/2011 3:28:00', '6/07/2011 3:29:00', '6/07/2011 3:30:00', '6/07/2011 3:31:00', '6/07/2011 3:32:00', '6/07/2011 3:33:00', '6/07/2011 3:34:00', '6/07/2011 3:35:00', '6/07/2011 3:36:00', '6/07/2011 3:37:00', '6/07/2011 3:38:00', '6/07/2011 3:39:00', '6/07/2011 3:40:00', '6/07/2011 3:41:00', 
'6/07/2011 3:42:00', '6/07/2011 3:43:00', '6/07/2011 3:44:00', '6/07/2011 3:45:00', '6/07/2011 3:46:00', '6/07/2011 3:47:00', '6/07/2011 3:48:00', '6/07/2011 3:49:00', '6/07/2011 3:50:00', '6/07/2011 3:51:00', '6/07/2011 3:52:00', '6/07/2011 3:53:00', '6/07/2011 3:54:00', '6/07/2011 3:55:00', '6/07/2011 3:56:00', '6/07/2011 3:57:00', '6/07/2011 3:58:00', '6/07/2011 3:59:00', '6/07/2011 4:00:00', '6/07/2011 4:01:00', '6/07/2011 4:02:00', '6/07/2011 4:03:00', '6/07/2011 4:04:00', '6/07/2011 4:05:00', '6/07/2011 4:06:00', '6/07/2011 4:07:00', '6/07/2011 4:08:00', '6/07/2011 4:09:00', '6/07/2011 4:10:00', '6/07/2011 4:11:00', '6/07/2011 4:12:00', '6/07/2011 4:13:00', '6/07/2011 4:14:00', '6/07/2011 4:15:00', '6/07/2011 4:16:00', '6/07/2011 4:17:00', '6/07/2011 4:18:00', '6/07/2011 4:19:00', '6/07/2011 4:20:00', '6/07/2011 4:21:00', '6/07/2011 4:22:00', '6/07/2011 4:23:00', '6/07/2011 4:24:00', '6/07/2011 4:25:00', '6/07/2011 4:26:00', '6/07/2011 4:27:00', '6/07/2011 4:28:00', '6/07/2011 4:29:00', '6/07/2011 4:30:00', '6/07/2011 4:31:00', '6/07/2011 4:32:00', '6/07/2011 4:33:00', '6/07/2011 4:34:00', '6/07/2011 4:35:00', '6/07/2011 4:36:00', '6/07/2011 4:37:00', '6/07/2011 4:38:00', '6/07/2011 4:39:00', '6/07/2011 4:40:00', '6/07/2011 4:41:00', '6/07/2011 4:42:00', '6/07/2011 4:43:00', '6/07/2011 4:44:00', '6/07/2011 4:45:00', '6/07/2011 4:46:00', '6/07/2011 4:47:00', '6/07/2011 4:48:00', '6/07/2011 4:49:00', '6/07/2011 4:50:00', '6/07/2011 4:51:00', '6/07/2011 4:52:00', '6/07/2011 4:53:00', '6/07/2011 4:54:00', '6/07/2011 4:55:00', '6/07/2011 4:56:00', '6/07/2011 4:57:00', '6/07/2011 4:58:00', '6/07/2011 4:59:00', '6/07/2011 5:00:00', '6/07/2011 5:01:00', '6/07/2011 5:02:00', '6/07/2011 5:03:00', '6/07/2011 5:04:00', '6/07/2011 5:05:00', '6/07/2011 5:06:00', '6/07/2011 5:07:00', '6/07/2011 5:08:00', '6/07/2011 5:09:00', '6/07/2011 5:10:00', '6/07/2011 5:11:00', '6/07/2011 5:12:00', '6/07/2011 5:13:00', '6/07/2011 5:14:00', '6/07/2011 5:15:00', '6/07/2011 5:16:00', 
'6/07/2011 5:17:00', '6/07/2011 5:18:00', '6/07/2011 5:19:00', '6/07/2011 5:20:00', '6/07/2011 5:21:00', '6/07/2011 5:22:00', '6/07/2011 5:23:00', '6/07/2011 5:24:00', '6/07/2011 5:25:00', '6/07/2011 5:26:00', '6/07/2011 5:27:00', '6/07/2011 5:28:00', '6/07/2011 5:29:00', '6/07/2011 5:30:00', '6/07/2011 5:31:00', '6/07/2011 5:32:00', '6/07/2011 5:33:00', '6/07/2011 5:34:00', '6/07/2011 5:35:00', '6/07/2011 5:36:00', '6/07/2011 5:37:00', '6/07/2011 5:38:00', '6/07/2011 5:39:00', '6/07/2011 5:40:00', '6/07/2011 5:41:00', '6/07/2011 5:42:00', '6/07/2011 5:43:00', '6/07/2011 5:44:00', '6/07/2011 5:45:00', '6/07/2011 5:46:00', '6/07/2011 5:47:00', '6/07/2011 5:48:00', '6/07/2011 5:49:00', '6/07/2011 5:50:00', '6/07/2011 5:51:00', '6/07/2011 5:52:00', '6/07/2011 5:53:00', '6/07/2011 5:54:00', '6/07/2011 5:55:00', '6/07/2011 5:56:00', '6/07/2011 5:57:00', '6/07/2011 5:58:00', '6/07/2011 5:59:00', '6/07/2011 6:00:00', '6/07/2011 6:01:00', '6/07/2011 6:02:00', '6/07/2011 6:03:00', '6/07/2011 6:04:00', '6/07/2011 6:05:00', '6/07/2011 6:06:00', '6/07/2011 6:07:00', '6/07/2011 6:08:00', '6/07/2011 6:09:00', '6/07/2011 6:10:00', '6/07/2011 6:11:00', '6/07/2011 6:12:00', '6/07/2011 6:13:00', '6/07/2011 6:14:00', '6/07/2011 6:15:00', '6/07/2011 6:16:00', '6/07/2011 6:17:00', '6/07/2011 6:18:00', '6/07/2011 6:19:00', '6/07/2011 6:20:00', '6/07/2011 6:21:00', '6/07/2011 6:22:00', '6/07/2011 6:23:00', '6/07/2011 6:24:00', '6/07/2011 6:25:00', '6/07/2011 6:26:00', '6/07/2011 6:27:00', '6/07/2011 6:28:00', '6/07/2011 6:29:00', '6/07/2011 6:30:00', '6/07/2011 6:31:00', '6/07/2011 6:32:00', '6/07/2011 6:33:00', '6/07/2011 6:34:00', '6/07/2011 6:35:00', '6/07/2011 6:36:00', '6/07/2011 6:37:00', '6/07/2011 6:38:00', '6/07/2011 6:39:00', '6/07/2011 6:40:00', '6/07/2011 6:41:00', '6/07/2011 6:42:00', '6/07/2011 6:43:00', '6/07/2011 6:44:00', '6/07/2011 6:45:00', '6/07/2011 6:46:00', '6/07/2011 6:47:00', '6/07/2011 6:48:00', '6/07/2011 6:49:00', '6/07/2011 6:50:00', '6/07/2011 6:51:00', 
'6/07/2011 6:52:00', '6/07/2011 6:53:00', '6/07/2011 6:54:00', '6/07/2011 6:55:00', '6/07/2011 6:56:00', '6/07/2011 6:57:00', '6/07/2011 6:58:00', '6/07/2011 6:59:00', '6/07/2011 7:00:00', '6/07/2011 7:01:00', '6/07/2011 7:02:00', '6/07/2011 7:03:00', '6/07/2011 7:04:00', '6/07/2011 7:05:00', '6/07/2011 7:06:00', '6/07/2011 7:07:00', '6/07/2011 7:08:00', '6/07/2011 7:09:00', '6/07/2011 7:10:00', '6/07/2011 7:11:00', '6/07/2011 7:12:00', '6/07/2011 7:13:00', '6/07/2011 7:14:00', '6/07/2011 7:15:00', '6/07/2011 7:16:00', '6/07/2011 7:17:00', '6/07/2011 7:18:00', '6/07/2011 7:19:00', '6/07/2011 7:20:00', '6/07/2011 7:21:00', '6/07/2011 7:22:00', '6/07/2011 7:23:00', '6/07/2011 7:24:00', '6/07/2011 7:25:00', '6/07/2011 7:26:00', '6/07/2011 7:27:00', '6/07/2011 7:28:00', '6/07/2011 7:29:00', '6/07/2011 7:30:00', '6/07/2011 7:31:00', '6/07/2011 7:32:00', '6/07/2011 7:33:00', '6/07/2011 7:34:00', '6/07/2011 7:35:00', '6/07/2011 7:36:00', '6/07/2011 7:37:00', '6/07/2011 7:38:00', '6/07/2011 7:39:00', '6/07/2011 7:40:00', '6/07/2011 7:41:00', '6/07/2011 7:42:00', '6/07/2011 7:43:00', '6/07/2011 7:44:00', '6/07/2011 7:45:00', '6/07/2011 7:46:00', '6/07/2011 7:47:00', '6/07/2011 7:48:00', '6/07/2011 7:49:00', '6/07/2011 7:50:00', '6/07/2011 7:51:00', '6/07/2011 7:52:00', '6/07/2011 7:53:00', '6/07/2011 7:54:00', '6/07/2011 7:55:00', '6/07/2011 7:56:00', '6/07/2011 7:57:00', '6/07/2011 7:58:00', '6/07/2011 7:59:00', '6/07/2011 8:00:00', '6/07/2011 8:01:00', '6/07/2011 8:02:00', '6/07/2011 8:03:00', '6/07/2011 8:04:00', '6/07/2011 8:05:00', '6/07/2011 8:06:00', '6/07/2011 8:07:00', '6/07/2011 8:08:00', '6/07/2011 8:09:00', '6/07/2011 8:10:00', '6/07/2011 8:11:00', '6/07/2011 8:12:00', '6/07/2011 8:13:00', '6/07/2011 8:14:00', '6/07/2011 8:15:00', '6/07/2011 8:16:00', '6/07/2011 8:17:00', '6/07/2011 8:18:00', '6/07/2011 8:19:00', '6/07/2011 8:20:00', '6/07/2011 8:21:00', '6/07/2011 8:22:00', '6/07/2011 8:23:00', '6/07/2011 8:24:00', '6/07/2011 8:25:00', '6/07/2011 8:26:00', 
'6/07/2011 8:27:00', '6/07/2011 8:28:00', '6/07/2011 8:29:00', '6/07/2011 8:30:00', '6/07/2011 8:31:00', '6/07/2011 8:32:00', '6/07/2011 8:33:00', '6/07/2011 8:34:00', '6/07/2011 8:35:00', '6/07/2011 8:36:00', '6/07/2011 8:37:00', '6/07/2011 8:38:00', '6/07/2011 8:39:00', '6/07/2011 8:40:00', '6/07/2011 8:41:00', '6/07/2011 8:42:00', '6/07/2011 8:43:00', '6/07/2011 8:44:00', '6/07/2011 8:45:00', '6/07/2011 8:46:00', '6/07/2011 8:47:00', '6/07/2011 8:48:00', '6/07/2011 8:49:00', '6/07/2011 8:50:00', '6/07/2011 8:51:00', '6/07/2011 8:52:00', '6/07/2011 8:53:00', '6/07/2011 8:54:00', '6/07/2011 8:55:00', '6/07/2011 8:56:00', '6/07/2011 8:57:00', '6/07/2011 8:58:00', '6/07/2011 8:59:00', '6/07/2011 9:00:00', '6/07/2011 9:01:00', '6/07/2011 9:02:00', '6/07/2011 9:03:00', '6/07/2011 9:04:00', '6/07/2011 9:05:00', '6/07/2011 9:06:00', '6/07/2011 9:07:00', '6/07/2011 9:08:00', '6/07/2011 9:09:00', '6/07/2011 9:10:00', '6/07/2011 9:11:00', '6/07/2011 9:12:00', '6/07/2011 9:13:00', '6/07/2011 9:14:00', '6/07/2011 9:15:00', '6/07/2011 9:16:00', '6/07/2011 9:17:00', '6/07/2011 9:18:00', '6/07/2011 9:19:00', '6/07/2011 9:20:00', '6/07/2011 9:21:00', '6/07/2011 9:22:00', '6/07/2011 9:23:00', '6/07/2011 9:24:00', '6/07/2011 9:25:00', '6/07/2011 9:26:00', '6/07/2011 9:27:00', '6/07/2011 9:28:00', '6/07/2011 9:29:00', '6/07/2011 9:30:00', '6/07/2011 9:31:00', '6/07/2011 9:32:00', '6/07/2011 9:33:00', '6/07/2011 9:34:00', '6/07/2011 9:35:00', '6/07/2011 9:36:00', '6/07/2011 9:37:00', '6/07/2011 9:38:00', '6/07/2011 9:39:00', '6/07/2011 9:40:00', '6/07/2011 9:41:00', '6/07/2011 9:42:00', '6/07/2011 9:43:00', '6/07/2011 9:44:00', '6/07/2011 9:45:00', '6/07/2011 9:46:00', '6/07/2011 9:47:00', '6/07/2011 9:48:00', '6/07/2011 9:49:00', '6/07/2011 9:50:00', '6/07/2011 9:51:00', '6/07/2011 9:52:00', '6/07/2011 9:53:00', '6/07/2011 9:54:00', '6/07/2011 9:55:00', '6/07/2011 9:56:00', '6/07/2011 9:57:00', '6/07/2011 9:58:00', '6/07/2011 9:59:00', '6/07/2011 10:00:00', '6/07/2011 10:01:00', 
'6/07/2011 10:02:00', '6/07/2011 10:03:00', '6/07/2011 10:04:00', '6/07/2011 10:05:00', '6/07/2011 10:06:00', '6/07/2011 10:07:00', '6/07/2011 10:08:00', '6/07/2011 10:09:00', '6/07/2011 10:10:00', '6/07/2011 10:11:00', '6/07/2011 10:12:00', '6/07/2011 10:13:00', '6/07/2011 10:14:00', '6/07/2011 10:15:00', '6/07/2011 10:16:00', '6/07/2011 10:17:00', '6/07/2011 10:18:00', '6/07/2011 10:19:00', '6/07/2011 10:20:00', '6/07/2011 10:21:00', '6/07/2011 10:22:00', '6/07/2011 10:23:00', '6/07/2011 10:24:00', '6/07/2011 10:25:00', '6/07/2011 10:26:00', '6/07/2011 10:27:00', '6/07/2011 10:28:00', '6/07/2011 10:29:00', '6/07/2011 10:30:00', '6/07/2011 10:31:00', '6/07/2011 10:32:00', '6/07/2011 10:33:00', '6/07/2011 10:34:00', '6/07/2011 10:35:00', '6/07/2011 10:36:00', '6/07/2011 10:37:00', '6/07/2011 10:38:00', '6/07/2011 10:39:00', '6/07/2011 10:40:00', '6/07/2011 10:41:00', '6/07/2011 10:42:00', '6/07/2011 10:43:00', '6/07/2011 10:44:00', '6/07/2011 10:45:00', '6/07/2011 10:46:00', '6/07/2011 10:47:00', '6/07/2011 10:48:00', '6/07/2011 10:49:00', '6/07/2011 10:50:00', '6/07/2011 10:51:00', '6/07/2011 10:52:00', '6/07/2011 10:53:00', '6/07/2011 10:54:00', '6/07/2011 10:55:00', '6/07/2011 10:56:00', '6/07/2011 10:57:00', '6/07/2011 10:58:00', '6/07/2011 10:59:00', '6/07/2011 11:00:00', '6/07/2011 11:01:00', '6/07/2011 11:02:00', '6/07/2011 11:03:00', '6/07/2011 11:04:00', '6/07/2011 11:05:00', '6/07/2011 11:06:00', '6/07/2011 11:07:00', '6/07/2011 11:08:00', '6/07/2011 11:09:00', '6/07/2011 11:10:00', '6/07/2011 11:11:00', '6/07/2011 11:12:00', '6/07/2011 11:13:00', '6/07/2011 11:14:00', '6/07/2011 11:15:00', '6/07/2011 11:16:00', '6/07/2011 11:17:00', '6/07/2011 11:18:00', '6/07/2011 11:19:00', '6/07/2011 11:20:00', '6/07/2011 11:21:00', '6/07/2011 11:22:00', '6/07/2011 11:23:00', '6/07/2011 11:24:00', '6/07/2011 11:25:00', '6/07/2011 11:26:00', '6/07/2011 11:27:00', '6/07/2011 11:28:00', '6/07/2011 11:29:00', '6/07/2011 11:30:00', '6/07/2011 11:31:00', '6/07/2011 
11:32:00', '6/07/2011 11:33:00', '6/07/2011 11:34:00', '6/07/2011 11:35:00', '6/07/2011 11:36:00', '6/07/2011 11:37:00', '6/07/2011 11:38:00', '6/07/2011 11:39:00', '6/07/2011 11:40:00', '6/07/2011 11:41:00', '6/07/2011 11:42:00', '6/07/2011 11:43:00', '6/07/2011 11:44:00', '6/07/2011 11:45:00', '6/07/2011 11:46:00', '6/07/2011 11:47:00', '6/07/2011 11:48:00', '6/07/2011 11:49:00', '6/07/2011 11:50:00', '6/07/2011 11:51:00', '6/07/2011 11:52:00', '6/07/2011 11:53:00', '6/07/2011 11:54:00', '6/07/2011 11:55:00', '6/07/2011 11:56:00', '6/07/2011 11:57:00', '6/07/2011 11:58:00', '6/07/2011 11:59:00', '6/07/2011 12:00:00', '6/07/2011 12:01:00', '6/07/2011 12:02:00', '6/07/2011 12:03:00', '6/07/2011 12:04:00', '6/07/2011 12:05:00', '6/07/2011 12:06:00', '6/07/2011 12:07:00', '6/07/2011 12:08:00', '6/07/2011 12:09:00', '6/07/2011 12:10:00', '6/07/2011 12:11:00', '6/07/2011 12:12:00', '6/07/2011 12:13:00', '6/07/2011 12:14:00', '6/07/2011 12:15:00', '6/07/2011 12:16:00', '6/07/2011 12:17:00', '6/07/2011 12:18:00', '6/07/2011 12:19:00', '6/07/2011 12:20:00', '6/07/2011 12:21:00', '6/07/2011 12:22:00', '6/07/2011 12:23:00', '6/07/2011 12:24:00', '6/07/2011 12:25:00', '6/07/2011 12:26:00', '6/07/2011 12:27:00', '6/07/2011 12:28:00', '6/07/2011 12:29:00', '6/07/2011 12:30:00', '6/07/2011 12:31:00', '6/07/2011 12:32:00', '6/07/2011 12:33:00', '6/07/2011 12:34:00', '6/07/2011 12:35:00', '6/07/2011 12:36:00', '6/07/2011 12:37:00', '6/07/2011 12:38:00', '6/07/2011 12:39:00', '6/07/2011 12:40:00', '6/07/2011 12:41:00', '6/07/2011 12:42:00', '6/07/2011 12:43:00', '6/07/2011 12:44:00', '6/07/2011 12:45:00', '6/07/2011 12:46:00', '6/07/2011 12:47:00', '6/07/2011 12:48:00', '6/07/2011 12:49:00', '6/07/2011 12:50:00', '6/07/2011 12:51:00', '6/07/2011 12:52:00', '6/07/2011 12:53:00', '6/07/2011 12:54:00', '6/07/2011 12:55:00', '6/07/2011 12:56:00', '6/07/2011 12:57:00', '6/07/2011 12:58:00', '6/07/2011 12:59:00', '6/07/2011 13:00:00', '6/07/2011 13:01:00', '6/07/2011 13:02:00', 
'6/07/2011 13:03:00', '6/07/2011 13:04:00', '6/07/2011 13:05:00', '6/07/2011 13:06:00', '6/07/2011 13:07:00', '6/07/2011 13:08:00', '6/07/2011 13:09:00', '6/07/2011 13:10:00', '6/07/2011 13:11:00', '6/07/2011 13:12:00', '6/07/2011 13:13:00', '6/07/2011 13:14:00', '6/07/2011 13:15:00', '6/07/2011 13:16:00', '6/07/2011 13:17:00', '6/07/2011 13:18:00', '6/07/2011 13:19:00', '6/07/2011 13:20:00', '6/07/2011 13:21:00', '6/07/2011 13:22:00', '6/07/2011 13:23:00', '6/07/2011 13:24:00', '6/07/2011 13:25:00', '6/07/2011 13:26:00', '6/07/2011 13:27:00', '6/07/2011 13:28:00', '6/07/2011 13:29:00', '6/07/2011 13:30:00', '6/07/2011 13:31:00', '6/07/2011 13:32:00', '6/07/2011 13:33:00', '6/07/2011 13:34:00', '6/07/2011 13:35:00', '6/07/2011 13:36:00', '6/07/2011 13:37:00', '6/07/2011 13:38:00', '6/07/2011 13:39:00', '6/07/2011 13:40:00', '6/07/2011 13:41:00', '6/07/2011 13:42:00', '6/07/2011 13:43:00', '6/07/2011 13:44:00', '6/07/2011 13:45:00', '6/07/2011 13:46:00', '6/07/2011 13:47:00', '6/07/2011 13:48:00', '6/07/2011 13:49:00', '6/07/2011 13:50:00', '6/07/2011 13:51:00', '6/07/2011 13:52:00', '6/07/2011 13:53:00', '6/07/2011 13:54:00', '6/07/2011 13:55:00', '6/07/2011 13:56:00', '6/07/2011 13:57:00', '6/07/2011 13:58:00', '6/07/2011 13:59:00', '6/07/2011 14:00:00', '6/07/2011 14:01:00', '6/07/2011 14:02:00', '6/07/2011 14:03:00', '6/07/2011 14:04:00', '6/07/2011 14:05:00', '6/07/2011 14:06:00', '6/07/2011 14:07:00', '6/07/2011 14:08:00', '6/07/2011 14:09:00', '6/07/2011 14:10:00', '6/07/2011 14:11:00', '6/07/2011 14:12:00', '6/07/2011 14:13:00', '6/07/2011 14:14:00', '6/07/2011 14:15:00', '6/07/2011 14:16:00', '6/07/2011 14:17:00', '6/07/2011 14:18:00', '6/07/2011 14:19:00', '6/07/2011 14:20:00', '6/07/2011 14:21:00', '6/07/2011 14:22:00', '6/07/2011 14:23:00', '6/07/2011 14:24:00', '6/07/2011 14:25:00', '6/07/2011 14:26:00', '6/07/2011 14:27:00', '6/07/2011 14:28:00', '6/07/2011 14:29:00', '6/07/2011 14:30:00', '6/07/2011 14:31:00', '6/07/2011 14:32:00', '6/07/2011 
14:33:00', '6/07/2011 14:34:00', '6/07/2011 14:35:00', '6/07/2011 14:36:00', '6/07/2011 14:37:00', '6/07/2011 14:38:00', '6/07/2011 14:39:00', '6/07/2011 14:40:00', '6/07/2011 14:41:00', '6/07/2011 14:42:00', '6/07/2011 14:43:00', '6/07/2011 14:44:00', '6/07/2011 14:45:00', '6/07/2011 14:46:00', '6/07/2011 14:47:00', '6/07/2011 14:48:00', '6/07/2011 14:49:00', '6/07/2011 14:50:00', '6/07/2011 14:51:00', '6/07/2011 14:52:00', '6/07/2011 14:53:00', '6/07/2011 14:54:00', '6/07/2011 14:55:00', '6/07/2011 14:56:00', '6/07/2011 14:57:00', '6/07/2011 14:58:00', '6/07/2011 14:59:00', '6/07/2011 15:00:00', '6/07/2011 15:01:00', '6/07/2011 15:02:00', '6/07/2011 15:03:00', '6/07/2011 15:04:00', '6/07/2011 15:05:00', '6/07/2011 15:06:00', '6/07/2011 15:07:00', '6/07/2011 15:08:00', '6/07/2011 15:09:00', '6/07/2011 15:10:00', '6/07/2011 15:11:00', '6/07/2011 15:12:00', '6/07/2011 15:13:00', '6/07/2011 15:14:00', '6/07/2011 15:15:00', '6/07/2011 15:16:00', '6/07/2011 15:17:00', '6/07/2011 15:18:00', '6/07/2011 15:19:00', '6/07/2011 15:20:00', '6/07/2011 15:21:00', '6/07/2011 15:22:00', '6/07/2011 15:23:00', '6/07/2011 15:24:00', '6/07/2011 15:25:00', '6/07/2011 15:26:00', '6/07/2011 15:27:00', '6/07/2011 15:28:00', '6/07/2011 15:29:00', '6/07/2011 15:30:00', '6/07/2011 15:31:00', '6/07/2011 15:32:00', '6/07/2011 15:33:00', '6/07/2011 15:34:00', '6/07/2011 15:35:00', '6/07/2011 15:36:00', '6/07/2011 15:37:00', '6/07/2011 15:38:00', '6/07/2011 15:39:00', '6/07/2011 15:40:00', '6/07/2011 15:41:00', '6/07/2011 15:42:00', '6/07/2011 15:43:00', '6/07/2011 15:44:00', '6/07/2011 15:45:00', '6/07/2011 15:46:00', '6/07/2011 15:47:00', '6/07/2011 15:48:00', '6/07/2011 15:49:00', '6/07/2011 15:50:00', '6/07/2011 15:51:00', '6/07/2011 15:52:00', '6/07/2011 15:53:00', '6/07/2011 15:54:00', '6/07/2011 15:55:00', '6/07/2011 15:56:00', '6/07/2011 15:57:00', '6/07/2011 15:58:00', '6/07/2011 15:59:00', '6/07/2011 16:00:00', '6/07/2011 16:01:00', '6/07/2011 16:02:00', '6/07/2011 16:03:00', 
'6/07/2011 16:04:00', '6/07/2011 16:05:00', '6/07/2011 16:06:00', '6/07/2011 16:07:00', '6/07/2011 16:08:00', '6/07/2011 16:09:00', '6/07/2011 16:10:00', '6/07/2011 16:11:00', '6/07/2011 16:12:00', '6/07/2011 16:13:00', '6/07/2011 16:14:00', '6/07/2011 16:15:00', '6/07/2011 16:16:00', '6/07/2011 16:17:00', '6/07/2011 16:18:00', '6/07/2011 16:19:00', '6/07/2011 16:20:00', '6/07/2011 16:21:00', '6/07/2011 16:22:00', '6/07/2011 16:23:00', '6/07/2011 16:24:00', '6/07/2011 16:25:00', '6/07/2011 16:26:00', '6/07/2011 16:27:00', '6/07/2011 16:28:00', '6/07/2011 16:29:00', '6/07/2011 16:30:00', '6/07/2011 16:31:00', '6/07/2011 16:32:00', '6/07/2011 16:33:00', '6/07/2011 16:34:00', '6/07/2011 16:35:00', '6/07/2011 16:36:00', '6/07/2011 16:37:00', '6/07/2011 16:38:00', '6/07/2011 16:39:00', '6/07/2011 16:40:00', '6/07/2011 16:41:00', '6/07/2011 16:42:00', '6/07/2011 16:43:00', '6/07/2011 16:44:00', '6/07/2011 16:45:00', '6/07/2011 16:46:00', '6/07/2011 16:47:00', '6/07/2011 16:48:00', '6/07/2011 16:49:00', '6/07/2011 16:50:00', '6/07/2011 16:51:00', '6/07/2011 16:52:00', '6/07/2011 16:53:00', '6/07/2011 16:54:00', '6/07/2011 16:55:00', '6/07/2011 16:56:00', '6/07/2011 16:57:00', '6/07/2011 16:58:00', '6/07/2011 16:59:00', '6/07/2011 17:00:00', '6/07/2011 17:01:00', '6/07/2011 17:02:00', '6/07/2011 17:03:00', '6/07/2011 17:04:00', '6/07/2011 17:05:00', '6/07/2011 17:06:00', '6/07/2011 17:07:00', '6/07/2011 17:08:00', '6/07/2011 17:09:00', '6/07/2011 17:10:00', '6/07/2011 17:11:00', '6/07/2011 17:12:00', '6/07/2011 17:13:00', '6/07/2011 17:14:00', '6/07/2011 17:15:00', '6/07/2011 17:16:00', '6/07/2011 17:17:00', '6/07/2011 17:18:00', '6/07/2011 17:19:00', '6/07/2011 17:20:00', '6/07/2011 17:21:00', '6/07/2011 17:22:00', '6/07/2011 17:23:00', '6/07/2011 17:24:00', '6/07/2011 17:25:00', '6/07/2011 17:26:00', '6/07/2011 17:27:00', '6/07/2011 17:28:00', '6/07/2011 17:29:00', '6/07/2011 17:30:00', '6/07/2011 17:31:00', '6/07/2011 17:32:00', '6/07/2011 17:33:00', '6/07/2011 
17:34:00', '6/07/2011 17:35:00', '6/07/2011 17:36:00', '6/07/2011 17:37:00', '6/07/2011 17:38:00', '6/07/2011 17:39:00', '6/07/2011 17:40:00', '6/07/2011 17:41:00', '6/07/2011 17:42:00', '6/07/2011 17:43:00', '6/07/2011 17:44:00', '6/07/2011 17:45:00', '6/07/2011 17:46:00', '6/07/2011 17:47:00', '6/07/2011 17:48:00', '6/07/2011 17:49:00', '6/07/2011 17:50:00', '6/07/2011 17:51:00', '6/07/2011 17:52:00', '6/07/2011 17:53:00', '6/07/2011 17:54:00', '6/07/2011 17:55:00', '6/07/2011 17:56:00', '6/07/2011 17:57:00', '6/07/2011 17:58:00', '6/07/2011 17:59:00', '6/07/2011 18:00:00', '6/07/2011 18:01:00', '6/07/2011 18:02:00', '6/07/2011 18:03:00', '6/07/2011 18:04:00', '6/07/2011 18:05:00', '6/07/2011 18:06:00', '6/07/2011 18:07:00', '6/07/2011 18:08:00', '6/07/2011 18:09:00', '6/07/2011 18:10:00', '6/07/2011 18:11:00', '6/07/2011 18:12:00', '6/07/2011 18:13:00', '6/07/2011 18:14:00', '6/07/2011 18:15:00', '6/07/2011 18:16:00', '6/07/2011 18:17:00', '6/07/2011 18:18:00', '6/07/2011 18:19:00', '6/07/2011 18:20:00', '6/07/2011 18:21:00', '6/07/2011 18:22:00', '6/07/2011 18:23:00', '6/07/2011 18:24:00', '6/07/2011 18:25:00', '6/07/2011 18:26:00', '6/07/2011 18:27:00', '6/07/2011 18:28:00', '6/07/2011 18:29:00', '6/07/2011 18:30:00', '6/07/2011 18:31:00', '6/07/2011 18:32:00', '6/07/2011 18:33:00', '6/07/2011 18:34:00', '6/07/2011 18:35:00', '6/07/2011 18:36:00', '6/07/2011 18:37:00', '6/07/2011 18:38:00', '6/07/2011 18:39:00', '6/07/2011 18:40:00', '6/07/2011 18:41:00', '6/07/2011 18:42:00', '6/07/2011 18:43:00', '6/07/2011 18:44:00', '6/07/2011 18:45:00', '6/07/2011 18:46:00', '6/07/2011 18:47:00', '6/07/2011 18:48:00', '6/07/2011 18:49:00', '6/07/2011 18:50:00', '6/07/2011 18:51:00', '6/07/2011 18:52:00', '6/07/2011 18:53:00', '6/07/2011 18:54:00', '6/07/2011 18:55:00', '6/07/2011 18:56:00', '6/07/2011 18:57:00', '6/07/2011 18:58:00', '6/07/2011 18:59:00', '6/07/2011 19:00:00', '6/07/2011 19:01:00', '6/07/2011 19:02:00', '6/07/2011 19:03:00', '6/07/2011 19:04:00', 
'6/07/2011 19:05:00', '6/07/2011 19:06:00', '6/07/2011 19:07:00', '6/07/2011 19:08:00', '6/07/2011 19:09:00', '6/07/2011 19:10:00', '6/07/2011 19:11:00', '6/07/2011 19:12:00', '6/07/2011 19:13:00', '6/07/2011 19:14:00', '6/07/2011 19:15:00', '6/07/2011 19:16:00', '6/07/2011 19:17:00', '6/07/2011 19:18:00', '6/07/2011 19:19:00', '6/07/2011 19:20:00', '6/07/2011 19:21:00', '6/07/2011 19:22:00', '6/07/2011 19:23:00', '6/07/2011 19:24:00', '6/07/2011 19:25:00', '6/07/2011 19:26:00', '6/07/2011 19:27:00', '6/07/2011 19:28:00', '6/07/2011 19:29:00', '6/07/2011 19:30:00', '6/07/2011 19:31:00', '6/07/2011 19:32:00', '6/07/2011 19:33:00', '6/07/2011 19:34:00', '6/07/2011 19:35:00', '6/07/2011 19:36:00', '6/07/2011 19:37:00', '6/07/2011 19:38:00', '6/07/2011 19:39:00', '6/07/2011 19:40:00', '6/07/2011 19:41:00', '6/07/2011 19:42:00', '6/07/2011 19:43:00', '6/07/2011 19:44:00', '6/07/2011 19:45:00', '6/07/2011 19:46:00', '6/07/2011 19:47:00', '6/07/2011 19:48:00', '6/07/2011 19:49:00', '6/07/2011 19:50:00', '6/07/2011 19:51:00', '6/07/2011 19:52:00', '6/07/2011 19:53:00', '6/07/2011 19:54:00', '6/07/2011 19:55:00', '6/07/2011 19:56:00', '6/07/2011 19:57:00', '6/07/2011 19:58:00', '6/07/2011 19:59:00', '6/07/2011 20:00:00', '6/07/2011 20:01:00', '6/07/2011 20:02:00', '6/07/2011 20:03:00', '6/07/2011 20:04:00', '6/07/2011 20:05:00', '6/07/2011 20:06:00', '6/07/2011 20:07:00', '6/07/2011 20:08:00', '6/07/2011 20:09:00', '6/07/2011 20:10:00', '6/07/2011 20:11:00', '6/07/2011 20:12:00', '6/07/2011 20:13:00', '6/07/2011 20:14:00', '6/07/2011 20:15:00', '6/07/2011 20:16:00', '6/07/2011 20:17:00', '6/07/2011 20:18:00', '6/07/2011 20:19:00', '6/07/2011 20:20:00', '6/07/2011 20:21:00', '6/07/2011 20:22:00', '6/07/2011 20:23:00', '6/07/2011 20:24:00', '6/07/2011 20:25:00', '6/07/2011 20:26:00', '6/07/2011 20:27:00', '6/07/2011 20:28:00', '6/07/2011 20:29:00', '6/07/2011 20:30:00', '6/07/2011 20:31:00', '6/07/2011 20:32:00', '6/07/2011 20:33:00', '6/07/2011 20:34:00', '6/07/2011 
20:35:00', '6/07/2011 20:36:00', '6/07/2011 20:37:00', '6/07/2011 20:38:00', '6/07/2011 20:39:00', '6/07/2011 20:40:00', '6/07/2011 20:41:00', '6/07/2011 20:42:00', '6/07/2011 20:43:00', '6/07/2011 20:44:00', '6/07/2011 20:45:00', '6/07/2011 20:46:00', '6/07/2011 20:47:00', '6/07/2011 20:48:00', '6/07/2011 20:49:00', '6/07/2011 20:50:00', '6/07/2011 20:51:00', '6/07/2011 20:52:00', '6/07/2011 20:53:00', '6/07/2011 20:54:00', '6/07/2011 20:55:00', '6/07/2011 20:56:00', '6/07/2011 20:57:00', '6/07/2011 20:58:00', '6/07/2011 20:59:00', '6/07/2011 21:00:00', '6/07/2011 21:01:00', '6/07/2011 21:02:00', '6/07/2011 21:03:00', '6/07/2011 21:04:00', '6/07/2011 21:05:00', '6/07/2011 21:06:00', '6/07/2011 21:07:00', '6/07/2011 21:08:00', '6/07/2011 21:09:00', '6/07/2011 21:10:00', '6/07/2011 21:11:00', '6/07/2011 21:12:00', '6/07/2011 21:13:00', '6/07/2011 21:14:00', '6/07/2011 21:15:00', '6/07/2011 21:16:00', '6/07/2011 21:17:00', '6/07/2011 21:18:00', '6/07/2011 21:19:00', '6/07/2011 21:20:00', '6/07/2011 21:21:00', '6/07/2011 21:22:00', '6/07/2011 21:23:00', '6/07/2011 21:24:00', '6/07/2011 21:25:00', '6/07/2011 21:26:00', '6/07/2011 21:27:00', '6/07/2011 21:28:00', '6/07/2011 21:29:00', '6/07/2011 21:30:00', '6/07/2011 21:31:00', '6/07/2011 21:32:00', '6/07/2011 21:33:00', '6/07/2011 21:34:00', '6/07/2011 21:35:00', '6/07/2011 21:36:00', '6/07/2011 21:37:00', '6/07/2011 21:38:00', '6/07/2011 21:39:00', '6/07/2011 21:40:00', '6/07/2011 21:41:00', '6/07/2011 21:42:00', '6/07/2011 21:43:00', '6/07/2011 21:44:00', '6/07/2011 21:45:00', '6/07/2011 21:46:00', '6/07/2011 21:47:00', '6/07/2011 21:48:00', '6/07/2011 21:49:00', '6/07/2011 21:50:00', '6/07/2011 21:51:00', '6/07/2011 21:52:00', '6/07/2011 21:53:00', '6/07/2011 21:54:00', '6/07/2011 21:55:00', '6/07/2011 21:56:00', '6/07/2011 21:57:00', '6/07/2011 21:58:00', '6/07/2011 21:59:00', '6/07/2011 22:00:00', '6/07/2011 22:01:00', '6/07/2011 22:02:00', '6/07/2011 22:03:00', '6/07/2011 22:04:00', '6/07/2011 22:05:00', 
'6/07/2011 22:06:00', '6/07/2011 22:07:00', '6/07/2011 22:08:00', '6/07/2011 22:09:00', '6/07/2011 22:10:00', '6/07/2011 22:11:00', '6/07/2011 22:12:00', '6/07/2011 22:13:00', '6/07/2011 22:14:00', '6/07/2011 22:15:00', '6/07/2011 22:16:00', '6/07/2011 22:17:00', '6/07/2011 22:18:00', '6/07/2011 22:19:00', '6/07/2011 22:20:00', '6/07/2011 22:21:00', '6/07/2011 22:22:00', '6/07/2011 22:23:00', '6/07/2011 22:24:00', '6/07/2011 22:25:00', '6/07/2011 22:26:00', '6/07/2011 22:27:00', '6/07/2011 22:28:00', '6/07/2011 22:29:00', '6/07/2011 22:30:00', '6/07/2011 22:31:00', '6/07/2011 22:32:00', '6/07/2011 22:33:00', '6/07/2011 22:34:00', '6/07/2011 22:35:00', '6/07/2011 22:36:00', '6/07/2011 22:37:00', '6/07/2011 22:38:00', '6/07/2011 22:39:00', '6/07/2011 22:40:00', '6/07/2011 22:41:00', '6/07/2011 22:42:00', '6/07/2011 22:43:00', '6/07/2011 22:44:00', '6/07/2011 22:45:00', '6/07/2011 22:46:00', '6/07/2011 22:47:00', '6/07/2011 22:48:00', '6/07/2011 22:49:00', '6/07/2011 22:50:00', '6/07/2011 22:51:00', '6/07/2011 22:52:00', '6/07/2011 22:53:00', '6/07/2011 22:54:00', '6/07/2011 22:55:00', '6/07/2011 22:56:00', '6/07/2011 22:57:00', '6/07/2011 22:58:00', '6/07/2011 22:59:00', '6/07/2011 23:00:00', '6/07/2011 23:01:00', '6/07/2011 23:02:00', '6/07/2011 23:03:00', '6/07/2011 23:04:00', '6/07/2011 23:05:00', '6/07/2011 23:06:00', '6/07/2011 23:07:00', '6/07/2011 23:08:00', '6/07/2011 23:09:00', '6/07/2011 23:10:00', '6/07/2011 23:11:00', '6/07/2011 23:12:00', '6/07/2011 23:13:00', '6/07/2011 23:14:00', '6/07/2011 23:15:00', '6/07/2011 23:16:00', '6/07/2011 23:17:00', '6/07/2011 23:18:00', '6/07/2011 23:19:00', '6/07/2011 23:20:00', '6/07/2011 23:21:00', '6/07/2011 23:22:00', '6/07/2011 23:23:00', '6/07/2011 23:24:00', '6/07/2011 23:25:00', '6/07/2011 23:26:00', '6/07/2011 23:27:00', '6/07/2011 23:28:00', '6/07/2011 23:29:00', '6/07/2011 23:30:00', '6/07/2011 23:31:00', '6/07/2011 23:32:00', '6/07/2011 23:33:00', '6/07/2011 23:34:00', '6/07/2011 23:35:00', '6/07/2011 
23:36:00', '6/07/2011 23:37:00', '6/07/2011 23:38:00', '6/07/2011 23:39:00', '6/07/2011 23:40:00', '6/07/2011 23:41:00', '6/07/2011 23:42:00', '6/07/2011 23:43:00', '6/07/2011 23:44:00', '6/07/2011 23:45:00', '6/07/2011 23:46:00', '6/07/2011 23:47:00', '6/07/2011 23:48:00', '6/07/2011 23:49:00', '6/07/2011 23:50:00', '6/07/2011 23:51:00', '6/07/2011 23:52:00', '6/07/2011 23:53:00', '6/07/2011 23:54:00', '6/07/2011 23:55:00', '6/07/2011 23:56:00', '6/07/2011 23:57:00', '6/07/2011 23:58:00', '6/07/2011 23:59:00', '7/07/2011 0:00:00', '7/07/2011 0:01:00', '7/07/2011 0:02:00', '7/07/2011 0:03:00', '7/07/2011 0:04:00', '7/07/2011 0:05:00', '7/07/2011 0:06:00', '7/07/2011 0:07:00', '7/07/2011 0:08:00', '7/07/2011 0:09:00', '7/07/2011 0:10:00', '7/07/2011 0:11:00', '7/07/2011 0:12:00', '7/07/2011 0:13:00', '7/07/2011 0:14:00', '7/07/2011 0:15:00', '7/07/2011 0:16:00', '7/07/2011 0:17:00', '7/07/2011 0:18:00', '7/07/2011 0:19:00', '7/07/2011 0:20:00', '7/07/2011 0:21:00', '7/07/2011 0:22:00', '7/07/2011 0:23:00', '7/07/2011 0:24:00', '7/07/2011 0:25:00', '7/07/2011 0:26:00', '7/07/2011 0:27:00', '7/07/2011 0:28:00', '7/07/2011 0:29:00', '7/07/2011 0:30:00', '7/07/2011 0:31:00', '7/07/2011 0:32:00', '7/07/2011 0:33:00', '7/07/2011 0:34:00', '7/07/2011 0:35:00', '7/07/2011 0:36:00', '7/07/2011 0:37:00', '7/07/2011 0:38:00', '7/07/2011 0:39:00', '7/07/2011 0:40:00', '7/07/2011 0:41:00', '7/07/2011 0:42:00', '7/07/2011 0:43:00', '7/07/2011 0:44:00', '7/07/2011 0:45:00', '7/07/2011 0:46:00', '7/07/2011 0:47:00', '7/07/2011 0:48:00', '7/07/2011 0:49:00', '7/07/2011 0:50:00', '7/07/2011 0:51:00', '7/07/2011 0:52:00', '7/07/2011 0:53:00', '7/07/2011 0:54:00', '7/07/2011 0:55:00', '7/07/2011 0:56:00', '7/07/2011 0:57:00', '7/07/2011 0:58:00', '7/07/2011 0:59:00', '7/07/2011 1:00:00', '7/07/2011 1:01:00', '7/07/2011 1:02:00', '7/07/2011 1:03:00', '7/07/2011 1:04:00', '7/07/2011 1:05:00', '7/07/2011 1:06:00', '7/07/2011 1:07:00', '7/07/2011 1:08:00', '7/07/2011 1:09:00', '7/07/2011 
1:10:00', '7/07/2011 1:11:00', '7/07/2011 1:12:00', '7/07/2011 1:13:00', '7/07/2011 1:14:00', '7/07/2011 1:15:00', '7/07/2011 1:16:00', '7/07/2011 1:17:00', '7/07/2011 1:18:00', '7/07/2011 1:19:00', '7/07/2011 1:20:00', '7/07/2011 1:21:00', '7/07/2011 1:22:00', '7/07/2011 1:23:00', '7/07/2011 1:24:00', '7/07/2011 1:25:00', '7/07/2011 1:26:00', '7/07/2011 1:27:00', '7/07/2011 1:28:00', '7/07/2011 1:29:00', '7/07/2011 1:30:00', '7/07/2011 1:31:00', '7/07/2011 1:32:00', '7/07/2011 1:33:00', '7/07/2011 1:34:00', '7/07/2011 1:35:00', '7/07/2011 1:36:00', '7/07/2011 1:37:00', '7/07/2011 1:38:00', '7/07/2011 1:39:00', '7/07/2011 1:40:00', '7/07/2011 1:41:00', '7/07/2011 1:42:00', '7/07/2011 1:43:00', '7/07/2011 1:44:00', '7/07/2011 1:45:00', '7/07/2011 1:46:00', '7/07/2011 1:47:00', '7/07/2011 1:48:00', '7/07/2011 1:49:00', '7/07/2011 1:50:00', '7/07/2011 1:51:00', '7/07/2011 1:52:00', '7/07/2011 1:53:00', '7/07/2011 1:54:00', '7/07/2011 1:55:00', '7/07/2011 1:56:00', '7/07/2011 1:57:00', '7/07/2011 1:58:00', '7/07/2011 1:59:00', '7/07/2011 2:00:00', '7/07/2011 2:01:00', '7/07/2011 2:02:00', '7/07/2011 2:03:00', '7/07/2011 2:04:00', '7/07/2011 2:05:00', '7/07/2011 2:06:00', '7/07/2011 2:07:00', '7/07/2011 2:08:00', '7/07/2011 2:09:00', '7/07/2011 2:10:00', '7/07/2011 2:11:00', '7/07/2011 2:12:00', '7/07/2011 2:13:00', '7/07/2011 2:14:00', '7/07/2011 2:15:00', '7/07/2011 2:16:00', '7/07/2011 2:17:00', '7/07/2011 2:18:00', '7/07/2011 2:19:00', '7/07/2011 2:20:00', '7/07/2011 2:21:00', '7/07/2011 2:22:00', '7/07/2011 2:23:00', '7/07/2011 2:24:00', '7/07/2011 2:25:00', '7/07/2011 2:26:00', '7/07/2011 2:27:00', '7/07/2011 2:28:00', '7/07/2011 2:29:00', '7/07/2011 2:30:00', '7/07/2011 2:31:00', '7/07/2011 2:32:00', '7/07/2011 2:33:00', '7/07/2011 2:34:00', '7/07/2011 2:35:00', '7/07/2011 2:36:00', '7/07/2011 2:37:00', '7/07/2011 2:38:00', '7/07/2011 2:39:00', '7/07/2011 2:40:00', '7/07/2011 2:41:00', '7/07/2011 2:42:00', '7/07/2011 2:43:00', '7/07/2011 2:44:00', '7/07/2011 
2:45:00', '7/07/2011 2:46:00', '7/07/2011 2:47:00', '7/07/2011 2:48:00', '7/07/2011 2:49:00', '7/07/2011 2:50:00', '7/07/2011 2:51:00', '7/07/2011 2:52:00', '7/07/2011 2:53:00', '7/07/2011 2:54:00', '7/07/2011 2:55:00', '7/07/2011 2:56:00', '7/07/2011 2:57:00', '7/07/2011 2:58:00', '7/07/2011 2:59:00', '7/07/2011 3:00:00', '7/07/2011 3:01:00', '7/07/2011 3:02:00', '7/07/2011 3:03:00', '7/07/2011 3:04:00', '7/07/2011 3:05:00', '7/07/2011 3:06:00', '7/07/2011 3:07:00', '7/07/2011 3:08:00', '7/07/2011 3:09:00', '7/07/2011 3:10:00', '7/07/2011 3:11:00', '7/07/2011 3:12:00', '7/07/2011 3:13:00', '7/07/2011 3:14:00', '7/07/2011 3:15:00', '7/07/2011 3:16:00', '7/07/2011 3:17:00', '7/07/2011 3:18:00', '7/07/2011 3:19:00', '7/07/2011 3:20:00', '7/07/2011 3:21:00', '7/07/2011 3:22:00', '7/07/2011 3:23:00', '7/07/2011 3:24:00', '7/07/2011 3:25:00', '7/07/2011 3:26:00', '7/07/2011 3:27:00', '7/07/2011 3:28:00', '7/07/2011 3:29:00', '7/07/2011 3:30:00', '7/07/2011 3:31:00', '7/07/2011 3:32:00', '7/07/2011 3:33:00', '7/07/2011 3:34:00', '7/07/2011 3:35:00', '7/07/2011 3:36:00', '7/07/2011 3:37:00', '7/07/2011 3:38:00', '7/07/2011 3:39:00', '7/07/2011 3:40:00', '7/07/2011 3:41:00', '7/07/2011 3:42:00', '7/07/2011 3:43:00', '7/07/2011 3:44:00', '7/07/2011 3:45:00', '7/07/2011 3:46:00', '7/07/2011 3:47:00', '7/07/2011 3:48:00', '7/07/2011 3:49:00', '7/07/2011 3:50:00', '7/07/2011 3:51:00', '7/07/2011 3:52:00', '7/07/2011 3:53:00', '7/07/2011 3:54:00', '7/07/2011 3:55:00', '7/07/2011 3:56:00', '7/07/2011 3:57:00', '7/07/2011 3:58:00', '7/07/2011 3:59:00', '7/07/2011 4:00:00', '7/07/2011 4:01:00', '7/07/2011 4:02:00', '7/07/2011 4:03:00', '7/07/2011 4:04:00', '7/07/2011 4:05:00', '7/07/2011 4:06:00', '7/07/2011 4:07:00', '7/07/2011 4:08:00', '7/07/2011 4:09:00', '7/07/2011 4:10:00', '7/07/2011 4:11:00', '7/07/2011 4:12:00', '7/07/2011 4:13:00', '7/07/2011 4:14:00', '7/07/2011 4:15:00', '7/07/2011 4:16:00', '7/07/2011 4:17:00', '7/07/2011 4:18:00', '7/07/2011 4:19:00', '7/07/2011 
4:20:00', '7/07/2011 4:21:00', '7/07/2011 4:22:00', '7/07/2011 4:23:00', '7/07/2011 4:24:00', '7/07/2011 4:25:00', '7/07/2011 4:26:00', '7/07/2011 4:27:00', '7/07/2011 4:28:00', '7/07/2011 4:29:00', '7/07/2011 4:30:00', '7/07/2011 4:31:00', '7/07/2011 4:32:00', '7/07/2011 4:33:00', '7/07/2011 4:34:00', '7/07/2011 4:35:00', '7/07/2011 4:36:00', '7/07/2011 4:37:00', '7/07/2011 4:38:00', '7/07/2011 4:39:00', '7/07/2011 4:40:00', '7/07/2011 4:41:00', '7/07/2011 4:42:00', '7/07/2011 4:43:00', '7/07/2011 4:44:00', '7/07/2011 4:45:00', '7/07/2011 4:46:00', '7/07/2011 4:47:00', '7/07/2011 4:48:00', '7/07/2011 4:49:00', '7/07/2011 4:50:00', '7/07/2011 4:51:00', '7/07/2011 4:52:00', '7/07/2011 4:53:00', '7/07/2011 4:54:00', '7/07/2011 4:55:00', '7/07/2011 4:56:00', '7/07/2011 4:57:00', '7/07/2011 4:58:00', '7/07/2011 4:59:00', '7/07/2011 5:00:00', '7/07/2011 5:01:00', '7/07/2011 5:02:00', '7/07/2011 5:03:00', '7/07/2011 5:04:00', '7/07/2011 5:05:00', '7/07/2011 5:06:00', '7/07/2011 5:07:00', '7/07/2011 5:08:00', '7/07/2011 5:09:00', '7/07/2011 5:10:00', '7/07/2011 5:11:00', '7/07/2011 5:12:00', '7/07/2011 5:13:00', '7/07/2011 5:14:00', '7/07/2011 5:15:00', '7/07/2011 5:16:00', '7/07/2011 5:17:00', '7/07/2011 5:18:00', '7/07/2011 5:19:00', '7/07/2011 5:20:00', '7/07/2011 5:21:00', '7/07/2011 5:22:00', '7/07/2011 5:23:00', '7/07/2011 5:24:00', '7/07/2011 5:25:00', '7/07/2011 5:26:00', '7/07/2011 5:27:00', '7/07/2011 5:28:00', '7/07/2011 5:29:00', '7/07/2011 5:30:00', '7/07/2011 5:31:00', '7/07/2011 5:32:00', '7/07/2011 5:33:00', '7/07/2011 5:34:00', '7/07/2011 5:35:00', '7/07/2011 5:36:00', '7/07/2011 5:37:00', '7/07/2011 5:38:00', '7/07/2011 5:39:00', '7/07/2011 5:40:00', '7/07/2011 5:41:00', '7/07/2011 5:42:00', '7/07/2011 5:43:00', '7/07/2011 5:44:00', '7/07/2011 5:45:00', '7/07/2011 5:46:00', '7/07/2011 5:47:00', '7/07/2011 5:48:00', '7/07/2011 5:49:00', '7/07/2011 5:50:00', '7/07/2011 5:51:00', '7/07/2011 5:52:00', '7/07/2011 5:53:00', '7/07/2011 5:54:00', '7/07/2011 
5:55:00', '7/07/2011 5:56:00', '7/07/2011 5:57:00', '7/07/2011 5:58:00', '7/07/2011 5:59:00', '7/07/2011 6:00:00', '7/07/2011 6:01:00', '7/07/2011 6:02:00', '7/07/2011 6:03:00', '7/07/2011 6:04:00', '7/07/2011 6:05:00', '7/07/2011 6:06:00', '7/07/2011 6:07:00', '7/07/2011 6:08:00', '7/07/2011 6:09:00', '7/07/2011 6:10:00', '7/07/2011 6:11:00', '7/07/2011 6:12:00', '7/07/2011 6:13:00', '7/07/2011 6:14:00', '7/07/2011 6:15:00', '7/07/2011 6:16:00', '7/07/2011 6:17:00', '7/07/2011 6:18:00', '7/07/2011 6:19:00', '7/07/2011 6:20:00', '7/07/2011 6:21:00', '7/07/2011 6:22:00', '7/07/2011 6:23:00', '7/07/2011 6:24:00', '7/07/2011 6:25:00', '7/07/2011 6:26:00', '7/07/2011 6:27:00', '7/07/2011 6:28:00', '7/07/2011 6:29:00', '7/07/2011 6:30:00', '7/07/2011 6:31:00', '7/07/2011 6:32:00', '7/07/2011 6:33:00', '7/07/2011 6:34:00', '7/07/2011 6:35:00', '7/07/2011 6:36:00', '7/07/2011 6:37:00', '7/07/2011 6:38:00', '7/07/2011 6:39:00', '7/07/2011 6:40:00', '7/07/2011 6:41:00', '7/07/2011 6:42:00', '7/07/2011 6:43:00', '7/07/2011 6:44:00', '7/07/2011 6:45:00', '7/07/2011 6:46:00', '7/07/2011 6:47:00', '7/07/2011 6:48:00', '7/07/2011 6:49:00', '7/07/2011 6:50:00', '7/07/2011 6:51:00', '7/07/2011 6:52:00', '7/07/2011 6:53:00', '7/07/2011 6:54:00', '7/07/2011 6:55:00', '7/07/2011 6:56:00', '7/07/2011 6:57:00', '7/07/2011 6:58:00', '7/07/2011 6:59:00', '7/07/2011 7:00:00', '7/07/2011 7:01:00', '7/07/2011 7:02:00', '7/07/2011 7:03:00', '7/07/2011 7:04:00', '7/07/2011 7:05:00', '7/07/2011 7:06:00', '7/07/2011 7:07:00', '7/07/2011 7:08:00', '7/07/2011 7:09:00', '7/07/2011 7:10:00', '7/07/2011 7:11:00', '7/07/2011 7:12:00', '7/07/2011 7:13:00', '7/07/2011 7:14:00', '7/07/2011 7:15:00', '7/07/2011 7:16:00', '7/07/2011 7:17:00', '7/07/2011 7:18:00', '7/07/2011 7:19:00', '7/07/2011 7:20:00', '7/07/2011 7:21:00', '7/07/2011 7:22:00', '7/07/2011 7:23:00', '7/07/2011 7:24:00', '7/07/2011 7:25:00', '7/07/2011 7:26:00', '7/07/2011 7:27:00', '7/07/2011 7:28:00', '7/07/2011 7:29:00', '7/07/2011 
7:30:00', '7/07/2011 7:31:00', '7/07/2011 7:32:00', '7/07/2011 7:33:00', '7/07/2011 7:34:00', '7/07/2011 7:35:00', '7/07/2011 7:36:00', '7/07/2011 7:37:00', '7/07/2011 7:38:00', '7/07/2011 7:39:00', '7/07/2011 7:40:00', '7/07/2011 7:41:00', '7/07/2011 7:42:00', '7/07/2011 7:43:00', '7/07/2011 7:44:00', '7/07/2011 7:45:00', '7/07/2011 7:46:00', '7/07/2011 7:47:00', '7/07/2011 7:48:00', '7/07/2011 7:49:00', '7/07/2011 7:50:00', '7/07/2011 7:51:00', '7/07/2011 7:52:00', '7/07/2011 7:53:00', '7/07/2011 7:54:00', '7/07/2011 7:55:00', '7/07/2011 7:56:00', '7/07/2011 7:57:00', '7/07/2011 7:58:00', '7/07/2011 7:59:00', '7/07/2011 8:00:00', '7/07/2011 8:01:00', '7/07/2011 8:02:00', '7/07/2011 8:03:00', '7/07/2011 8:04:00', '7/07/2011 8:05:00', '7/07/2011 8:06:00', '7/07/2011 8:07:00', '7/07/2011 8:08:00', '7/07/2011 8:09:00', '7/07/2011 8:10:00', '7/07/2011 8:11:00', '7/07/2011 8:12:00', '7/07/2011 8:13:00', '7/07/2011 8:14:00', '7/07/2011 8:15:00', '7/07/2011 8:16:00', '7/07/2011 8:17:00', '7/07/2011 8:18:00', '7/07/2011 8:19:00', '7/07/2011 8:20:00', '7/07/2011 8:21:00', '7/07/2011 8:22:00', '7/07/2011 8:23:00', '7/07/2011 8:24:00', '7/07/2011 8:25:00', '7/07/2011 8:26:00', '7/07/2011 8:27:00', '7/07/2011 8:28:00', '7/07/2011 8:29:00', '7/07/2011 8:30:00', '7/07/2011 8:31:00', '7/07/2011 8:32:00', '7/07/2011 8:33:00', '7/07/2011 8:34:00', '7/07/2011 8:35:00', '7/07/2011 8:36:00', '7/07/2011 8:37:00', '7/07/2011 8:38:00', '7/07/2011 8:39:00', '7/07/2011 8:40:00', '7/07/2011 8:41:00', '7/07/2011 8:42:00', '7/07/2011 8:43:00', '7/07/2011 8:44:00', '7/07/2011 8:45:00', '7/07/2011 8:46:00', '7/07/2011 8:47:00', '7/07/2011 8:48:00', '7/07/2011 8:49:00', '7/07/2011 8:50:00', '7/07/2011 8:51:00', '7/07/2011 8:52:00', '7/07/2011 8:53:00', '7/07/2011 8:54:00', '7/07/2011 8:55:00', '7/07/2011 8:56:00', '7/07/2011 8:57:00', '7/07/2011 8:58:00', '7/07/2011 8:59:00', '7/07/2011 9:00:00', '7/07/2011 9:01:00', '7/07/2011 9:02:00', '7/07/2011 9:03:00', '7/07/2011 9:04:00', '7/07/2011 
9:05:00', '7/07/2011 9:06:00', '7/07/2011 9:07:00', '7/07/2011 9:08:00', '7/07/2011 9:09:00', '7/07/2011 9:10:00', '7/07/2011 9:11:00', '7/07/2011 9:12:00', '7/07/2011 9:13:00', '7/07/2011 9:14:00', '7/07/2011 9:15:00', '7/07/2011 9:16:00', '7/07/2011 9:17:00', '7/07/2011 9:18:00', '7/07/2011 9:19:00', '7/07/2011 9:20:00', '7/07/2011 9:21:00', '7/07/2011 9:22:00', '7/07/2011 9:23:00', '7/07/2011 9:24:00', '7/07/2011 9:25:00', '7/07/2011 9:26:00', '7/07/2011 9:27:00', '7/07/2011 9:28:00', '7/07/2011 9:29:00', '7/07/2011 9:30:00', '7/07/2011 9:31:00', '7/07/2011 9:32:00', '7/07/2011 9:33:00', '7/07/2011 9:34:00', '7/07/2011 9:35:00', '7/07/2011 9:36:00', '7/07/2011 9:37:00', '7/07/2011 9:38:00', '7/07/2011 9:39:00', '7/07/2011 9:40:00', '7/07/2011 9:41:00', '7/07/2011 9:42:00', '7/07/2011 9:43:00', '7/07/2011 9:44:00', '7/07/2011 9:45:00', '7/07/2011 9:46:00', '7/07/2011 9:47:00', '7/07/2011 9:48:00', '7/07/2011 9:49:00', '7/07/2011 9:50:00', '7/07/2011 9:51:00', '7/07/2011 9:52:00', '7/07/2011 9:53:00', '7/07/2011 9:54:00', '7/07/2011 9:55:00', '7/07/2011 9:56:00', '7/07/2011 9:57:00', '7/07/2011 9:58:00', '7/07/2011 9:59:00', '7/07/2011 10:00:00', '7/07/2011 10:01:00', '7/07/2011 10:02:00', '7/07/2011 10:03:00', '7/07/2011 10:04:00', '7/07/2011 10:05:00', '7/07/2011 10:06:00', '7/07/2011 10:07:00', '7/07/2011 10:08:00', '7/07/2011 10:09:00', '7/07/2011 10:10:00', '7/07/2011 10:11:00', '7/07/2011 10:12:00', '7/07/2011 10:13:00', '7/07/2011 10:14:00', '7/07/2011 10:15:00', '7/07/2011 10:16:00', '7/07/2011 10:17:00', '7/07/2011 10:18:00', '7/07/2011 10:19:00', '7/07/2011 10:20:00', '7/07/2011 10:21:00', '7/07/2011 10:22:00', '7/07/2011 10:23:00', '7/07/2011 10:24:00', '7/07/2011 10:25:00', '7/07/2011 10:26:00', '7/07/2011 10:27:00', '7/07/2011 10:28:00', '7/07/2011 10:29:00', '7/07/2011 10:30:00', '7/07/2011 10:31:00', '7/07/2011 10:32:00', '7/07/2011 10:33:00', '7/07/2011 10:34:00', '7/07/2011 10:35:00', '7/07/2011 10:36:00', '7/07/2011 10:37:00', '7/07/2011 
10:38:00', '7/07/2011 10:39:00', '7/07/2011 10:40:00', '7/07/2011 10:41:00', '7/07/2011 10:42:00', '7/07/2011 10:43:00', '7/07/2011 10:44:00', '7/07/2011 10:45:00', '7/07/2011 10:46:00', '7/07/2011 10:47:00', '7/07/2011 10:48:00', '7/07/2011 10:49:00', '7/07/2011 10:50:00', '7/07/2011 10:51:00', '7/07/2011 10:52:00', '7/07/2011 10:53:00', '7/07/2011 10:54:00', '7/07/2011 10:55:00', '7/07/2011 10:56:00', '7/07/2011 10:57:00', '7/07/2011 10:58:00', '7/07/2011 10:59:00', '7/07/2011 11:00:00', '7/07/2011 11:01:00', '7/07/2011 11:02:00', '7/07/2011 11:03:00', '7/07/2011 11:04:00', '7/07/2011 11:05:00', '7/07/2011 11:06:00', '7/07/2011 11:07:00', '7/07/2011 11:08:00', '7/07/2011 11:09:00', '7/07/2011 11:10:00', '7/07/2011 11:11:00', '7/07/2011 11:12:00', '7/07/2011 11:13:00', '7/07/2011 11:14:00', '7/07/2011 11:15:00', '7/07/2011 11:16:00', '7/07/2011 11:17:00', '7/07/2011 11:18:00', '7/07/2011 11:19:00', '7/07/2011 11:20:00', '7/07/2011 11:21:00', '7/07/2011 11:22:00', '7/07/2011 11:23:00', '7/07/2011 11:24:00', '7/07/2011 11:25:00', '7/07/2011 11:26:00', '7/07/2011 11:27:00', '7/07/2011 11:28:00', '7/07/2011 11:29:00', '7/07/2011 11:30:00', '7/07/2011 11:31:00', '7/07/2011 11:32:00', '7/07/2011 11:33:00', '7/07/2011 11:34:00', '7/07/2011 11:35:00', '7/07/2011 11:36:00', '7/07/2011 11:37:00', '7/07/2011 11:38:00', '7/07/2011 11:39:00', '7/07/2011 11:40:00', '7/07/2011 11:41:00', '7/07/2011 11:42:00', '7/07/2011 11:43:00', '7/07/2011 11:44:00', '7/07/2011 11:45:00', '7/07/2011 11:46:00', '7/07/2011 11:47:00', '7/07/2011 11:48:00', '7/07/2011 11:49:00', '7/07/2011 11:50:00', '7/07/2011 11:51:00', '7/07/2011 11:52:00', '7/07/2011 11:53:00', '7/07/2011 11:54:00', '7/07/2011 11:55:00', '7/07/2011 11:56:00', '7/07/2011 11:57:00', '7/07/2011 11:58:00', '7/07/2011 11:59:00', '7/07/2011 12:00:00', '7/07/2011 12:01:00', '7/07/2011 12:02:00', '7/07/2011 12:03:00', '7/07/2011 12:04:00', '7/07/2011 12:05:00', '7/07/2011 12:06:00', '7/07/2011 12:07:00', '7/07/2011 12:08:00', 
'7/07/2011 12:09:00', '7/07/2011 12:10:00', '7/07/2011 12:11:00', '7/07/2011 12:12:00', '7/07/2011 12:13:00', '7/07/2011 12:14:00', '7/07/2011 12:15:00', '7/07/2011 12:16:00', '7/07/2011 12:17:00', '7/07/2011 12:18:00', '7/07/2011 12:19:00', '7/07/2011 12:20:00', '7/07/2011 12:21:00', '7/07/2011 12:22:00', '7/07/2011 12:23:00', '7/07/2011 12:24:00', '7/07/2011 12:25:00', '7/07/2011 12:26:00', '7/07/2011 12:27:00', '7/07/2011 12:28:00', '7/07/2011 12:29:00', '7/07/2011 12:30:00', '7/07/2011 12:31:00', '7/07/2011 12:32:00', '7/07/2011 12:33:00', '7/07/2011 12:34:00', '7/07/2011 12:35:00', '7/07/2011 12:36:00', '7/07/2011 12:37:00', '7/07/2011 12:38:00', '7/07/2011 12:39:00', '7/07/2011 12:40:00', '7/07/2011 12:41:00', '7/07/2011 12:42:00', '7/07/2011 12:43:00', '7/07/2011 12:44:00', '7/07/2011 12:45:00', '7/07/2011 12:46:00', '7/07/2011 12:47:00', '7/07/2011 12:48:00', '7/07/2011 12:49:00', '7/07/2011 12:50:00', '7/07/2011 12:51:00', '7/07/2011 12:52:00', '7/07/2011 12:53:00', '7/07/2011 12:54:00', '7/07/2011 12:55:00', '7/07/2011 12:56:00', '7/07/2011 12:57:00', '7/07/2011 12:58:00', '7/07/2011 12:59:00', '7/07/2011 13:00:00', '7/07/2011 13:01:00', '7/07/2011 13:02:00', '7/07/2011 13:03:00', '7/07/2011 13:04:00', '7/07/2011 13:05:00', '7/07/2011 13:06:00', '7/07/2011 13:07:00', '7/07/2011 13:08:00', '7/07/2011 13:09:00', '7/07/2011 13:10:00', '7/07/2011 13:11:00', '7/07/2011 13:12:00', '7/07/2011 13:13:00', '7/07/2011 13:14:00', '7/07/2011 13:15:00', '7/07/2011 13:16:00', '7/07/2011 13:17:00', '7/07/2011 13:18:00', '7/07/2011 13:19:00', '7/07/2011 13:20:00', '7/07/2011 13:21:00', '7/07/2011 13:22:00', '7/07/2011 13:23:00', '7/07/2011 13:24:00', '7/07/2011 13:25:00', '7/07/2011 13:26:00', '7/07/2011 13:27:00', '7/07/2011 13:28:00', '7/07/2011 13:29:00', '7/07/2011 13:30:00', '7/07/2011 13:31:00', '7/07/2011 13:32:00', '7/07/2011 13:33:00', '7/07/2011 13:34:00', '7/07/2011 13:35:00', '7/07/2011 13:36:00', '7/07/2011 13:37:00', '7/07/2011 13:38:00', '7/07/2011 
13:39:00', '7/07/2011 13:40:00', '7/07/2011 13:41:00', '7/07/2011 13:42:00', '7/07/2011 13:43:00', '7/07/2011 13:44:00', '7/07/2011 13:45:00', '7/07/2011 13:46:00', '7/07/2011 13:47:00', '7/07/2011 13:48:00', '7/07/2011 13:49:00', '7/07/2011 13:50:00', '7/07/2011 13:51:00', '7/07/2011 13:52:00', '7/07/2011 13:53:00', '7/07/2011 13:54:00', '7/07/2011 13:55:00', '7/07/2011 13:56:00', '7/07/2011 13:57:00', '7/07/2011 13:58:00', '7/07/2011 13:59:00', '7/07/2011 14:00:00', '7/07/2011 14:01:00', '7/07/2011 14:02:00', '7/07/2011 14:03:00', '7/07/2011 14:04:00', '7/07/2011 14:05:00', '7/07/2011 14:06:00', '7/07/2011 14:07:00', '7/07/2011 14:08:00', '7/07/2011 14:09:00', '7/07/2011 14:10:00', '7/07/2011 14:11:00', '7/07/2011 14:12:00', '7/07/2011 14:13:00', '7/07/2011 14:14:00', '7/07/2011 14:15:00', '7/07/2011 14:16:00', '7/07/2011 14:17:00', '7/07/2011 14:18:00', '7/07/2011 14:19:00', '7/07/2011 14:20:00', '7/07/2011 14:21:00', '7/07/2011 14:22:00', '7/07/2011 14:23:00', '7/07/2011 14:24:00', '7/07/2011 14:25:00', '7/07/2011 14:26:00', '7/07/2011 14:27:00', '7/07/2011 14:28:00', '7/07/2011 14:29:00', '7/07/2011 14:30:00', '7/07/2011 14:31:00', '7/07/2011 14:32:00', '7/07/2011 14:33:00', '7/07/2011 14:34:00', '7/07/2011 14:35:00', '7/07/2011 14:36:00', '7/07/2011 14:37:00', '7/07/2011 14:38:00', '7/07/2011 14:39:00', '7/07/2011 14:40:00', '7/07/2011 14:41:00', '7/07/2011 14:42:00', '7/07/2011 14:43:00', '7/07/2011 14:44:00', '7/07/2011 14:45:00', '7/07/2011 14:46:00', '7/07/2011 14:47:00', '7/07/2011 14:48:00', '7/07/2011 14:49:00', '7/07/2011 14:50:00', '7/07/2011 14:51:00', '7/07/2011 14:52:00', '7/07/2011 14:53:00', '7/07/2011 14:54:00', '7/07/2011 14:55:00', '7/07/2011 14:56:00', '7/07/2011 14:57:00', '7/07/2011 14:58:00', '7/07/2011 14:59:00', '7/07/2011 15:00:00', '7/07/2011 15:01:00', '7/07/2011 15:02:00', '7/07/2011 15:03:00', '7/07/2011 15:04:00', '7/07/2011 15:05:00', '7/07/2011 15:06:00', '7/07/2011 15:07:00', '7/07/2011 15:08:00', '7/07/2011 15:09:00', 
'7/07/2011 15:10:00', '7/07/2011 15:11:00', '7/07/2011 15:12:00', '7/07/2011 15:13:00', '7/07/2011 15:14:00', '7/07/2011 15:15:00', '7/07/2011 15:16:00', '7/07/2011 15:17:00', '7/07/2011 15:18:00', '7/07/2011 15:19:00', '7/07/2011 15:20:00', '7/07/2011 15:21:00', '7/07/2011 15:22:00', '7/07/2011 15:23:00', '7/07/2011 15:24:00', '7/07/2011 15:25:00', '7/07/2011 15:26:00', '7/07/2011 15:27:00', '7/07/2011 15:28:00', '7/07/2011 15:29:00', '7/07/2011 15:30:00', '7/07/2011 15:31:00', '7/07/2011 15:32:00', '7/07/2011 15:33:00', '7/07/2011 15:34:00', '7/07/2011 15:35:00', '7/07/2011 15:36:00', '7/07/2011 15:37:00', '7/07/2011 15:38:00', '7/07/2011 15:39:00', '7/07/2011 15:40:00', '7/07/2011 15:41:00', '7/07/2011 15:42:00', '7/07/2011 15:43:00', '7/07/2011 15:44:00', '7/07/2011 15:45:00', '7/07/2011 15:46:00', '7/07/2011 15:47:00', '7/07/2011 15:48:00', '7/07/2011 15:49:00', '7/07/2011 15:50:00', '7/07/2011 15:51:00', '7/07/2011 15:52:00', '7/07/2011 15:53:00', '7/07/2011 15:54:00', '7/07/2011 15:55:00', '7/07/2011 15:56:00', '7/07/2011 15:57:00', '7/07/2011 15:58:00', '7/07/2011 15:59:00', '7/07/2011 16:00:00', '7/07/2011 16:01:00', '7/07/2011 16:02:00', '7/07/2011 16:03:00', '7/07/2011 16:04:00', '7/07/2011 16:05:00', '7/07/2011 16:06:00', '7/07/2011 16:07:00', '7/07/2011 16:08:00', '7/07/2011 16:09:00', '7/07/2011 16:10:00', '7/07/2011 16:11:00', '7/07/2011 16:12:00', '7/07/2011 16:13:00', '7/07/2011 16:14:00', '7/07/2011 16:15:00', '7/07/2011 16:16:00', '7/07/2011 16:17:00', '7/07/2011 16:18:00', '7/07/2011 16:19:00', '7/07/2011 16:20:00', '7/07/2011 16:21:00', '7/07/2011 16:22:00', '7/07/2011 16:23:00', '7/07/2011 16:24:00', '7/07/2011 16:25:00', '7/07/2011 16:26:00', '7/07/2011 16:27:00', '7/07/2011 16:28:00', '7/07/2011 16:29:00', '7/07/2011 16:30:00', '7/07/2011 16:31:00', '7/07/2011 16:32:00', '7/07/2011 16:33:00', '7/07/2011 16:34:00', '7/07/2011 16:35:00', '7/07/2011 16:36:00', '7/07/2011 16:37:00', '7/07/2011 16:38:00', '7/07/2011 16:39:00', '7/07/2011 
16:40:00', '7/07/2011 16:41:00', '7/07/2011 16:42:00', '7/07/2011 16:43:00', '7/07/2011 16:44:00', '7/07/2011 16:45:00', '7/07/2011 16:46:00', '7/07/2011 16:47:00', '7/07/2011 16:48:00', '7/07/2011 16:49:00', '7/07/2011 16:50:00', '7/07/2011 16:51:00', '7/07/2011 16:52:00', '7/07/2011 16:53:00', '7/07/2011 16:54:00', '7/07/2011 16:55:00', '7/07/2011 16:56:00', '7/07/2011 16:57:00', '7/07/2011 16:58:00', '7/07/2011 16:59:00', '7/07/2011 17:00:00', '7/07/2011 17:01:00', '7/07/2011 17:02:00', '7/07/2011 17:03:00', '7/07/2011 17:04:00', '7/07/2011 17:05:00', '7/07/2011 17:06:00', '7/07/2011 17:07:00', '7/07/2011 17:08:00', '7/07/2011 17:09:00', '7/07/2011 17:10:00', '7/07/2011 17:11:00', '7/07/2011 17:12:00', '7/07/2011 17:13:00', '7/07/2011 17:14:00', '7/07/2011 17:15:00', '7/07/2011 17:16:00', '7/07/2011 17:17:00', '7/07/2011 17:18:00', '7/07/2011 17:19:00', '7/07/2011 17:20:00', '7/07/2011 17:21:00', '7/07/2011 17:22:00', '7/07/2011 17:23:00', '7/07/2011 17:24:00', '7/07/2011 17:25:00', '7/07/2011 17:26:00', '7/07/2011 17:27:00', '7/07/2011 17:28:00', '7/07/2011 17:29:00', '7/07/2011 17:30:00', '7/07/2011 17:31:00', '7/07/2011 17:32:00', '7/07/2011 17:33:00', '7/07/2011 17:34:00', '7/07/2011 17:35:00', '7/07/2011 17:36:00', '7/07/2011 17:37:00', '7/07/2011 17:38:00', '7/07/2011 17:39:00', '7/07/2011 17:40:00', '7/07/2011 17:41:00', '7/07/2011 17:42:00', '7/07/2011 17:43:00', '7/07/2011 17:44:00', '7/07/2011 17:45:00', '7/07/2011 17:46:00', '7/07/2011 17:47:00', '7/07/2011 17:48:00', '7/07/2011 17:49:00', '7/07/2011 17:50:00', '7/07/2011 17:51:00', '7/07/2011 17:52:00', '7/07/2011 17:53:00', '7/07/2011 17:54:00', '7/07/2011 17:55:00', '7/07/2011 17:56:00', '7/07/2011 17:57:00', '7/07/2011 17:58:00', '7/07/2011 17:59:00', '7/07/2011 18:00:00', '7/07/2011 18:01:00', '7/07/2011 18:02:00', '7/07/2011 18:03:00', '7/07/2011 18:04:00', '7/07/2011 18:05:00', '7/07/2011 18:06:00', '7/07/2011 18:07:00', '7/07/2011 18:08:00', '7/07/2011 18:09:00', '7/07/2011 18:10:00', 
'7/07/2011 18:11:00', '7/07/2011 18:12:00', '7/07/2011 18:13:00', '7/07/2011 18:14:00', '7/07/2011 18:15:00', '7/07/2011 18:16:00', '7/07/2011 18:17:00', '7/07/2011 18:18:00', '7/07/2011 18:19:00', '7/07/2011 18:20:00', '7/07/2011 18:21:00', '7/07/2011 18:22:00', '7/07/2011 18:23:00', '7/07/2011 18:24:00', '7/07/2011 18:25:00', '7/07/2011 18:26:00', '7/07/2011 18:27:00', '7/07/2011 18:28:00', '7/07/2011 18:29:00', '7/07/2011 18:30:00', '7/07/2011 18:31:00', '7/07/2011 18:32:00', '7/07/2011 18:33:00', '7/07/2011 18:34:00', '7/07/2011 18:35:00', '7/07/2011 18:36:00', '7/07/2011 18:37:00', '7/07/2011 18:38:00', '7/07/2011 18:39:00', '7/07/2011 18:40:00', '7/07/2011 18:41:00', '7/07/2011 18:42:00', '7/07/2011 18:43:00', '7/07/2011 18:44:00', '7/07/2011 18:45:00', '7/07/2011 18:46:00', '7/07/2011 18:47:00', '7/07/2011 18:48:00', '7/07/2011 18:49:00', '7/07/2011 18:50:00', '7/07/2011 18:51:00', '7/07/2011 18:52:00', '7/07/2011 18:53:00', '7/07/2011 18:54:00', '7/07/2011 18:55:00', '7/07/2011 18:56:00', '7/07/2011 18:57:00', '7/07/2011 18:58:00', '7/07/2011 18:59:00', '7/07/2011 19:00:00', '7/07/2011 19:01:00', '7/07/2011 19:02:00', '7/07/2011 19:03:00', '7/07/2011 19:04:00', '7/07/2011 19:05:00', '7/07/2011 19:06:00', '7/07/2011 19:07:00', '7/07/2011 19:08:00', '7/07/2011 19:09:00', '7/07/2011 19:10:00', '7/07/2011 19:11:00', '7/07/2011 19:12:00', '7/07/2011 19:13:00', '7/07/2011 19:14:00', '7/07/2011 19:15:00', '7/07/2011 19:16:00', '7/07/2011 19:17:00', '7/07/2011 19:18:00', '7/07/2011 19:19:00', '7/07/2011 19:20:00', '7/07/2011 19:21:00', '7/07/2011 19:22:00', '7/07/2011 19:23:00', '7/07/2011 19:24:00', '7/07/2011 19:25:00', '7/07/2011 19:26:00', '7/07/2011 19:27:00', '7/07/2011 19:28:00', '7/07/2011 19:29:00', '7/07/2011 19:30:00', '7/07/2011 19:31:00', '7/07/2011 19:32:00', '7/07/2011 19:33:00', '7/07/2011 19:34:00', '7/07/2011 19:35:00', '7/07/2011 19:36:00', '7/07/2011 19:37:00', '7/07/2011 19:38:00', '7/07/2011 19:39:00', '7/07/2011 19:40:00', '7/07/2011 
19:41:00', '7/07/2011 19:42:00', '7/07/2011 19:43:00', '7/07/2011 19:44:00', '7/07/2011 19:45:00', '7/07/2011 19:46:00', '7/07/2011 19:47:00', '7/07/2011 19:48:00', '7/07/2011 19:49:00', '7/07/2011 19:50:00', '7/07/2011 19:51:00', '7/07/2011 19:52:00', '7/07/2011 19:53:00', '7/07/2011 19:54:00', '7/07/2011 19:55:00', '7/07/2011 19:56:00', '7/07/2011 19:57:00', '7/07/2011 19:58:00', '7/07/2011 19:59:00', '7/07/2011 20:00:00', '7/07/2011 20:01:00', '7/07/2011 20:02:00', '7/07/2011 20:03:00', '7/07/2011 20:04:00', '7/07/2011 20:05:00', '7/07/2011 20:06:00', '7/07/2011 20:07:00', '7/07/2011 20:08:00', '7/07/2011 20:09:00', '7/07/2011 20:10:00', '7/07/2011 20:11:00', '7/07/2011 20:12:00', '7/07/2011 20:13:00', '7/07/2011 20:14:00', '7/07/2011 20:15:00', '7/07/2011 20:16:00', '7/07/2011 20:17:00', '7/07/2011 20:18:00', '7/07/2011 20:19:00', '7/07/2011 20:20:00', '7/07/2011 20:21:00', '7/07/2011 20:22:00', '7/07/2011 20:23:00', '7/07/2011 20:24:00', '7/07/2011 20:25:00', '7/07/2011 20:26:00', '7/07/2011 20:27:00', '7/07/2011 20:28:00', '7/07/2011 20:29:00', '7/07/2011 20:30:00', '7/07/2011 20:31:00', '7/07/2011 20:32:00', '7/07/2011 20:33:00', '7/07/2011 20:34:00', '7/07/2011 20:35:00', '7/07/2011 20:36:00', '7/07/2011 20:37:00', '7/07/2011 20:38:00', '7/07/2011 20:39:00', '7/07/2011 20:40:00', '7/07/2011 20:41:00', '7/07/2011 20:42:00', '7/07/2011 20:43:00', '7/07/2011 20:44:00', '7/07/2011 20:45:00', '7/07/2011 20:46:00', '7/07/2011 20:47:00', '7/07/2011 20:48:00', '7/07/2011 20:49:00', '7/07/2011 20:50:00', '7/07/2011 20:51:00', '7/07/2011 20:52:00', '7/07/2011 20:53:00', '7/07/2011 20:54:00', '7/07/2011 20:55:00', '7/07/2011 20:56:00', '7/07/2011 20:57:00', '7/07/2011 20:58:00', '7/07/2011 20:59:00', '7/07/2011 21:00:00', '7/07/2011 21:01:00', '7/07/2011 21:02:00', '7/07/2011 21:03:00', '7/07/2011 21:04:00', '7/07/2011 21:05:00', '7/07/2011 21:06:00', '7/07/2011 21:07:00', '7/07/2011 21:08:00', '7/07/2011 21:09:00', '7/07/2011 21:10:00', '7/07/2011 21:11:00', 
'7/07/2011 21:12:00', '7/07/2011 21:13:00', '7/07/2011 21:14:00', '7/07/2011 21:15:00', '7/07/2011 21:16:00', '7/07/2011 21:17:00', '7/07/2011 21:18:00', '7/07/2011 21:19:00', '7/07/2011 21:20:00', '7/07/2011 21:21:00', '7/07/2011 21:22:00', '7/07/2011 21:23:00', '7/07/2011 21:24:00', '7/07/2011 21:25:00', '7/07/2011 21:26:00', '7/07/2011 21:27:00', '7/07/2011 21:28:00', '7/07/2011 21:29:00', '7/07/2011 21:30:00', '7/07/2011 21:31:00', '7/07/2011 21:32:00', '7/07/2011 21:33:00', '7/07/2011 21:34:00', '7/07/2011 21:35:00', '7/07/2011 21:36:00', '7/07/2011 21:37:00', '7/07/2011 21:38:00', '7/07/2011 21:39:00', '7/07/2011 21:40:00', '7/07/2011 21:41:00', '7/07/2011 21:42:00', '7/07/2011 21:43:00', '7/07/2011 21:44:00', '7/07/2011 21:45:00', '7/07/2011 21:46:00', '7/07/2011 21:47:00', '7/07/2011 21:48:00', '7/07/2011 21:49:00', '7/07/2011 21:50:00', '7/07/2011 21:51:00', '7/07/2011 21:52:00', '7/07/2011 21:53:00', '7/07/2011 21:54:00', '7/07/2011 21:55:00', '7/07/2011 21:56:00', '7/07/2011 21:57:00', '7/07/2011 21:58:00', '7/07/2011 21:59:00', '7/07/2011 22:00:00', '7/07/2011 22:01:00', '7/07/2011 22:02:00', '7/07/2011 22:03:00', '7/07/2011 22:04:00', '7/07/2011 22:05:00', '7/07/2011 22:06:00', '7/07/2011 22:07:00', '7/07/2011 22:08:00', '7/07/2011 22:09:00', '7/07/2011 22:10:00', '7/07/2011 22:11:00', '7/07/2011 22:12:00', '7/07/2011 22:13:00', '7/07/2011 22:14:00', '7/07/2011 22:15:00', '7/07/2011 22:16:00', '7/07/2011 22:17:00', '7/07/2011 22:18:00', '7/07/2011 22:19:00', '7/07/2011 22:20:00', '7/07/2011 22:21:00', '7/07/2011 22:22:00', '7/07/2011 22:23:00', '7/07/2011 22:24:00', '7/07/2011 22:25:00', '7/07/2011 22:26:00', '7/07/2011 22:27:00', '7/07/2011 22:28:00', '7/07/2011 22:29:00', '7/07/2011 22:30:00', '7/07/2011 22:31:00', '7/07/2011 22:32:00', '7/07/2011 22:33:00', '7/07/2011 22:34:00', '7/07/2011 22:35:00', '7/07/2011 22:36:00', '7/07/2011 22:37:00', '7/07/2011 22:38:00', '7/07/2011 22:39:00', '7/07/2011 22:40:00', '7/07/2011 22:41:00', '7/07/2011 
22:42:00', '7/07/2011 22:43:00', '7/07/2011 22:44:00', '7/07/2011 22:45:00', '7/07/2011 22:46:00', '7/07/2011 22:47:00', '7/07/2011 22:48:00', '7/07/2011 22:49:00', '7/07/2011 22:50:00', '7/07/2011 22:51:00', '7/07/2011 22:52:00', '7/07/2011 22:53:00', '7/07/2011 22:54:00', '7/07/2011 22:55:00', '7/07/2011 22:56:00', '7/07/2011 22:57:00', '7/07/2011 22:58:00', '7/07/2011 22:59:00', '7/07/2011 23:00:00', '7/07/2011 23:01:00', '7/07/2011 23:02:00', '7/07/2011 23:03:00', '7/07/2011 23:04:00', '7/07/2011 23:05:00', '7/07/2011 23:06:00', '7/07/2011 23:07:00', '7/07/2011 23:08:00', '7/07/2011 23:09:00', '7/07/2011 23:10:00', '7/07/2011 23:11:00', '7/07/2011 23:12:00', '7/07/2011 23:13:00', '7/07/2011 23:14:00', '7/07/2011 23:15:00', '7/07/2011 23:16:00', '7/07/2011 23:17:00', '7/07/2011 23:18:00', '7/07/2011 23:19:00', '7/07/2011 23:20:00', '7/07/2011 23:21:00', '7/07/2011 23:22:00', '7/07/2011 23:23:00', '7/07/2011 23:24:00', '7/07/2011 23:25:00', '7/07/2011 23:26:00', '7/07/2011 23:27:00', '7/07/2011 23:28:00', '7/07/2011 23:29:00', '7/07/2011 23:30:00', '7/07/2011 23:31:00', '7/07/2011 23:32:00', '7/07/2011 23:33:00', '7/07/2011 23:34:00', '7/07/2011 23:35:00', '7/07/2011 23:36:00', '7/07/2011 23:37:00', '7/07/2011 23:38:00', '7/07/2011 23:39:00', '7/07/2011 23:40:00', '7/07/2011 23:41:00', '7/07/2011 23:42:00', '7/07/2011 23:43:00', '7/07/2011 23:44:00', '7/07/2011 23:45:00', '7/07/2011 23:46:00', '7/07/2011 23:47:00', '7/07/2011 23:48:00', '7/07/2011 23:49:00', '7/07/2011 23:50:00', '7/07/2011 23:51:00', '7/07/2011 23:52:00', '7/07/2011 23:53:00', '7/07/2011 23:54:00', '7/07/2011 23:55:00', '7/07/2011 23:56:00', '7/07/2011 23:57:00', '7/07/2011 23:58:00', '7/07/2011 23:59:00', '8/07/2011 0:00:00', '8/07/2011 0:01:00', '8/07/2011 0:02:00', '8/07/2011 0:03:00', '8/07/2011 0:04:00', '8/07/2011 0:05:00', '8/07/2011 0:06:00', '8/07/2011 0:07:00', '8/07/2011 0:08:00', '8/07/2011 0:09:00', '8/07/2011 0:10:00', '8/07/2011 0:11:00', '8/07/2011 0:12:00', '8/07/2011 0:13:00', 
'8/07/2011 0:14:00', '8/07/2011 0:15:00', '8/07/2011 0:16:00', '8/07/2011 0:17:00', '8/07/2011 0:18:00', '8/07/2011 0:19:00', '8/07/2011 0:20:00', '8/07/2011 0:21:00', '8/07/2011 0:22:00', '8/07/2011 0:23:00', '8/07/2011 0:24:00', '8/07/2011 0:25:00', '8/07/2011 0:26:00', '8/07/2011 0:27:00', '8/07/2011 0:28:00', '8/07/2011 0:29:00', '8/07/2011 0:30:00', '8/07/2011 0:31:00', '8/07/2011 0:32:00', '8/07/2011 0:33:00', '8/07/2011 0:34:00', '8/07/2011 0:35:00', '8/07/2011 0:36:00', '8/07/2011 0:37:00', '8/07/2011 0:38:00', '8/07/2011 0:39:00', '8/07/2011 0:40:00', '8/07/2011 0:41:00', '8/07/2011 0:42:00', '8/07/2011 0:43:00', '8/07/2011 0:44:00', '8/07/2011 0:45:00', '8/07/2011 0:46:00', '8/07/2011 0:47:00', '8/07/2011 0:48:00', '8/07/2011 0:49:00', '8/07/2011 0:50:00', '8/07/2011 0:51:00', '8/07/2011 0:52:00', '8/07/2011 0:53:00', '8/07/2011 0:54:00', '8/07/2011 0:55:00', '8/07/2011 0:56:00', '8/07/2011 0:57:00', '8/07/2011 0:58:00', '8/07/2011 0:59:00', '8/07/2011 1:00:00', '8/07/2011 1:01:00', '8/07/2011 1:02:00', '8/07/2011 1:03:00', '8/07/2011 1:04:00', '8/07/2011 1:05:00', '8/07/2011 1:06:00', '8/07/2011 1:07:00', '8/07/2011 1:08:00', '8/07/2011 1:09:00', '8/07/2011 1:10:00', '8/07/2011 1:11:00', '8/07/2011 1:12:00', '8/07/2011 1:13:00', '8/07/2011 1:14:00', '8/07/2011 1:15:00', '8/07/2011 1:16:00', '8/07/2011 1:17:00', '8/07/2011 1:18:00', '8/07/2011 1:19:00', '8/07/2011 1:20:00', '8/07/2011 1:21:00', '8/07/2011 1:22:00', '8/07/2011 1:23:00', '8/07/2011 1:24:00', '8/07/2011 1:25:00', '8/07/2011 1:26:00', '8/07/2011 1:27:00', '8/07/2011 1:28:00', '8/07/2011 1:29:00', '8/07/2011 1:30:00', '8/07/2011 1:31:00', '8/07/2011 1:32:00', '8/07/2011 1:33:00', '8/07/2011 1:34:00', '8/07/2011 1:35:00', '8/07/2011 1:36:00', '8/07/2011 1:37:00', '8/07/2011 1:38:00', '8/07/2011 1:39:00', '8/07/2011 1:40:00', '8/07/2011 1:41:00', '8/07/2011 1:42:00', '8/07/2011 1:43:00', '8/07/2011 1:44:00', '8/07/2011 1:45:00', '8/07/2011 1:46:00', '8/07/2011 1:47:00', '8/07/2011 1:48:00', 
'8/07/2011 1:49:00', '8/07/2011 1:50:00', '8/07/2011 1:51:00', '8/07/2011 1:52:00', '8/07/2011 1:53:00', '8/07/2011 1:54:00', '8/07/2011 1:55:00', '8/07/2011 1:56:00', '8/07/2011 1:57:00', '8/07/2011 1:58:00', '8/07/2011 1:59:00', '8/07/2011 2:00:00', '8/07/2011 2:01:00', '8/07/2011 2:02:00', '8/07/2011 2:03:00', '8/07/2011 2:04:00', '8/07/2011 2:05:00', '8/07/2011 2:06:00', '8/07/2011 2:07:00', '8/07/2011 2:08:00', '8/07/2011 2:09:00', '8/07/2011 2:10:00', '8/07/2011 2:11:00', '8/07/2011 2:12:00', '8/07/2011 2:13:00', '8/07/2011 2:14:00', '8/07/2011 2:15:00', '8/07/2011 2:16:00', '8/07/2011 2:17:00', '8/07/2011 2:18:00', '8/07/2011 2:19:00', '8/07/2011 2:20:00', '8/07/2011 2:21:00', '8/07/2011 2:22:00', '8/07/2011 2:23:00', '8/07/2011 2:24:00', '8/07/2011 2:25:00', '8/07/2011 2:26:00', '8/07/2011 2:27:00', '8/07/2011 2:28:00', '8/07/2011 2:29:00', '8/07/2011 2:30:00', '8/07/2011 2:31:00', '8/07/2011 2:32:00', '8/07/2011 2:33:00', '8/07/2011 2:34:00', '8/07/2011 2:35:00', '8/07/2011 2:36:00', '8/07/2011 2:37:00', '8/07/2011 2:38:00', '8/07/2011 2:39:00', '8/07/2011 2:40:00', '8/07/2011 2:41:00', '8/07/2011 2:42:00', '8/07/2011 2:43:00', '8/07/2011 2:44:00', '8/07/2011 2:45:00', '8/07/2011 2:46:00', '8/07/2011 2:47:00', '8/07/2011 2:48:00', '8/07/2011 2:49:00', '8/07/2011 2:50:00', '8/07/2011 2:51:00', '8/07/2011 2:52:00', '8/07/2011 2:53:00', '8/07/2011 2:54:00', '8/07/2011 2:55:00', '8/07/2011 2:56:00', '8/07/2011 2:57:00', '8/07/2011 2:58:00', '8/07/2011 2:59:00', '8/07/2011 3:00:00', '8/07/2011 3:01:00', '8/07/2011 3:02:00', '8/07/2011 3:03:00', '8/07/2011 3:04:00', '8/07/2011 3:05:00', '8/07/2011 3:06:00', '8/07/2011 3:07:00', '8/07/2011 3:08:00', '8/07/2011 3:09:00', '8/07/2011 3:10:00', '8/07/2011 3:11:00', '8/07/2011 3:12:00', '8/07/2011 3:13:00', '8/07/2011 3:14:00', '8/07/2011 3:15:00', '8/07/2011 3:16:00', '8/07/2011 3:17:00', '8/07/2011 3:18:00', '8/07/2011 3:19:00', '8/07/2011 3:20:00', '8/07/2011 3:21:00', '8/07/2011 3:22:00', '8/07/2011 3:23:00', 
'8/07/2011 3:24:00', '8/07/2011 3:25:00', '8/07/2011 3:26:00', '8/07/2011 3:27:00', '8/07/2011 3:28:00', '8/07/2011 3:29:00', '8/07/2011 3:30:00', '8/07/2011 3:31:00', '8/07/2011 3:32:00', '8/07/2011 3:33:00', '8/07/2011 3:34:00', '8/07/2011 3:35:00', '8/07/2011 3:36:00', '8/07/2011 3:37:00', '8/07/2011 3:38:00', '8/07/2011 3:39:00', '8/07/2011 3:40:00', '8/07/2011 3:41:00', '8/07/2011 3:42:00', '8/07/2011 3:43:00', '8/07/2011 3:44:00', '8/07/2011 3:45:00', '8/07/2011 3:46:00', '8/07/2011 3:47:00', '8/07/2011 3:48:00', '8/07/2011 3:49:00', '8/07/2011 3:50:00', '8/07/2011 3:51:00', '8/07/2011 3:52:00', '8/07/2011 3:53:00', '8/07/2011 3:54:00', '8/07/2011 3:55:00', '8/07/2011 3:56:00', '8/07/2011 3:57:00', '8/07/2011 3:58:00', '8/07/2011 3:59:00', '8/07/2011 4:00:00', '8/07/2011 4:01:00', '8/07/2011 4:02:00', '8/07/2011 4:03:00', '8/07/2011 4:04:00', '8/07/2011 4:05:00', '8/07/2011 4:06:00', '8/07/2011 4:07:00', '8/07/2011 4:08:00', '8/07/2011 4:09:00', '8/07/2011 4:10:00', '8/07/2011 4:11:00', '8/07/2011 4:12:00', '8/07/2011 4:13:00', '8/07/2011 4:14:00', '8/07/2011 4:15:00', '8/07/2011 4:16:00', '8/07/2011 4:17:00', '8/07/2011 4:18:00', '8/07/2011 4:19:00', '8/07/2011 4:20:00', '8/07/2011 4:21:00', '8/07/2011 4:22:00', '8/07/2011 4:23:00', '8/07/2011 4:24:00', '8/07/2011 4:25:00', '8/07/2011 4:26:00', '8/07/2011 4:27:00', '8/07/2011 4:28:00', '8/07/2011 4:29:00', '8/07/2011 4:30:00', '8/07/2011 4:31:00', '8/07/2011 4:32:00', '8/07/2011 4:33:00', '8/07/2011 4:34:00', '8/07/2011 4:35:00', '8/07/2011 4:36:00', '8/07/2011 4:37:00', '8/07/2011 4:38:00', '8/07/2011 4:39:00', '8/07/2011 4:40:00', '8/07/2011 4:41:00', '8/07/2011 4:42:00', '8/07/2011 4:43:00', '8/07/2011 4:44:00', '8/07/2011 4:45:00', '8/07/2011 4:46:00', '8/07/2011 4:47:00', '8/07/2011 4:48:00', '8/07/2011 4:49:00', '8/07/2011 4:50:00', '8/07/2011 4:51:00', '8/07/2011 4:52:00', '8/07/2011 4:53:00', '8/07/2011 4:54:00', '8/07/2011 4:55:00', '8/07/2011 4:56:00', '8/07/2011 4:57:00', '8/07/2011 4:58:00', 
'8/07/2011 4:59:00', '8/07/2011 5:00:00', '8/07/2011 5:01:00', '8/07/2011 5:02:00', '8/07/2011 5:03:00', '8/07/2011 5:04:00', '8/07/2011 5:05:00', '8/07/2011 5:06:00', '8/07/2011 5:07:00', '8/07/2011 5:08:00', '8/07/2011 5:09:00', '8/07/2011 5:10:00', '8/07/2011 5:11:00', '8/07/2011 5:12:00', '8/07/2011 5:13:00', '8/07/2011 5:14:00', '8/07/2011 5:15:00', '8/07/2011 5:16:00', '8/07/2011 5:17:00', '8/07/2011 5:18:00', '8/07/2011 5:19:00', '8/07/2011 5:20:00', '8/07/2011 5:21:00', '8/07/2011 5:22:00', '8/07/2011 5:23:00', '8/07/2011 5:24:00', '8/07/2011 5:25:00', '8/07/2011 5:26:00', '8/07/2011 5:27:00', '8/07/2011 5:28:00', '8/07/2011 5:29:00', '8/07/2011 5:30:00', '8/07/2011 5:31:00', '8/07/2011 5:32:00', '8/07/2011 5:33:00', '8/07/2011 5:34:00', '8/07/2011 5:35:00', '8/07/2011 5:36:00', '8/07/2011 5:37:00', '8/07/2011 5:38:00', '8/07/2011 5:39:00', '8/07/2011 5:40:00', '8/07/2011 5:41:00', '8/07/2011 5:42:00', '8/07/2011 5:43:00', '8/07/2011 5:44:00', '8/07/2011 5:45:00', '8/07/2011 5:46:00', '8/07/2011 5:47:00', '8/07/2011 5:48:00', '8/07/2011 5:49:00', '8/07/2011 5:50:00', '8/07/2011 5:51:00', '8/07/2011 5:52:00', '8/07/2011 5:53:00', '8/07/2011 5:54:00', '8/07/2011 5:55:00', '8/07/2011 5:56:00', '8/07/2011 5:57:00', '8/07/2011 5:58:00', '8/07/2011 5:59:00', '8/07/2011 6:00:00', '8/07/2011 6:01:00', '8/07/2011 6:02:00', '8/07/2011 6:03:00', '8/07/2011 6:04:00', '8/07/2011 6:05:00', '8/07/2011 6:06:00', '8/07/2011 6:07:00', '8/07/2011 6:08:00', '8/07/2011 6:09:00', '8/07/2011 6:10:00', '8/07/2011 6:11:00', '8/07/2011 6:12:00', '8/07/2011 6:13:00', '8/07/2011 6:14:00', '8/07/2011 6:15:00', '8/07/2011 6:16:00', '8/07/2011 6:17:00', '8/07/2011 6:18:00', '8/07/2011 6:19:00', '8/07/2011 6:20:00', '8/07/2011 6:21:00', '8/07/2011 6:22:00', '8/07/2011 6:23:00', '8/07/2011 6:24:00', '8/07/2011 6:25:00', '8/07/2011 6:26:00', '8/07/2011 6:27:00', '8/07/2011 6:28:00', '8/07/2011 6:29:00', '8/07/2011 6:30:00', '8/07/2011 6:31:00', '8/07/2011 6:32:00', '8/07/2011 6:33:00', 
'8/07/2011 6:34:00', '8/07/2011 6:35:00', '8/07/2011 6:36:00', '8/07/2011 6:37:00', '8/07/2011 6:38:00', '8/07/2011 6:39:00', '8/07/2011 6:40:00', '8/07/2011 6:41:00', '8/07/2011 6:42:00', '8/07/2011 6:43:00', '8/07/2011 6:44:00', '8/07/2011 6:45:00', '8/07/2011 6:46:00', '8/07/2011 6:47:00', '8/07/2011 6:48:00', '8/07/2011 6:49:00', '8/07/2011 6:50:00', '8/07/2011 6:51:00', '8/07/2011 6:52:00', '8/07/2011 6:53:00', '8/07/2011 6:54:00', '8/07/2011 6:55:00', '8/07/2011 6:56:00', '8/07/2011 6:57:00', '8/07/2011 6:58:00', '8/07/2011 6:59:00', '8/07/2011 7:00:00', '8/07/2011 7:01:00', '8/07/2011 7:02:00', '8/07/2011 7:03:00', '8/07/2011 7:04:00', '8/07/2011 7:05:00', '8/07/2011 7:06:00', '8/07/2011 7:07:00', '8/07/2011 7:08:00', '8/07/2011 7:09:00', '8/07/2011 7:10:00', '8/07/2011 7:11:00', '8/07/2011 7:12:00', '8/07/2011 7:13:00', '8/07/2011 7:14:00', '8/07/2011 7:15:00', '8/07/2011 7:16:00', '8/07/2011 7:17:00', '8/07/2011 7:18:00', '8/07/2011 7:19:00', '8/07/2011 7:20:00', '8/07/2011 7:21:00', '8/07/2011 7:22:00', '8/07/2011 7:23:00', '8/07/2011 7:24:00', '8/07/2011 7:25:00', '8/07/2011 7:26:00', '8/07/2011 7:27:00', '8/07/2011 7:28:00', '8/07/2011 7:29:00', '8/07/2011 7:30:00', '8/07/2011 7:31:00', '8/07/2011 7:32:00', '8/07/2011 7:33:00', '8/07/2011 7:34:00', '8/07/2011 7:35:00', '8/07/2011 7:36:00', '8/07/2011 7:37:00', '8/07/2011 7:38:00', '8/07/2011 7:39:00', '8/07/2011 7:40:00', '8/07/2011 7:41:00', '8/07/2011 7:42:00', '8/07/2011 7:43:00', '8/07/2011 7:44:00', '8/07/2011 7:45:00', '8/07/2011 7:46:00', '8/07/2011 7:47:00', '8/07/2011 7:48:00', '8/07/2011 7:49:00', '8/07/2011 7:50:00', '8/07/2011 7:51:00', '8/07/2011 7:52:00', '8/07/2011 7:53:00', '8/07/2011 7:54:00', '8/07/2011 7:55:00', '8/07/2011 7:56:00', '8/07/2011 7:57:00', '8/07/2011 7:58:00', '8/07/2011 7:59:00', '8/07/2011 8:00:00', '8/07/2011 8:01:00', '8/07/2011 8:02:00', '8/07/2011 8:03:00', '8/07/2011 8:04:00', '8/07/2011 8:05:00', '8/07/2011 8:06:00', '8/07/2011 8:07:00', '8/07/2011 8:08:00', 
'8/07/2011 8:09:00', '8/07/2011 8:10:00', '8/07/2011 8:11:00', '8/07/2011 8:12:00', '8/07/2011 8:13:00', '8/07/2011 8:14:00', '8/07/2011 8:15:00', '8/07/2011 8:16:00', '8/07/2011 8:17:00', '8/07/2011 8:18:00', '8/07/2011 8:19:00', '8/07/2011 8:20:00', '8/07/2011 8:21:00', '8/07/2011 8:22:00', '8/07/2011 8:23:00', '8/07/2011 8:24:00', '8/07/2011 8:25:00', '8/07/2011 8:26:00', '8/07/2011 8:27:00', '8/07/2011 8:28:00', '8/07/2011 8:29:00', '8/07/2011 8:30:00', '8/07/2011 8:31:00', '8/07/2011 8:32:00', '8/07/2011 8:33:00', '8/07/2011 8:34:00', '8/07/2011 8:35:00', '8/07/2011 8:36:00', '8/07/2011 8:37:00', '8/07/2011 8:38:00', '8/07/2011 8:39:00', '8/07/2011 8:40:00', '8/07/2011 8:41:00', '8/07/2011 8:42:00', '8/07/2011 8:43:00', '8/07/2011 8:44:00', '8/07/2011 8:45:00', '8/07/2011 8:46:00', '8/07/2011 8:47:00', '8/07/2011 8:48:00', '8/07/2011 8:49:00', '8/07/2011 8:50:00', '8/07/2011 8:51:00', '8/07/2011 8:52:00', '8/07/2011 8:53:00', '8/07/2011 8:54:00', '8/07/2011 8:55:00', '8/07/2011 8:56:00', '8/07/2011 8:57:00', '8/07/2011 8:58:00', '8/07/2011 8:59:00', '8/07/2011 9:00:00', '8/07/2011 9:01:00', '8/07/2011 9:02:00', '8/07/2011 9:03:00', '8/07/2011 9:04:00', '8/07/2011 9:05:00', '8/07/2011 9:06:00', '8/07/2011 9:07:00', '8/07/2011 9:08:00', '8/07/2011 9:09:00', '8/07/2011 9:10:00', '8/07/2011 9:11:00', '8/07/2011 9:12:00', '8/07/2011 9:13:00', '8/07/2011 9:14:00', '8/07/2011 9:15:00', '8/07/2011 9:16:00', '8/07/2011 9:17:00', '8/07/2011 9:18:00', '8/07/2011 9:19:00', '8/07/2011 9:20:00', '8/07/2011 9:21:00', '8/07/2011 9:22:00', '8/07/2011 9:23:00', '8/07/2011 9:24:00', '8/07/2011 9:25:00', '8/07/2011 9:26:00', '8/07/2011 9:27:00', '8/07/2011 9:28:00', '8/07/2011 9:29:00', '8/07/2011 9:30:00', '8/07/2011 9:31:00', '8/07/2011 9:32:00', '8/07/2011 9:33:00', '8/07/2011 9:34:00', '8/07/2011 9:35:00', '8/07/2011 9:36:00', '8/07/2011 9:37:00', '8/07/2011 9:38:00', '8/07/2011 9:39:00', '8/07/2011 9:40:00', '8/07/2011 9:41:00', '8/07/2011 9:42:00', '8/07/2011 9:43:00', 
'8/07/2011 9:44:00', '8/07/2011 9:45:00', '8/07/2011 9:46:00', '8/07/2011 9:47:00', '8/07/2011 9:48:00', '8/07/2011 9:49:00', '8/07/2011 9:50:00', '8/07/2011 9:51:00', '8/07/2011 9:52:00', '8/07/2011 9:53:00', '8/07/2011 9:54:00', '8/07/2011 9:55:00', '8/07/2011 9:56:00', '8/07/2011 9:57:00', '8/07/2011 9:58:00', '8/07/2011 9:59:00', '8/07/2011 10:00:00', '8/07/2011 10:01:00', '8/07/2011 10:02:00', '8/07/2011 10:03:00', '8/07/2011 10:04:00', '8/07/2011 10:05:00', '8/07/2011 10:06:00', '8/07/2011 10:07:00', '8/07/2011 10:08:00', '8/07/2011 10:09:00', '8/07/2011 10:10:00', '8/07/2011 10:11:00', '8/07/2011 10:12:00', '8/07/2011 10:13:00', '8/07/2011 10:14:00', '8/07/2011 10:15:00', '8/07/2011 10:16:00', '8/07/2011 10:17:00', '8/07/2011 10:18:00', '8/07/2011 10:19:00', '8/07/2011 10:20:00', '8/07/2011 10:21:00', '8/07/2011 10:22:00', '8/07/2011 10:23:00', '8/07/2011 10:24:00', '8/07/2011 10:25:00', '8/07/2011 10:26:00', '8/07/2011 10:27:00', '8/07/2011 10:28:00', '8/07/2011 10:29:00', '8/07/2011 10:30:00', '8/07/2011 10:31:00', '8/07/2011 10:32:00', '8/07/2011 10:33:00', '8/07/2011 10:34:00', '8/07/2011 10:35:00', '8/07/2011 10:36:00', '8/07/2011 10:37:00', '8/07/2011 10:38:00', '8/07/2011 10:39:00', '8/07/2011 10:40:00', '8/07/2011 10:41:00', '8/07/2011 10:42:00', '8/07/2011 10:43:00', '8/07/2011 10:44:00', '8/07/2011 10:45:00', '8/07/2011 10:46:00', '8/07/2011 10:47:00', '8/07/2011 10:48:00', '8/07/2011 10:49:00', '8/07/2011 10:50:00', '8/07/2011 10:51:00', '8/07/2011 10:52:00', '8/07/2011 10:53:00', '8/07/2011 10:54:00', '8/07/2011 10:55:00', '8/07/2011 10:56:00', '8/07/2011 10:57:00', '8/07/2011 10:58:00', '8/07/2011 10:59:00', '8/07/2011 11:00:00', '8/07/2011 11:01:00', '8/07/2011 11:02:00', '8/07/2011 11:03:00', '8/07/2011 11:04:00', '8/07/2011 11:05:00', '8/07/2011 11:06:00', '8/07/2011 11:07:00', '8/07/2011 11:08:00', '8/07/2011 11:09:00', '8/07/2011 11:10:00', '8/07/2011 11:11:00', '8/07/2011 11:12:00', '8/07/2011 11:13:00', '8/07/2011 11:14:00', '8/07/2011 
11:15:00', '8/07/2011 11:16:00', '8/07/2011 11:17:00', '8/07/2011 11:18:00', '8/07/2011 11:19:00', '8/07/2011 11:20:00', '8/07/2011 11:21:00', '8/07/2011 11:22:00', '8/07/2011 11:23:00', '8/07/2011 11:24:00', '8/07/2011 11:25:00', '8/07/2011 11:26:00', '8/07/2011 11:27:00', '8/07/2011 11:28:00', '8/07/2011 11:29:00', '8/07/2011 11:30:00', '8/07/2011 11:31:00', '8/07/2011 11:32:00', '8/07/2011 11:33:00', '8/07/2011 11:34:00', '8/07/2011 11:35:00', '8/07/2011 11:36:00', '8/07/2011 11:37:00', '8/07/2011 11:38:00', '8/07/2011 11:39:00', '8/07/2011 11:40:00', '8/07/2011 11:41:00', '8/07/2011 11:42:00', '8/07/2011 11:43:00', '8/07/2011 11:44:00', '8/07/2011 11:45:00', '8/07/2011 11:46:00', '8/07/2011 11:47:00', '8/07/2011 11:48:00', '8/07/2011 11:49:00', '8/07/2011 11:50:00', '8/07/2011 11:51:00', '8/07/2011 11:52:00', '8/07/2011 11:53:00', '8/07/2011 11:54:00', '8/07/2011 11:55:00', '8/07/2011 11:56:00', '8/07/2011 11:57:00', '8/07/2011 11:58:00', '8/07/2011 11:59:00', '8/07/2011 12:00:00', '8/07/2011 12:01:00', '8/07/2011 12:02:00', '8/07/2011 12:03:00', '8/07/2011 12:04:00', '8/07/2011 12:05:00', '8/07/2011 12:06:00', '8/07/2011 12:07:00', '8/07/2011 12:08:00', '8/07/2011 12:09:00', '8/07/2011 12:10:00', '8/07/2011 12:11:00', '8/07/2011 12:12:00', '8/07/2011 12:13:00', '8/07/2011 12:14:00', '8/07/2011 12:15:00', '8/07/2011 12:16:00', '8/07/2011 12:17:00', '8/07/2011 12:18:00', '8/07/2011 12:19:00', '8/07/2011 12:20:00', '8/07/2011 12:21:00', '8/07/2011 12:22:00', '8/07/2011 12:23:00', '8/07/2011 12:24:00', '8/07/2011 12:25:00', '8/07/2011 12:26:00', '8/07/2011 12:27:00', '8/07/2011 12:28:00', '8/07/2011 12:29:00', '8/07/2011 12:30:00', '8/07/2011 12:31:00', '8/07/2011 12:32:00', '8/07/2011 12:33:00', '8/07/2011 12:34:00', '8/07/2011 12:35:00', '8/07/2011 12:36:00', '8/07/2011 12:37:00', '8/07/2011 12:38:00', '8/07/2011 12:39:00', '8/07/2011 12:40:00', '8/07/2011 12:41:00', '8/07/2011 12:42:00', '8/07/2011 12:43:00', '8/07/2011 12:44:00', '8/07/2011 12:45:00', 
'8/07/2011 12:46:00', '8/07/2011 12:47:00', '8/07/2011 12:48:00', '8/07/2011 12:49:00', '8/07/2011 12:50:00', '8/07/2011 12:51:00', '8/07/2011 12:52:00', '8/07/2011 12:53:00', '8/07/2011 12:54:00', '8/07/2011 12:55:00', '8/07/2011 12:56:00', '8/07/2011 12:57:00', '8/07/2011 12:58:00', '8/07/2011 12:59:00', '8/07/2011 13:00:00', '8/07/2011 13:01:00', '8/07/2011 13:02:00', '8/07/2011 13:03:00', '8/07/2011 13:04:00', '8/07/2011 13:05:00', '8/07/2011 13:06:00', '8/07/2011 13:07:00', '8/07/2011 13:08:00', '8/07/2011 13:09:00', '8/07/2011 13:10:00', '8/07/2011 13:11:00', '8/07/2011 13:12:00', '8/07/2011 13:13:00', '8/07/2011 13:14:00', '8/07/2011 13:15:00', '8/07/2011 13:16:00', '8/07/2011 13:17:00', '8/07/2011 13:18:00', '8/07/2011 13:19:00', '8/07/2011 13:20:00', '8/07/2011 13:21:00', '8/07/2011 13:22:00', '8/07/2011 13:23:00', '8/07/2011 13:24:00', '8/07/2011 13:25:00', '8/07/2011 13:26:00', '8/07/2011 13:27:00', '8/07/2011 13:28:00', '8/07/2011 13:29:00', '8/07/2011 13:30:00', '8/07/2011 13:31:00', '8/07/2011 13:32:00', '8/07/2011 13:33:00', '8/07/2011 13:34:00', '8/07/2011 13:35:00', '8/07/2011 13:36:00', '8/07/2011 13:37:00', '8/07/2011 13:38:00', '8/07/2011 13:39:00', '8/07/2011 13:40:00', '8/07/2011 13:41:00', '8/07/2011 13:42:00', '8/07/2011 13:43:00', '8/07/2011 13:44:00', '8/07/2011 13:45:00', '8/07/2011 13:46:00', '8/07/2011 13:47:00', '8/07/2011 13:48:00', '8/07/2011 13:49:00', '8/07/2011 13:50:00', '8/07/2011 13:51:00', '8/07/2011 13:52:00', '8/07/2011 13:53:00', '8/07/2011 13:54:00', '8/07/2011 13:55:00', '8/07/2011 13:56:00', '8/07/2011 13:57:00', '8/07/2011 13:58:00', '8/07/2011 13:59:00', '8/07/2011 14:00:00', '8/07/2011 14:01:00', '8/07/2011 14:02:00', '8/07/2011 14:03:00', '8/07/2011 14:04:00', '8/07/2011 14:05:00', '8/07/2011 14:06:00', '8/07/2011 14:07:00', '8/07/2011 14:08:00', '8/07/2011 14:09:00', '8/07/2011 14:10:00', '8/07/2011 14:11:00', '8/07/2011 14:12:00', '8/07/2011 14:13:00', '8/07/2011 14:14:00', '8/07/2011 14:15:00', '8/07/2011 
14:16:00', '8/07/2011 14:17:00', '8/07/2011 14:18:00', '8/07/2011 14:19:00', '8/07/2011 14:20:00', '8/07/2011 14:21:00', '8/07/2011 14:22:00', '8/07/2011 14:23:00', '8/07/2011 14:24:00', '8/07/2011 14:25:00', '8/07/2011 14:26:00', '8/07/2011 14:27:00', '8/07/2011 14:28:00', '8/07/2011 14:29:00', '8/07/2011 14:30:00', '8/07/2011 14:31:00', '8/07/2011 14:32:00', '8/07/2011 14:33:00', '8/07/2011 14:34:00', '8/07/2011 14:35:00', '8/07/2011 14:36:00', '8/07/2011 14:37:00', '8/07/2011 14:38:00', '8/07/2011 14:39:00', '8/07/2011 14:40:00', '8/07/2011 14:41:00', '8/07/2011 14:42:00', '8/07/2011 14:43:00', '8/07/2011 14:44:00', '8/07/2011 14:45:00', '8/07/2011 14:46:00', '8/07/2011 14:47:00', '8/07/2011 14:48:00', '8/07/2011 14:49:00', '8/07/2011 14:50:00', '8/07/2011 14:51:00', '8/07/2011 14:52:00', '8/07/2011 14:53:00', '8/07/2011 14:54:00', '8/07/2011 14:55:00', '8/07/2011 14:56:00', '8/07/2011 14:57:00', '8/07/2011 14:58:00', '8/07/2011 14:59:00', '8/07/2011 15:00:00', '8/07/2011 15:01:00', '8/07/2011 15:02:00', '8/07/2011 15:03:00', '8/07/2011 15:04:00', '8/07/2011 15:05:00', '8/07/2011 15:06:00', '8/07/2011 15:07:00', '8/07/2011 15:08:00', '8/07/2011 15:09:00', '8/07/2011 15:10:00', '8/07/2011 15:11:00', '8/07/2011 15:12:00', '8/07/2011 15:13:00', '8/07/2011 15:14:00', '8/07/2011 15:15:00', '8/07/2011 15:16:00', '8/07/2011 15:17:00', '8/07/2011 15:18:00', '8/07/2011 15:19:00', '8/07/2011 15:20:00', '8/07/2011 15:21:00', '8/07/2011 15:22:00', '8/07/2011 15:23:00', '8/07/2011 15:24:00', '8/07/2011 15:25:00', '8/07/2011 15:26:00', '8/07/2011 15:27:00', '8/07/2011 15:28:00', '8/07/2011 15:29:00', '8/07/2011 15:30:00', '8/07/2011 15:31:00', '8/07/2011 15:32:00', '8/07/2011 15:33:00', '8/07/2011 15:34:00', '8/07/2011 15:35:00', '8/07/2011 15:36:00', '8/07/2011 15:37:00', '8/07/2011 15:38:00', '8/07/2011 15:39:00', '8/07/2011 15:40:00', '8/07/2011 15:41:00', '8/07/2011 15:42:00', '8/07/2011 15:43:00', '8/07/2011 15:44:00', '8/07/2011 15:45:00', '8/07/2011 15:46:00', 
'8/07/2011 15:47:00', '8/07/2011 15:48:00', '8/07/2011 15:49:00', '8/07/2011 15:50:00', '8/07/2011 15:51:00', '8/07/2011 15:52:00', '8/07/2011 15:53:00', '8/07/2011 15:54:00', '8/07/2011 15:55:00', '8/07/2011 15:56:00', '8/07/2011 15:57:00', '8/07/2011 15:58:00', '8/07/2011 15:59:00', '8/07/2011 16:00:00', '8/07/2011 16:01:00', '8/07/2011 16:02:00', '8/07/2011 16:03:00', '8/07/2011 16:04:00', '8/07/2011 16:05:00', '8/07/2011 16:06:00', '8/07/2011 16:07:00', '8/07/2011 16:08:00', '8/07/2011 16:09:00', '8/07/2011 16:10:00', '8/07/2011 16:11:00', '8/07/2011 16:12:00', '8/07/2011 16:13:00', '8/07/2011 16:14:00', '8/07/2011 16:15:00', '8/07/2011 16:16:00', '8/07/2011 16:17:00', '8/07/2011 16:18:00', '8/07/2011 16:19:00', '8/07/2011 16:20:00', '8/07/2011 16:21:00', '8/07/2011 16:22:00', '8/07/2011 16:23:00', '8/07/2011 16:24:00', '8/07/2011 16:25:00', '8/07/2011 16:26:00', '8/07/2011 16:27:00', '8/07/2011 16:28:00', '8/07/2011 16:29:00', '8/07/2011 16:30:00', '8/07/2011 16:31:00', '8/07/2011 16:32:00', '8/07/2011 16:33:00', '8/07/2011 16:34:00', '8/07/2011 16:35:00', '8/07/2011 16:36:00', '8/07/2011 16:37:00', '8/07/2011 16:38:00', '8/07/2011 16:39:00', '8/07/2011 16:40:00', '8/07/2011 16:41:00', '8/07/2011 16:42:00', '8/07/2011 16:43:00', '8/07/2011 16:44:00', '8/07/2011 16:45:00', '8/07/2011 16:46:00', '8/07/2011 16:47:00', '8/07/2011 16:48:00', '8/07/2011 16:49:00', '8/07/2011 16:50:00', '8/07/2011 16:51:00', '8/07/2011 16:52:00', '8/07/2011 16:53:00', '8/07/2011 16:54:00', '8/07/2011 16:55:00', '8/07/2011 16:56:00', '8/07/2011 16:57:00', '8/07/2011 16:58:00', '8/07/2011 16:59:00', '8/07/2011 17:00:00', '8/07/2011 17:01:00', '8/07/2011 17:02:00', '8/07/2011 17:03:00', '8/07/2011 17:04:00', '8/07/2011 17:05:00', '8/07/2011 17:06:00', '8/07/2011 17:07:00', '8/07/2011 17:08:00', '8/07/2011 17:09:00', '8/07/2011 17:10:00', '8/07/2011 17:11:00', '8/07/2011 17:12:00', '8/07/2011 17:13:00', '8/07/2011 17:14:00', '8/07/2011 17:15:00', '8/07/2011 17:16:00', '8/07/2011 
17:17:00', '8/07/2011 17:18:00', '8/07/2011 17:19:00', '8/07/2011 17:20:00', '8/07/2011 17:21:00', '8/07/2011 17:22:00', '8/07/2011 17:23:00', '8/07/2011 17:24:00', '8/07/2011 17:25:00', '8/07/2011 17:26:00', '8/07/2011 17:27:00', '8/07/2011 17:28:00', '8/07/2011 17:29:00', '8/07/2011 17:30:00', '8/07/2011 17:31:00', '8/07/2011 17:32:00', '8/07/2011 17:33:00', '8/07/2011 17:34:00', '8/07/2011 17:35:00', '8/07/2011 17:36:00', '8/07/2011 17:37:00', '8/07/2011 17:38:00', '8/07/2011 17:39:00', '8/07/2011 17:40:00', '8/07/2011 17:41:00', '8/07/2011 17:42:00', '8/07/2011 17:43:00', '8/07/2011 17:44:00', '8/07/2011 17:45:00', '8/07/2011 17:46:00', '8/07/2011 17:47:00', '8/07/2011 17:48:00', '8/07/2011 17:49:00', '8/07/2011 17:50:00', '8/07/2011 17:51:00', '8/07/2011 17:52:00', '8/07/2011 17:53:00', '8/07/2011 17:54:00', '8/07/2011 17:55:00', '8/07/2011 17:56:00', '8/07/2011 17:57:00', '8/07/2011 17:58:00', '8/07/2011 17:59:00', '8/07/2011 18:00:00', '8/07/2011 18:01:00', '8/07/2011 18:02:00', '8/07/2011 18:03:00', '8/07/2011 18:04:00', '8/07/2011 18:05:00', '8/07/2011 18:06:00', '8/07/2011 18:07:00', '8/07/2011 18:08:00', '8/07/2011 18:09:00', '8/07/2011 18:10:00', '8/07/2011 18:11:00', '8/07/2011 18:12:00', '8/07/2011 18:13:00', '8/07/2011 18:14:00', '8/07/2011 18:15:00', '8/07/2011 18:16:00', '8/07/2011 18:17:00', '8/07/2011 18:18:00', '8/07/2011 18:19:00', '8/07/2011 18:20:00', '8/07/2011 18:21:00', '8/07/2011 18:22:00', '8/07/2011 18:23:00', '8/07/2011 18:24:00', '8/07/2011 18:25:00', '8/07/2011 18:26:00', '8/07/2011 18:27:00', '8/07/2011 18:28:00', '8/07/2011 18:29:00', '8/07/2011 18:30:00', '8/07/2011 18:31:00', '8/07/2011 18:32:00', '8/07/2011 18:33:00', '8/07/2011 18:34:00', '8/07/2011 18:35:00', '8/07/2011 18:36:00', '8/07/2011 18:37:00', '8/07/2011 18:38:00', '8/07/2011 18:39:00', '8/07/2011 18:40:00', '8/07/2011 18:41:00', '8/07/2011 18:42:00', '8/07/2011 18:43:00', '8/07/2011 18:44:00', '8/07/2011 18:45:00', '8/07/2011 18:46:00', '8/07/2011 18:47:00', 
'8/07/2011 18:48:00', '8/07/2011 18:49:00', '8/07/2011 18:50:00', '8/07/2011 18:51:00', '8/07/2011 18:52:00', '8/07/2011 18:53:00', '8/07/2011 18:54:00', '8/07/2011 18:55:00', '8/07/2011 18:56:00', '8/07/2011 18:57:00', '8/07/2011 18:58:00', '8/07/2011 18:59:00', '8/07/2011 19:00:00', '8/07/2011 19:01:00', '8/07/2011 19:02:00', '8/07/2011 19:03:00', '8/07/2011 19:04:00', '8/07/2011 19:05:00', '8/07/2011 19:06:00', '8/07/2011 19:07:00', '8/07/2011 19:08:00', '8/07/2011 19:09:00', '8/07/2011 19:10:00', '8/07/2011 19:11:00', '8/07/2011 19:12:00', '8/07/2011 19:13:00', '8/07/2011 19:14:00', '8/07/2011 19:15:00', '8/07/2011 19:16:00', '8/07/2011 19:17:00', '8/07/2011 19:18:00', '8/07/2011 19:19:00', '8/07/2011 19:20:00', '8/07/2011 19:21:00', '8/07/2011 19:22:00', '8/07/2011 19:23:00', '8/07/2011 19:24:00', '8/07/2011 19:25:00', '8/07/2011 19:26:00', '8/07/2011 19:27:00', '8/07/2011 19:28:00', '8/07/2011 19:29:00', '8/07/2011 19:30:00', '8/07/2011 19:31:00', '8/07/2011 19:32:00', '8/07/2011 19:33:00', '8/07/2011 19:34:00', '8/07/2011 19:35:00', '8/07/2011 19:36:00', '8/07/2011 19:37:00', '8/07/2011 19:38:00', '8/07/2011 19:39:00', '8/07/2011 19:40:00', '8/07/2011 19:41:00', '8/07/2011 19:42:00', '8/07/2011 19:43:00', '8/07/2011 19:44:00', '8/07/2011 19:45:00', '8/07/2011 19:46:00', '8/07/2011 19:47:00', '8/07/2011 19:48:00', '8/07/2011 19:49:00', '8/07/2011 19:50:00', '8/07/2011 19:51:00', '8/07/2011 19:52:00', '8/07/2011 19:53:00', '8/07/2011 19:54:00', '8/07/2011 19:55:00', '8/07/2011 19:56:00', '8/07/2011 19:57:00', '8/07/2011 19:58:00', '8/07/2011 19:59:00', '8/07/2011 20:00:00', '8/07/2011 20:01:00', '8/07/2011 20:02:00', '8/07/2011 20:03:00', '8/07/2011 20:04:00', '8/07/2011 20:05:00', '8/07/2011 20:06:00', '8/07/2011 20:07:00', '8/07/2011 20:08:00', '8/07/2011 20:09:00', '8/07/2011 20:10:00', '8/07/2011 20:11:00', '8/07/2011 20:12:00', '8/07/2011 20:13:00', '8/07/2011 20:14:00', '8/07/2011 20:15:00', '8/07/2011 20:16:00', '8/07/2011 20:17:00', '8/07/2011 
20:18:00', '8/07/2011 20:19:00', '8/07/2011 20:20:00', '8/07/2011 20:21:00', '8/07/2011 20:22:00', '8/07/2011 20:23:00', '8/07/2011 20:24:00', '8/07/2011 20:25:00', '8/07/2011 20:26:00', '8/07/2011 20:27:00', '8/07/2011 20:28:00', '8/07/2011 20:29:00', '8/07/2011 20:30:00', '8/07/2011 20:31:00', '8/07/2011 20:32:00', '8/07/2011 20:33:00', '8/07/2011 20:34:00', '8/07/2011 20:35:00', '8/07/2011 20:36:00', '8/07/2011 20:37:00', '8/07/2011 20:38:00', '8/07/2011 20:39:00', '8/07/2011 20:40:00', '8/07/2011 20:41:00', '8/07/2011 20:42:00', '8/07/2011 20:43:00', '8/07/2011 20:44:00', '8/07/2011 20:45:00', '8/07/2011 20:46:00', '8/07/2011 20:47:00', '8/07/2011 20:48:00', '8/07/2011 20:49:00', '8/07/2011 20:50:00', '8/07/2011 20:51:00', '8/07/2011 20:52:00', '8/07/2011 20:53:00', '8/07/2011 20:54:00', '8/07/2011 20:55:00', '8/07/2011 20:56:00', '8/07/2011 20:57:00', '8/07/2011 20:58:00', '8/07/2011 20:59:00', '8/07/2011 21:00:00', '8/07/2011 21:01:00', '8/07/2011 21:02:00', '8/07/2011 21:03:00', '8/07/2011 21:04:00', '8/07/2011 21:05:00', '8/07/2011 21:06:00', '8/07/2011 21:07:00', '8/07/2011 21:08:00', '8/07/2011 21:09:00', '8/07/2011 21:10:00', '8/07/2011 21:11:00', '8/07/2011 21:12:00', '8/07/2011 21:13:00', '8/07/2011 21:14:00', '8/07/2011 21:15:00', '8/07/2011 21:16:00', '8/07/2011 21:17:00', '8/07/2011 21:18:00', '8/07/2011 21:19:00', '8/07/2011 21:20:00', '8/07/2011 21:21:00', '8/07/2011 21:22:00', '8/07/2011 21:23:00', '8/07/2011 21:24:00', '8/07/2011 21:25:00', '8/07/2011 21:26:00', '8/07/2011 21:27:00', '8/07/2011 21:28:00', '8/07/2011 21:29:00', '8/07/2011 21:30:00', '8/07/2011 21:31:00', '8/07/2011 21:32:00', '8/07/2011 21:33:00', '8/07/2011 21:34:00', '8/07/2011 21:35:00', '8/07/2011 21:36:00', '8/07/2011 21:37:00', '8/07/2011 21:38:00', '8/07/2011 21:39:00', '8/07/2011 21:40:00', '8/07/2011 21:41:00', '8/07/2011 21:42:00', '8/07/2011 21:43:00', '8/07/2011 21:44:00', '8/07/2011 21:45:00', '8/07/2011 21:46:00', '8/07/2011 21:47:00', '8/07/2011 21:48:00', 
'8/07/2011 21:49:00', '8/07/2011 21:50:00', '8/07/2011 21:51:00', '8/07/2011 21:52:00', '8/07/2011 21:53:00', '8/07/2011 21:54:00', '8/07/2011 21:55:00', '8/07/2011 21:56:00', '8/07/2011 21:57:00', '8/07/2011 21:58:00', '8/07/2011 21:59:00', '8/07/2011 22:00:00', '8/07/2011 22:01:00', '8/07/2011 22:02:00', '8/07/2011 22:03:00', '8/07/2011 22:04:00', '8/07/2011 22:05:00', '8/07/2011 22:06:00', '8/07/2011 22:07:00', '8/07/2011 22:08:00', '8/07/2011 22:09:00', '8/07/2011 22:10:00', '8/07/2011 22:11:00', '8/07/2011 22:12:00', '8/07/2011 22:13:00', '8/07/2011 22:14:00', '8/07/2011 22:15:00', '8/07/2011 22:16:00', '8/07/2011 22:17:00', '8/07/2011 22:18:00', '8/07/2011 22:19:00', '8/07/2011 22:20:00', '8/07/2011 22:21:00', '8/07/2011 22:22:00', '8/07/2011 22:23:00', '8/07/2011 22:24:00', '8/07/2011 22:25:00', '8/07/2011 22:26:00', '8/07/2011 22:27:00', '8/07/2011 22:28:00', '8/07/2011 22:29:00', '8/07/2011 22:30:00', '8/07/2011 22:31:00', '8/07/2011 22:32:00', '8/07/2011 22:33:00', '8/07/2011 22:34:00', '8/07/2011 22:35:00', '8/07/2011 22:36:00', '8/07/2011 22:37:00', '8/07/2011 22:38:00', '8/07/2011 22:39:00', '8/07/2011 22:40:00', '8/07/2011 22:41:00', '8/07/2011 22:42:00', '8/07/2011 22:43:00', '8/07/2011 22:44:00', '8/07/2011 22:45:00', '8/07/2011 22:46:00', '8/07/2011 22:47:00', '8/07/2011 22:48:00', '8/07/2011 22:49:00', '8/07/2011 22:50:00', '8/07/2011 22:51:00', '8/07/2011 22:52:00', '8/07/2011 22:53:00', '8/07/2011 22:54:00', '8/07/2011 22:55:00', '8/07/2011 22:56:00', '8/07/2011 22:57:00', '8/07/2011 22:58:00', '8/07/2011 22:59:00', '8/07/2011 23:00:00', '8/07/2011 23:01:00', '8/07/2011 23:02:00', '8/07/2011 23:03:00', '8/07/2011 23:04:00', '8/07/2011 23:05:00', '8/07/2011 23:06:00', '8/07/2011 23:07:00', '8/07/2011 23:08:00', '8/07/2011 23:09:00', '8/07/2011 23:10:00', '8/07/2011 23:11:00', '8/07/2011 23:12:00', '8/07/2011 23:13:00', '8/07/2011 23:14:00', '8/07/2011 23:15:00', '8/07/2011 23:16:00', '8/07/2011 23:17:00', '8/07/2011 23:18:00', '8/07/2011 
23:19:00', '8/07/2011 23:20:00', '8/07/2011 23:21:00', '8/07/2011 23:22:00', '8/07/2011 23:23:00', '8/07/2011 23:24:00', '8/07/2011 23:25:00', '8/07/2011 23:26:00', '8/07/2011 23:27:00', '8/07/2011 23:28:00', '8/07/2011 23:29:00', '8/07/2011 23:30:00', '8/07/2011 23:31:00', '8/07/2011 23:32:00', '8/07/2011 23:33:00', '8/07/2011 23:34:00', '8/07/2011 23:35:00', '8/07/2011 23:36:00', '8/07/2011 23:37:00', '8/07/2011 23:38:00', '8/07/2011 23:39:00', '8/07/2011 23:40:00', '8/07/2011 23:41:00', '8/07/2011 23:42:00', '8/07/2011 23:43:00', '8/07/2011 23:44:00', '8/07/2011 23:45:00', '8/07/2011 23:46:00', '8/07/2011 23:47:00', '8/07/2011 23:48:00', '8/07/2011 23:49:00', '8/07/2011 23:50:00', '8/07/2011 23:51:00', '8/07/2011 23:52:00', '8/07/2011 23:53:00', '8/07/2011 23:54:00', '8/07/2011 23:55:00', '8/07/2011 23:56:00', '8/07/2011 23:57:00', '8/07/2011 23:58:00', '8/07/2011 23:59:00', '9/07/2011 0:00:00', '9/07/2011 0:01:00', '9/07/2011 0:02:00', '9/07/2011 0:03:00', '9/07/2011 0:04:00', '9/07/2011 0:05:00', '9/07/2011 0:06:00', '9/07/2011 0:07:00', '9/07/2011 0:08:00', '9/07/2011 0:09:00', '9/07/2011 0:10:00', '9/07/2011 0:11:00', '9/07/2011 0:12:00', '9/07/2011 0:13:00', '9/07/2011 0:14:00', '9/07/2011 0:15:00', '9/07/2011 0:16:00', '9/07/2011 0:17:00', '9/07/2011 0:18:00', '9/07/2011 0:19:00', '9/07/2011 0:20:00', '9/07/2011 0:21:00', '9/07/2011 0:22:00', '9/07/2011 0:23:00', '9/07/2011 0:24:00', '9/07/2011 0:25:00', '9/07/2011 0:26:00', '9/07/2011 0:27:00', '9/07/2011 0:28:00', '9/07/2011 0:29:00', '9/07/2011 0:30:00', '9/07/2011 0:31:00', '9/07/2011 0:32:00', '9/07/2011 0:33:00', '9/07/2011 0:34:00', '9/07/2011 0:35:00', '9/07/2011 0:36:00', '9/07/2011 0:37:00', '9/07/2011 0:38:00', '9/07/2011 0:39:00', '9/07/2011 0:40:00', '9/07/2011 0:41:00', '9/07/2011 0:42:00', '9/07/2011 0:43:00', '9/07/2011 0:44:00', '9/07/2011 0:45:00', '9/07/2011 0:46:00', '9/07/2011 0:47:00', '9/07/2011 0:48:00', '9/07/2011 0:49:00', '9/07/2011 0:50:00', '9/07/2011 0:51:00', '9/07/2011 
0:52:00', '9/07/2011 0:53:00', '9/07/2011 0:54:00', '9/07/2011 0:55:00', '9/07/2011 0:56:00', '9/07/2011 0:57:00', '9/07/2011 0:58:00', '9/07/2011 0:59:00', '9/07/2011 1:00:00', '9/07/2011 1:01:00', '9/07/2011 1:02:00', '9/07/2011 1:03:00', '9/07/2011 1:04:00', '9/07/2011 1:05:00', '9/07/2011 1:06:00', '9/07/2011 1:07:00', '9/07/2011 1:08:00', '9/07/2011 1:09:00', '9/07/2011 1:10:00', '9/07/2011 1:11:00', '9/07/2011 1:12:00', '9/07/2011 1:13:00', '9/07/2011 1:14:00', '9/07/2011 1:15:00', '9/07/2011 1:16:00', '9/07/2011 1:17:00', '9/07/2011 1:18:00', '9/07/2011 1:19:00', '9/07/2011 1:20:00', '9/07/2011 1:21:00', '9/07/2011 1:22:00', '9/07/2011 1:23:00', '9/07/2011 1:24:00', '9/07/2011 1:25:00', '9/07/2011 1:26:00', '9/07/2011 1:27:00', '9/07/2011 1:28:00', '9/07/2011 1:29:00', '9/07/2011 1:30:00', '9/07/2011 1:31:00', '9/07/2011 1:32:00', '9/07/2011 1:33:00', '9/07/2011 1:34:00', '9/07/2011 1:35:00', '9/07/2011 1:36:00', '9/07/2011 1:37:00', '9/07/2011 1:38:00', '9/07/2011 1:39:00', '9/07/2011 1:40:00', '9/07/2011 1:41:00', '9/07/2011 1:42:00', '9/07/2011 1:43:00', '9/07/2011 1:44:00', '9/07/2011 1:45:00', '9/07/2011 1:46:00', '9/07/2011 1:47:00', '9/07/2011 1:48:00', '9/07/2011 1:49:00', '9/07/2011 1:50:00', '9/07/2011 1:51:00', '9/07/2011 1:52:00', '9/07/2011 1:53:00', '9/07/2011 1:54:00', '9/07/2011 1:55:00', '9/07/2011 1:56:00', '9/07/2011 1:57:00', '9/07/2011 1:58:00', '9/07/2011 1:59:00', '9/07/2011 2:00:00', '9/07/2011 2:01:00', '9/07/2011 2:02:00', '9/07/2011 2:03:00', '9/07/2011 2:04:00', '9/07/2011 2:05:00', '9/07/2011 2:06:00', '9/07/2011 2:07:00', '9/07/2011 2:08:00', '9/07/2011 2:09:00', '9/07/2011 2:10:00', '9/07/2011 2:11:00', '9/07/2011 2:12:00', '9/07/2011 2:13:00', '9/07/2011 2:14:00', '9/07/2011 2:15:00', '9/07/2011 2:16:00', '9/07/2011 2:17:00', '9/07/2011 2:18:00', '9/07/2011 2:19:00', '9/07/2011 2:20:00', '9/07/2011 2:21:00', '9/07/2011 2:22:00', '9/07/2011 2:23:00', '9/07/2011 2:24:00', '9/07/2011 2:25:00', '9/07/2011 2:26:00', '9/07/2011 
2:27:00', '9/07/2011 2:28:00', '9/07/2011 2:29:00', '9/07/2011 2:30:00', '9/07/2011 2:31:00', '9/07/2011 2:32:00', '9/07/2011 2:33:00', '9/07/2011 2:34:00', '9/07/2011 2:35:00', '9/07/2011 2:36:00', '9/07/2011 2:37:00', '9/07/2011 2:38:00', '9/07/2011 2:39:00', '9/07/2011 2:40:00', '9/07/2011 2:41:00', '9/07/2011 2:42:00', '9/07/2011 2:43:00', '9/07/2011 2:44:00', '9/07/2011 2:45:00', '9/07/2011 2:46:00', '9/07/2011 2:47:00', '9/07/2011 2:48:00', '9/07/2011 2:49:00', '9/07/2011 2:50:00', '9/07/2011 2:51:00', '9/07/2011 2:52:00', '9/07/2011 2:53:00', '9/07/2011 2:54:00', '9/07/2011 2:55:00', '9/07/2011 2:56:00', '9/07/2011 2:57:00', '9/07/2011 2:58:00', '9/07/2011 2:59:00', '9/07/2011 3:00:00', '9/07/2011 3:01:00', '9/07/2011 3:02:00', '9/07/2011 3:03:00', '9/07/2011 3:04:00', '9/07/2011 3:05:00', '9/07/2011 3:06:00', '9/07/2011 3:07:00', '9/07/2011 3:08:00', '9/07/2011 3:09:00', '9/07/2011 3:10:00', '9/07/2011 3:11:00', '9/07/2011 3:12:00', '9/07/2011 3:13:00', '9/07/2011 3:14:00', '9/07/2011 3:15:00', '9/07/2011 3:16:00', '9/07/2011 3:17:00', '9/07/2011 3:18:00', '9/07/2011 3:19:00', '9/07/2011 3:20:00', '9/07/2011 3:21:00', '9/07/2011 3:22:00', '9/07/2011 3:23:00', '9/07/2011 3:24:00', '9/07/2011 3:25:00', '9/07/2011 3:26:00', '9/07/2011 3:27:00', '9/07/2011 3:28:00', '9/07/2011 3:29:00', '9/07/2011 3:30:00', '9/07/2011 3:31:00', '9/07/2011 3:32:00', '9/07/2011 3:33:00', '9/07/2011 3:34:00', '9/07/2011 3:35:00', '9/07/2011 3:36:00', '9/07/2011 3:37:00', '9/07/2011 3:38:00', '9/07/2011 3:39:00', '9/07/2011 3:40:00', '9/07/2011 3:41:00', '9/07/2011 3:42:00', '9/07/2011 3:43:00', '9/07/2011 3:44:00', '9/07/2011 3:45:00', '9/07/2011 3:46:00', '9/07/2011 3:47:00', '9/07/2011 3:48:00', '9/07/2011 3:49:00', '9/07/2011 3:50:00', '9/07/2011 3:51:00', '9/07/2011 3:52:00', '9/07/2011 3:53:00', '9/07/2011 3:54:00', '9/07/2011 3:55:00', '9/07/2011 3:56:00', '9/07/2011 3:57:00', '9/07/2011 3:58:00', '9/07/2011 3:59:00', '9/07/2011 4:00:00', '9/07/2011 4:01:00', '9/07/2011 
4:02:00', '9/07/2011 4:03:00', '9/07/2011 4:04:00', '9/07/2011 4:05:00', '9/07/2011 4:06:00', '9/07/2011 4:07:00', '9/07/2011 4:08:00', '9/07/2011 4:09:00', '9/07/2011 4:10:00', '9/07/2011 4:11:00', '9/07/2011 4:12:00', '9/07/2011 4:13:00', '9/07/2011 4:14:00', '9/07/2011 4:15:00', '9/07/2011 4:16:00', '9/07/2011 4:17:00', '9/07/2011 4:18:00', '9/07/2011 4:19:00', '9/07/2011 4:20:00', '9/07/2011 4:21:00', '9/07/2011 4:22:00', '9/07/2011 4:23:00', '9/07/2011 4:24:00', '9/07/2011 4:25:00', '9/07/2011 4:26:00', '9/07/2011 4:27:00', '9/07/2011 4:28:00', '9/07/2011 4:29:00', '9/07/2011 4:30:00', '9/07/2011 4:31:00', '9/07/2011 4:32:00', '9/07/2011 4:33:00', '9/07/2011 4:34:00', '9/07/2011 4:35:00', '9/07/2011 4:36:00', '9/07/2011 4:37:00', '9/07/2011 4:38:00', '9/07/2011 4:39:00', '9/07/2011 4:40:00', '9/07/2011 4:41:00', '9/07/2011 4:42:00', '9/07/2011 4:43:00', '9/07/2011 4:44:00', '9/07/2011 4:45:00', '9/07/2011 4:46:00', '9/07/2011 4:47:00', '9/07/2011 4:48:00', '9/07/2011 4:49:00', '9/07/2011 4:50:00', '9/07/2011 4:51:00', '9/07/2011 4:52:00', '9/07/2011 4:53:00', '9/07/2011 4:54:00', '9/07/2011 4:55:00', '9/07/2011 4:56:00', '9/07/2011 4:57:00', '9/07/2011 4:58:00', '9/07/2011 4:59:00', '9/07/2011 5:00:00', '9/07/2011 5:01:00', '9/07/2011 5:02:00', '9/07/2011 5:03:00', '9/07/2011 5:04:00', '9/07/2011 5:05:00', '9/07/2011 5:06:00', '9/07/2011 5:07:00', '9/07/2011 5:08:00', '9/07/2011 5:09:00', '9/07/2011 5:10:00', '9/07/2011 5:11:00', '9/07/2011 5:12:00', '9/07/2011 5:13:00', '9/07/2011 5:14:00', '9/07/2011 5:15:00', '9/07/2011 5:16:00', '9/07/2011 5:17:00', '9/07/2011 5:18:00', '9/07/2011 5:19:00', '9/07/2011 5:20:00', '9/07/2011 5:21:00', '9/07/2011 5:22:00', '9/07/2011 5:23:00', '9/07/2011 5:24:00', '9/07/2011 5:25:00', '9/07/2011 5:26:00', '9/07/2011 5:27:00', '9/07/2011 5:28:00', '9/07/2011 5:29:00', '9/07/2011 5:30:00', '9/07/2011 5:31:00', '9/07/2011 5:32:00', '9/07/2011 5:33:00', '9/07/2011 5:34:00', '9/07/2011 5:35:00', '9/07/2011 5:36:00', '9/07/2011 
5:37:00', '9/07/2011 5:38:00', '9/07/2011 5:39:00', '9/07/2011 5:40:00', '9/07/2011 5:41:00', '9/07/2011 5:42:00', '9/07/2011 5:43:00', '9/07/2011 5:44:00', '9/07/2011 5:45:00', '9/07/2011 5:46:00', '9/07/2011 5:47:00', '9/07/2011 5:48:00', '9/07/2011 5:49:00', '9/07/2011 5:50:00', '9/07/2011 5:51:00', '9/07/2011 5:52:00', '9/07/2011 5:53:00', '9/07/2011 5:54:00', '9/07/2011 5:55:00', '9/07/2011 5:56:00', '9/07/2011 5:57:00', '9/07/2011 5:58:00', '9/07/2011 5:59:00', '9/07/2011 6:00:00', '9/07/2011 6:01:00', '9/07/2011 6:02:00', '9/07/2011 6:03:00', '9/07/2011 6:04:00', '9/07/2011 6:05:00', '9/07/2011 6:06:00', '9/07/2011 6:07:00', '9/07/2011 6:08:00', '9/07/2011 6:09:00', '9/07/2011 6:10:00', '9/07/2011 6:11:00', '9/07/2011 6:12:00', '9/07/2011 6:13:00', '9/07/2011 6:14:00', '9/07/2011 6:15:00', '9/07/2011 6:16:00', '9/07/2011 6:17:00', '9/07/2011 6:18:00', '9/07/2011 6:19:00', '9/07/2011 6:20:00', '9/07/2011 6:21:00', '9/07/2011 6:22:00', '9/07/2011 6:23:00', '9/07/2011 6:24:00', '9/07/2011 6:25:00', '9/07/2011 6:26:00', '9/07/2011 6:27:00', '9/07/2011 6:28:00', '9/07/2011 6:29:00', '9/07/2011 6:30:00', '9/07/2011 6:31:00', '9/07/2011 6:32:00', '9/07/2011 6:33:00', '9/07/2011 6:34:00', '9/07/2011 6:35:00', '9/07/2011 6:36:00', '9/07/2011 6:37:00', '9/07/2011 6:38:00', '9/07/2011 6:39:00', '9/07/2011 6:40:00', '9/07/2011 6:41:00', '9/07/2011 6:42:00', '9/07/2011 6:43:00', '9/07/2011 6:44:00', '9/07/2011 6:45:00', '9/07/2011 6:46:00', '9/07/2011 6:47:00', '9/07/2011 6:48:00', '9/07/2011 6:49:00', '9/07/2011 6:50:00', '9/07/2011 6:51:00', '9/07/2011 6:52:00', '9/07/2011 6:53:00', '9/07/2011 6:54:00', '9/07/2011 6:55:00', '9/07/2011 6:56:00', '9/07/2011 6:57:00', '9/07/2011 6:58:00', '9/07/2011 6:59:00', '9/07/2011 7:00:00', '9/07/2011 7:01:00', '9/07/2011 7:02:00', '9/07/2011 7:03:00', '9/07/2011 7:04:00', '9/07/2011 7:05:00', '9/07/2011 7:06:00', '9/07/2011 7:07:00', '9/07/2011 7:08:00', '9/07/2011 7:09:00', '9/07/2011 7:10:00', '9/07/2011 7:11:00', '9/07/2011 
7:12:00', '9/07/2011 7:13:00', '9/07/2011 7:14:00', '9/07/2011 7:15:00', '9/07/2011 7:16:00', '9/07/2011 7:17:00', '9/07/2011 7:18:00', '9/07/2011 7:19:00', '9/07/2011 7:20:00', '9/07/2011 7:21:00', '9/07/2011 7:22:00', '9/07/2011 7:23:00', '9/07/2011 7:24:00', '9/07/2011 7:25:00', '9/07/2011 7:26:00', '9/07/2011 7:27:00', '9/07/2011 7:28:00', '9/07/2011 7:29:00', '9/07/2011 7:30:00', '9/07/2011 7:31:00', '9/07/2011 7:32:00', '9/07/2011 7:33:00', '9/07/2011 7:34:00', '9/07/2011 7:35:00', '9/07/2011 7:36:00', '9/07/2011 7:37:00', '9/07/2011 7:38:00', '9/07/2011 7:39:00', '9/07/2011 7:40:00', '9/07/2011 7:41:00', '9/07/2011 7:42:00', '9/07/2011 7:43:00', '9/07/2011 7:44:00', '9/07/2011 7:45:00', '9/07/2011 7:46:00', '9/07/2011 7:47:00', '9/07/2011 7:48:00', '9/07/2011 7:49:00', '9/07/2011 7:50:00', '9/07/2011 7:51:00', '9/07/2011 7:52:00', '9/07/2011 7:53:00', '9/07/2011 7:54:00', '9/07/2011 7:55:00', '9/07/2011 7:56:00', '9/07/2011 7:57:00', '9/07/2011 7:58:00', '9/07/2011 7:59:00', '9/07/2011 8:00:00', '9/07/2011 8:01:00', '9/07/2011 8:02:00', '9/07/2011 8:03:00', '9/07/2011 8:04:00', '9/07/2011 8:05:00', '9/07/2011 8:06:00', '9/07/2011 8:07:00', '9/07/2011 8:08:00', '9/07/2011 8:09:00', '9/07/2011 8:10:00', '9/07/2011 8:11:00', '9/07/2011 8:12:00', '9/07/2011 8:13:00', '9/07/2011 8:14:00', '9/07/2011 8:15:00', '9/07/2011 8:16:00', '9/07/2011 8:17:00', '9/07/2011 8:18:00', '9/07/2011 8:19:00', '9/07/2011 8:20:00', '9/07/2011 8:21:00', '9/07/2011 8:22:00', '9/07/2011 8:23:00', '9/07/2011 8:24:00', '9/07/2011 8:25:00', '9/07/2011 8:26:00', '9/07/2011 8:27:00', '9/07/2011 8:28:00', '9/07/2011 8:29:00', '9/07/2011 8:30:00', '9/07/2011 8:31:00', '9/07/2011 8:32:00', '9/07/2011 8:33:00', '9/07/2011 8:34:00', '9/07/2011 8:35:00', '9/07/2011 8:36:00', '9/07/2011 8:37:00', '9/07/2011 8:38:00', '9/07/2011 8:39:00', '9/07/2011 8:40:00', '9/07/2011 8:41:00', '9/07/2011 8:42:00', '9/07/2011 8:43:00', '9/07/2011 8:44:00', '9/07/2011 8:45:00', '9/07/2011 8:46:00', '9/07/2011 
8:47:00', '9/07/2011 8:48:00', '9/07/2011 8:49:00', '9/07/2011 8:50:00', '9/07/2011 8:51:00', '9/07/2011 8:52:00', '9/07/2011 8:53:00', '9/07/2011 8:54:00', '9/07/2011 8:55:00', '9/07/2011 8:56:00', '9/07/2011 8:57:00', '9/07/2011 8:58:00', '9/07/2011 8:59:00', '9/07/2011 9:00:00', '9/07/2011 9:01:00', '9/07/2011 9:02:00', '9/07/2011 9:03:00', '9/07/2011 9:04:00', '9/07/2011 9:05:00', '9/07/2011 9:06:00', '9/07/2011 9:07:00', '9/07/2011 9:08:00', '9/07/2011 9:09:00', '9/07/2011 9:10:00', '9/07/2011 9:11:00', '9/07/2011 9:12:00', '9/07/2011 9:13:00', '9/07/2011 9:14:00', '9/07/2011 9:15:00', '9/07/2011 9:16:00', '9/07/2011 9:17:00', '9/07/2011 9:18:00', '9/07/2011 9:19:00', '9/07/2011 9:20:00', '9/07/2011 9:21:00', '9/07/2011 9:22:00', '9/07/2011 9:23:00', '9/07/2011 9:24:00', '9/07/2011 9:25:00', '9/07/2011 9:26:00', '9/07/2011 9:27:00', '9/07/2011 9:28:00', '9/07/2011 9:29:00', '9/07/2011 9:30:00', '9/07/2011 9:31:00', '9/07/2011 9:32:00', '9/07/2011 9:33:00', '9/07/2011 9:34:00', '9/07/2011 9:35:00', '9/07/2011 9:36:00', '9/07/2011 9:37:00', '9/07/2011 9:38:00', '9/07/2011 9:39:00', '9/07/2011 9:40:00', '9/07/2011 9:41:00', '9/07/2011 9:42:00', '9/07/2011 9:43:00', '9/07/2011 9:44:00', '9/07/2011 9:45:00', '9/07/2011 9:46:00', '9/07/2011 9:47:00', '9/07/2011 9:48:00', '9/07/2011 9:49:00', '9/07/2011 9:50:00', '9/07/2011 9:51:00', '9/07/2011 9:52:00', '9/07/2011 9:53:00', '9/07/2011 9:54:00', '9/07/2011 9:55:00', '9/07/2011 9:56:00', '9/07/2011 9:57:00', '9/07/2011 9:58:00', '9/07/2011 9:59:00', '9/07/2011 10:00:00', '9/07/2011 10:01:00', '9/07/2011 10:02:00', '9/07/2011 10:03:00', '9/07/2011 10:04:00', '9/07/2011 10:05:00', '9/07/2011 10:06:00', '9/07/2011 10:07:00', '9/07/2011 10:08:00', '9/07/2011 10:09:00', '9/07/2011 10:10:00', '9/07/2011 10:11:00', '9/07/2011 10:12:00', '9/07/2011 10:13:00', '9/07/2011 10:14:00', '9/07/2011 10:15:00', '9/07/2011 10:16:00', '9/07/2011 10:17:00', '9/07/2011 10:18:00', '9/07/2011 10:19:00', '9/07/2011 10:20:00', '9/07/2011 
10:21:00', '9/07/2011 10:22:00', '9/07/2011 10:23:00', '9/07/2011 10:24:00', '9/07/2011 10:25:00', '9/07/2011 10:26:00', '9/07/2011 10:27:00', '9/07/2011 10:28:00', '9/07/2011 10:29:00', '9/07/2011 10:30:00', '9/07/2011 10:31:00', '9/07/2011 10:32:00', '9/07/2011 10:33:00', '9/07/2011 10:34:00', '9/07/2011 10:35:00', '9/07/2011 10:36:00', '9/07/2011 10:37:00', '9/07/2011 10:38:00', '9/07/2011 10:39:00', '9/07/2011 10:40:00', '9/07/2011 10:41:00', '9/07/2011 10:42:00', '9/07/2011 10:43:00', '9/07/2011 10:44:00', '9/07/2011 10:45:00', '9/07/2011 10:46:00', '9/07/2011 10:47:00', '9/07/2011 10:48:00', '9/07/2011 10:49:00', '9/07/2011 10:50:00', '9/07/2011 10:51:00', '9/07/2011 10:52:00', '9/07/2011 10:53:00', '9/07/2011 10:54:00', '9/07/2011 10:55:00', '9/07/2011 10:56:00', '9/07/2011 10:57:00', '9/07/2011 10:58:00', '9/07/2011 10:59:00', '9/07/2011 11:00:00', '9/07/2011 11:01:00', '9/07/2011 11:02:00', '9/07/2011 11:03:00', '9/07/2011 11:04:00', '9/07/2011 11:05:00', '9/07/2011 11:06:00', '9/07/2011 11:07:00', '9/07/2011 11:08:00', '9/07/2011 11:09:00', '9/07/2011 11:10:00', '9/07/2011 11:11:00', '9/07/2011 11:12:00', '9/07/2011 11:13:00', '9/07/2011 11:14:00', '9/07/2011 11:15:00', '9/07/2011 11:16:00', '9/07/2011 11:17:00', '9/07/2011 11:18:00', '9/07/2011 11:19:00', '9/07/2011 11:20:00', '9/07/2011 11:21:00', '9/07/2011 11:22:00', '9/07/2011 11:23:00', '9/07/2011 11:24:00', '9/07/2011 11:25:00', '9/07/2011 11:26:00', '9/07/2011 11:27:00', '9/07/2011 11:28:00', '9/07/2011 11:29:00', '9/07/2011 11:30:00', '9/07/2011 11:31:00', '9/07/2011 11:32:00', '9/07/2011 11:33:00', '9/07/2011 11:34:00', '9/07/2011 11:35:00', '9/07/2011 11:36:00', '9/07/2011 11:37:00', '9/07/2011 11:38:00', '9/07/2011 11:39:00', '9/07/2011 11:40:00', '9/07/2011 11:41:00', '9/07/2011 11:42:00', '9/07/2011 11:43:00', '9/07/2011 11:44:00', '9/07/2011 11:45:00', '9/07/2011 11:46:00', '9/07/2011 11:47:00', '9/07/2011 11:48:00', '9/07/2011 11:49:00', '9/07/2011 11:50:00', '9/07/2011 11:51:00', 
'9/07/2011 11:52:00', '9/07/2011 11:53:00', '9/07/2011 11:54:00', '9/07/2011 11:55:00', '9/07/2011 11:56:00', '9/07/2011 11:57:00', '9/07/2011 11:58:00', '9/07/2011 11:59:00', '9/07/2011 12:00:00', '9/07/2011 12:01:00', '9/07/2011 12:02:00', '9/07/2011 12:03:00', '9/07/2011 12:04:00', '9/07/2011 12:05:00', '9/07/2011 12:06:00', '9/07/2011 12:07:00', '9/07/2011 12:08:00', '9/07/2011 12:09:00', '9/07/2011 12:10:00', '9/07/2011 12:11:00', '9/07/2011 12:12:00', '9/07/2011 12:13:00', '9/07/2011 12:14:00', '9/07/2011 12:15:00', '9/07/2011 12:16:00', '9/07/2011 12:17:00', '9/07/2011 12:18:00', '9/07/2011 12:19:00', '9/07/2011 12:20:00', '9/07/2011 12:21:00', '9/07/2011 12:22:00', '9/07/2011 12:23:00', '9/07/2011 12:24:00', '9/07/2011 12:25:00', '9/07/2011 12:26:00', '9/07/2011 12:27:00', '9/07/2011 12:28:00', '9/07/2011 12:29:00', '9/07/2011 12:30:00', '9/07/2011 12:31:00', '9/07/2011 12:32:00', '9/07/2011 12:33:00', '9/07/2011 12:34:00', '9/07/2011 12:35:00', '9/07/2011 12:36:00', '9/07/2011 12:37:00', '9/07/2011 12:38:00', '9/07/2011 12:39:00', '9/07/2011 12:40:00', '9/07/2011 12:41:00', '9/07/2011 12:42:00', '9/07/2011 12:43:00', '9/07/2011 12:44:00', '9/07/2011 12:45:00', '9/07/2011 12:46:00', '9/07/2011 12:47:00', '9/07/2011 12:48:00', '9/07/2011 12:49:00', '9/07/2011 12:50:00', '9/07/2011 12:51:00', '9/07/2011 12:52:00', '9/07/2011 12:53:00', '9/07/2011 12:54:00', '9/07/2011 12:55:00', '9/07/2011 12:56:00', '9/07/2011 12:57:00', '9/07/2011 12:58:00', '9/07/2011 12:59:00', '9/07/2011 13:00:00', '9/07/2011 13:01:00', '9/07/2011 13:02:00', '9/07/2011 13:03:00', '9/07/2011 13:04:00', '9/07/2011 13:05:00', '9/07/2011 13:06:00', '9/07/2011 13:07:00', '9/07/2011 13:08:00', '9/07/2011 13:09:00', '9/07/2011 13:10:00', '9/07/2011 13:11:00', '9/07/2011 13:12:00', '9/07/2011 13:13:00', '9/07/2011 13:14:00', '9/07/2011 13:15:00', '9/07/2011 13:16:00', '9/07/2011 13:17:00', '9/07/2011 13:18:00', '9/07/2011 13:19:00', '9/07/2011 13:20:00', '9/07/2011 13:21:00', '9/07/2011 
13:22:00', '9/07/2011 13:23:00', '9/07/2011 13:24:00', '9/07/2011 13:25:00', '9/07/2011 13:26:00', '9/07/2011 13:27:00', '9/07/2011 13:28:00', '9/07/2011 13:29:00', '9/07/2011 13:30:00', '9/07/2011 13:31:00', '9/07/2011 13:32:00', '9/07/2011 13:33:00', '9/07/2011 13:34:00', '9/07/2011 13:35:00', '9/07/2011 13:36:00', '9/07/2011 13:37:00', '9/07/2011 13:38:00', '9/07/2011 13:39:00', '9/07/2011 13:40:00', '9/07/2011 13:41:00', '9/07/2011 13:42:00', '9/07/2011 13:43:00', '9/07/2011 13:44:00', '9/07/2011 13:45:00', '9/07/2011 13:46:00', '9/07/2011 13:47:00', '9/07/2011 13:48:00', '9/07/2011 13:49:00', '9/07/2011 13:50:00', '9/07/2011 13:51:00', '9/07/2011 13:52:00', '9/07/2011 13:53:00', '9/07/2011 13:54:00', '9/07/2011 13:55:00', '9/07/2011 13:56:00', '9/07/2011 13:57:00', '9/07/2011 13:58:00', '9/07/2011 13:59:00', '9/07/2011 14:00:00', '9/07/2011 14:01:00', '9/07/2011 14:02:00', '9/07/2011 14:03:00', '9/07/2011 14:04:00', '9/07/2011 14:05:00', '9/07/2011 14:06:00', '9/07/2011 14:07:00', '9/07/2011 14:08:00', '9/07/2011 14:09:00', '9/07/2011 14:10:00', '9/07/2011 14:11:00', '9/07/2011 14:12:00', '9/07/2011 14:13:00', '9/07/2011 14:14:00', '9/07/2011 14:15:00', '9/07/2011 14:16:00', '9/07/2011 14:17:00', '9/07/2011 14:18:00', '9/07/2011 14:19:00', '9/07/2011 14:20:00', '9/07/2011 14:21:00', '9/07/2011 14:22:00', '9/07/2011 14:23:00', '9/07/2011 14:24:00', '9/07/2011 14:25:00', '9/07/2011 14:26:00', '9/07/2011 14:27:00', '9/07/2011 14:28:00', '9/07/2011 14:29:00', '9/07/2011 14:30:00', '9/07/2011 14:31:00', '9/07/2011 14:32:00', '9/07/2011 14:33:00', '9/07/2011 14:34:00', '9/07/2011 14:35:00', '9/07/2011 14:36:00', '9/07/2011 14:37:00', '9/07/2011 14:38:00', '9/07/2011 14:39:00', '9/07/2011 14:40:00', '9/07/2011 14:41:00', '9/07/2011 14:42:00', '9/07/2011 14:43:00', '9/07/2011 14:44:00', '9/07/2011 14:45:00', '9/07/2011 14:46:00', '9/07/2011 14:47:00', '9/07/2011 14:48:00', '9/07/2011 14:49:00', '9/07/2011 14:50:00', '9/07/2011 14:51:00', '9/07/2011 14:52:00', 
'9/07/2011 14:53:00', '9/07/2011 14:54:00', '9/07/2011 14:55:00', '9/07/2011 14:56:00', '9/07/2011 14:57:00', '9/07/2011 14:58:00', '9/07/2011 14:59:00', '9/07/2011 15:00:00', '9/07/2011 15:01:00', '9/07/2011 15:02:00', '9/07/2011 15:03:00', '9/07/2011 15:04:00', '9/07/2011 15:05:00', '9/07/2011 15:06:00', '9/07/2011 15:07:00', '9/07/2011 15:08:00', '9/07/2011 15:09:00', '9/07/2011 15:10:00', '9/07/2011 15:11:00', '9/07/2011 15:12:00', '9/07/2011 15:13:00', '9/07/2011 15:14:00', '9/07/2011 15:15:00', '9/07/2011 15:16:00', '9/07/2011 15:17:00', '9/07/2011 15:18:00', '9/07/2011 15:19:00', '9/07/2011 15:20:00', '9/07/2011 15:21:00', '9/07/2011 15:22:00', '9/07/2011 15:23:00', '9/07/2011 15:24:00', '9/07/2011 15:25:00', '9/07/2011 15:26:00', '9/07/2011 15:27:00', '9/07/2011 15:28:00', '9/07/2011 15:29:00', '9/07/2011 15:30:00', '9/07/2011 15:31:00', '9/07/2011 15:32:00', '9/07/2011 15:33:00', '9/07/2011 15:34:00', '9/07/2011 15:35:00', '9/07/2011 15:36:00', '9/07/2011 15:37:00', '9/07/2011 15:38:00', '9/07/2011 15:39:00', '9/07/2011 15:40:00', '9/07/2011 15:41:00', '9/07/2011 15:42:00', '9/07/2011 15:43:00', '9/07/2011 15:44:00', '9/07/2011 15:45:00', '9/07/2011 15:46:00', '9/07/2011 15:47:00', '9/07/2011 15:48:00', '9/07/2011 15:49:00', '9/07/2011 15:50:00', '9/07/2011 15:51:00', '9/07/2011 15:52:00', '9/07/2011 15:53:00', '9/07/2011 15:54:00', '9/07/2011 15:55:00', '9/07/2011 15:56:00', '9/07/2011 15:57:00', '9/07/2011 15:58:00', '9/07/2011 15:59:00', '9/07/2011 16:00:00', '9/07/2011 16:01:00', '9/07/2011 16:02:00', '9/07/2011 16:03:00', '9/07/2011 16:04:00', '9/07/2011 16:05:00', '9/07/2011 16:06:00', '9/07/2011 16:07:00', '9/07/2011 16:08:00', '9/07/2011 16:09:00', '9/07/2011 16:10:00', '9/07/2011 16:11:00', '9/07/2011 16:12:00', '9/07/2011 16:13:00', '9/07/2011 16:14:00', '9/07/2011 16:15:00', '9/07/2011 16:16:00', '9/07/2011 16:17:00', '9/07/2011 16:18:00', '9/07/2011 16:19:00', '9/07/2011 16:20:00', '9/07/2011 16:21:00', '9/07/2011 16:22:00', '9/07/2011 
16:23:00', '9/07/2011 16:24:00', '9/07/2011 16:25:00', '9/07/2011 16:26:00', '9/07/2011 16:27:00', '9/07/2011 16:28:00', '9/07/2011 16:29:00', '9/07/2011 16:30:00', '9/07/2011 16:31:00', '9/07/2011 16:32:00', '9/07/2011 16:33:00', '9/07/2011 16:34:00', '9/07/2011 16:35:00', '9/07/2011 16:36:00', '9/07/2011 16:37:00', '9/07/2011 16:38:00', '9/07/2011 16:39:00', '9/07/2011 16:40:00', '9/07/2011 16:41:00', '9/07/2011 16:42:00', '9/07/2011 16:43:00', '9/07/2011 16:44:00', '9/07/2011 16:45:00', '9/07/2011 16:46:00', '9/07/2011 16:47:00', '9/07/2011 16:48:00', '9/07/2011 16:49:00', '9/07/2011 16:50:00', '9/07/2011 16:51:00', '9/07/2011 16:52:00', '9/07/2011 16:53:00', '9/07/2011 16:54:00', '9/07/2011 16:55:00', '9/07/2011 16:56:00', '9/07/2011 16:57:00', '9/07/2011 16:58:00', '9/07/2011 16:59:00', '9/07/2011 17:00:00', '9/07/2011 17:01:00', '9/07/2011 17:02:00', '9/07/2011 17:03:00', '9/07/2011 17:04:00', '9/07/2011 17:05:00', '9/07/2011 17:06:00', '9/07/2011 17:07:00', '9/07/2011 17:08:00', '9/07/2011 17:09:00', '9/07/2011 17:10:00', '9/07/2011 17:11:00', '9/07/2011 17:12:00', '9/07/2011 17:13:00', '9/07/2011 17:14:00', '9/07/2011 17:15:00', '9/07/2011 17:16:00', '9/07/2011 17:17:00', '9/07/2011 17:18:00', '9/07/2011 17:19:00', '9/07/2011 17:20:00', '9/07/2011 17:21:00', '9/07/2011 17:22:00', '9/07/2011 17:23:00', '9/07/2011 17:24:00', '9/07/2011 17:25:00', '9/07/2011 17:26:00', '9/07/2011 17:27:00', '9/07/2011 17:28:00', '9/07/2011 17:29:00', '9/07/2011 17:30:00', '9/07/2011 17:31:00', '9/07/2011 17:32:00', '9/07/2011 17:33:00', '9/07/2011 17:34:00', '9/07/2011 17:35:00', '9/07/2011 17:36:00', '9/07/2011 17:37:00', '9/07/2011 17:38:00', '9/07/2011 17:39:00', '9/07/2011 17:40:00', '9/07/2011 17:41:00', '9/07/2011 17:42:00', '9/07/2011 17:43:00', '9/07/2011 17:44:00', '9/07/2011 17:45:00', '9/07/2011 17:46:00', '9/07/2011 17:47:00', '9/07/2011 17:48:00', '9/07/2011 17:49:00', '9/07/2011 17:50:00', '9/07/2011 17:51:00', '9/07/2011 17:52:00', '9/07/2011 17:53:00', 
'9/07/2011 17:54:00', '9/07/2011 17:55:00', '9/07/2011 17:56:00', '9/07/2011 17:57:00', '9/07/2011 17:58:00', '9/07/2011 17:59:00', '9/07/2011 18:00:00', '9/07/2011 18:01:00', '9/07/2011 18:02:00', '9/07/2011 18:03:00', '9/07/2011 18:04:00', '9/07/2011 18:05:00', '9/07/2011 18:06:00', '9/07/2011 18:07:00', '9/07/2011 18:08:00', '9/07/2011 18:09:00', '9/07/2011 18:10:00', '9/07/2011 18:11:00', '9/07/2011 18:12:00', '9/07/2011 18:13:00', '9/07/2011 18:14:00', '9/07/2011 18:15:00', '9/07/2011 18:16:00', '9/07/2011 18:17:00', '9/07/2011 18:18:00', '9/07/2011 18:19:00', '9/07/2011 18:20:00', '9/07/2011 18:21:00', '9/07/2011 18:22:00', '9/07/2011 18:23:00', '9/07/2011 18:24:00', '9/07/2011 18:25:00', '9/07/2011 18:26:00', '9/07/2011 18:27:00', '9/07/2011 18:28:00', '9/07/2011 18:29:00', '9/07/2011 18:30:00', '9/07/2011 18:31:00', '9/07/2011 18:32:00', '9/07/2011 18:33:00', '9/07/2011 18:34:00', '9/07/2011 18:35:00', '9/07/2011 18:36:00', '9/07/2011 18:37:00', '9/07/2011 18:38:00', '9/07/2011 18:39:00', '9/07/2011 18:40:00', '9/07/2011 18:41:00', '9/07/2011 18:42:00', '9/07/2011 18:43:00', '9/07/2011 18:44:00', '9/07/2011 18:45:00', '9/07/2011 18:46:00', '9/07/2011 18:47:00', '9/07/2011 18:48:00', '9/07/2011 18:49:00', '9/07/2011 18:50:00', '9/07/2011 18:51:00', '9/07/2011 18:52:00', '9/07/2011 18:53:00', '9/07/2011 18:54:00', '9/07/2011 18:55:00', '9/07/2011 18:56:00', '9/07/2011 18:57:00', '9/07/2011 18:58:00', '9/07/2011 18:59:00', '9/07/2011 19:00:00', '9/07/2011 19:01:00', '9/07/2011 19:02:00', '9/07/2011 19:03:00', '9/07/2011 19:04:00', '9/07/2011 19:05:00', '9/07/2011 19:06:00', '9/07/2011 19:07:00', '9/07/2011 19:08:00', '9/07/2011 19:09:00', '9/07/2011 19:10:00', '9/07/2011 19:11:00', '9/07/2011 19:12:00', '9/07/2011 19:13:00', '9/07/2011 19:14:00', '9/07/2011 19:15:00', '9/07/2011 19:16:00', '9/07/2011 19:17:00', '9/07/2011 19:18:00', '9/07/2011 19:19:00', '9/07/2011 19:20:00', '9/07/2011 19:21:00', '9/07/2011 19:22:00', '9/07/2011 19:23:00', '9/07/2011 
19:24:00', '9/07/2011 19:25:00', '9/07/2011 19:26:00', '9/07/2011 19:27:00', '9/07/2011 19:28:00', '9/07/2011 19:29:00', '9/07/2011 19:30:00', '9/07/2011 19:31:00', '9/07/2011 19:32:00', '9/07/2011 19:33:00', '9/07/2011 19:34:00', '9/07/2011 19:35:00', '9/07/2011 19:36:00', '9/07/2011 19:37:00', '9/07/2011 19:38:00', '9/07/2011 19:39:00', '9/07/2011 19:40:00', '9/07/2011 19:41:00', '9/07/2011 19:42:00', '9/07/2011 19:43:00', '9/07/2011 19:44:00', '9/07/2011 19:45:00', '9/07/2011 19:46:00', '9/07/2011 19:47:00', '9/07/2011 19:48:00', '9/07/2011 19:49:00', '9/07/2011 19:50:00', '9/07/2011 19:51:00', '9/07/2011 19:52:00', '9/07/2011 19:53:00', '9/07/2011 19:54:00', '9/07/2011 19:55:00', '9/07/2011 19:56:00', '9/07/2011 19:57:00', '9/07/2011 19:58:00', '9/07/2011 19:59:00', '9/07/2011 20:00:00', '9/07/2011 20:01:00', '9/07/2011 20:02:00', '9/07/2011 20:03:00', '9/07/2011 20:04:00', '9/07/2011 20:05:00', '9/07/2011 20:06:00', '9/07/2011 20:07:00', '9/07/2011 20:08:00', '9/07/2011 20:09:00', '9/07/2011 20:10:00', '9/07/2011 20:11:00', '9/07/2011 20:12:00', '9/07/2011 20:13:00', '9/07/2011 20:14:00', '9/07/2011 20:15:00', '9/07/2011 20:16:00', '9/07/2011 20:17:00', '9/07/2011 20:18:00', '9/07/2011 20:19:00', '9/07/2011 20:20:00', '9/07/2011 20:21:00', '9/07/2011 20:22:00', '9/07/2011 20:23:00', '9/07/2011 20:24:00', '9/07/2011 20:25:00', '9/07/2011 20:26:00', '9/07/2011 20:27:00', '9/07/2011 20:28:00', '9/07/2011 20:29:00', '9/07/2011 20:30:00', '9/07/2011 20:31:00', '9/07/2011 20:32:00', '9/07/2011 20:33:00', '9/07/2011 20:34:00', '9/07/2011 20:35:00', '9/07/2011 20:36:00', '9/07/2011 20:37:00', '9/07/2011 20:38:00', '9/07/2011 20:39:00', '9/07/2011 20:40:00', '9/07/2011 20:41:00', '9/07/2011 20:42:00', '9/07/2011 20:43:00', '9/07/2011 20:44:00', '9/07/2011 20:45:00', '9/07/2011 20:46:00', '9/07/2011 20:47:00', '9/07/2011 20:48:00', '9/07/2011 20:49:00', '9/07/2011 20:50:00', '9/07/2011 20:51:00', '9/07/2011 20:52:00', '9/07/2011 20:53:00', '9/07/2011 20:54:00', 
'9/07/2011 20:55:00', '9/07/2011 20:56:00', '9/07/2011 20:57:00', '9/07/2011 20:58:00', '9/07/2011 20:59:00', '9/07/2011 21:00:00', '9/07/2011 21:01:00', '9/07/2011 21:02:00', '9/07/2011 21:03:00', '9/07/2011 21:04:00', '9/07/2011 21:05:00', '9/07/2011 21:06:00', '9/07/2011 21:07:00', '9/07/2011 21:08:00', '9/07/2011 21:09:00', '9/07/2011 21:10:00', '9/07/2011 21:11:00', '9/07/2011 21:12:00', '9/07/2011 21:13:00', '9/07/2011 21:14:00', '9/07/2011 21:15:00', '9/07/2011 21:16:00', '9/07/2011 21:17:00', '9/07/2011 21:18:00', '9/07/2011 21:19:00', '9/07/2011 21:20:00', '9/07/2011 21:21:00', '9/07/2011 21:22:00', '9/07/2011 21:23:00', '9/07/2011 21:24:00', '9/07/2011 21:25:00', '9/07/2011 21:26:00', '9/07/2011 21:27:00', '9/07/2011 21:28:00', '9/07/2011 21:29:00', '9/07/2011 21:30:00', '9/07/2011 21:31:00', '9/07/2011 21:32:00', '9/07/2011 21:33:00', '9/07/2011 21:34:00', '9/07/2011 21:35:00', '9/07/2011 21:36:00', '9/07/2011 21:37:00', '9/07/2011 21:38:00', '9/07/2011 21:39:00', '9/07/2011 21:40:00', '9/07/2011 21:41:00', '9/07/2011 21:42:00', '9/07/2011 21:43:00', '9/07/2011 21:44:00', '9/07/2011 21:45:00', '9/07/2011 21:46:00', '9/07/2011 21:47:00', '9/07/2011 21:48:00', '9/07/2011 21:49:00', '9/07/2011 21:50:00', '9/07/2011 21:51:00', '9/07/2011 21:52:00', '9/07/2011 21:53:00', '9/07/2011 21:54:00', '9/07/2011 21:55:00', '9/07/2011 21:56:00', '9/07/2011 21:57:00', '9/07/2011 21:58:00', '9/07/2011 21:59:00', '9/07/2011 22:00:00', '9/07/2011 22:01:00', '9/07/2011 22:02:00', '9/07/2011 22:03:00', '9/07/2011 22:04:00', '9/07/2011 22:05:00', '9/07/2011 22:06:00', '9/07/2011 22:07:00', '9/07/2011 22:08:00', '9/07/2011 22:09:00', '9/07/2011 22:10:00', '9/07/2011 22:11:00', '9/07/2011 22:12:00', '9/07/2011 22:13:00', '9/07/2011 22:14:00', '9/07/2011 22:15:00', '9/07/2011 22:16:00', '9/07/2011 22:17:00', '9/07/2011 22:18:00', '9/07/2011 22:19:00', '9/07/2011 22:20:00', '9/07/2011 22:21:00', '9/07/2011 22:22:00', '9/07/2011 22:23:00', '9/07/2011 22:24:00', '9/07/2011 
22:25:00', '9/07/2011 22:26:00', '9/07/2011 22:27:00', '9/07/2011 22:28:00', '9/07/2011 22:29:00', '9/07/2011 22:30:00', '9/07/2011 22:31:00', '9/07/2011 22:32:00', '9/07/2011 22:33:00', '9/07/2011 22:34:00', '9/07/2011 22:35:00', '9/07/2011 22:36:00', '9/07/2011 22:37:00', '9/07/2011 22:38:00', '9/07/2011 22:39:00', '9/07/2011 22:40:00', '9/07/2011 22:41:00', '9/07/2011 22:42:00', '9/07/2011 22:43:00', '9/07/2011 22:44:00', '9/07/2011 22:45:00', '9/07/2011 22:46:00', '9/07/2011 22:47:00', '9/07/2011 22:48:00', '9/07/2011 22:49:00', '9/07/2011 22:50:00', '9/07/2011 22:51:00', '9/07/2011 22:52:00', '9/07/2011 22:53:00', '9/07/2011 22:54:00', '9/07/2011 22:55:00', '9/07/2011 22:56:00', '9/07/2011 22:57:00', '9/07/2011 22:58:00', '9/07/2011 22:59:00', '9/07/2011 23:00:00', '9/07/2011 23:01:00', '9/07/2011 23:02:00', '9/07/2011 23:03:00', '9/07/2011 23:04:00', '9/07/2011 23:05:00', '9/07/2011 23:06:00', '9/07/2011 23:07:00', '9/07/2011 23:08:00', '9/07/2011 23:09:00', '9/07/2011 23:10:00', '9/07/2011 23:11:00', '9/07/2011 23:12:00', '9/07/2011 23:13:00', '9/07/2011 23:14:00', '9/07/2011 23:15:00', '9/07/2011 23:16:00', '9/07/2011 23:17:00', '9/07/2011 23:18:00', '9/07/2011 23:19:00', '9/07/2011 23:20:00', '9/07/2011 23:21:00', '9/07/2011 23:22:00', '9/07/2011 23:23:00', '9/07/2011 23:24:00', '9/07/2011 23:25:00', '9/07/2011 23:26:00', '9/07/2011 23:27:00', '9/07/2011 23:28:00', '9/07/2011 23:29:00', '9/07/2011 23:30:00', '9/07/2011 23:31:00', '9/07/2011 23:32:00', '9/07/2011 23:33:00', '9/07/2011 23:34:00', '9/07/2011 23:35:00', '9/07/2011 23:36:00', '9/07/2011 23:37:00', '9/07/2011 23:38:00', '9/07/2011 23:39:00', '9/07/2011 23:40:00', '9/07/2011 23:41:00', '9/07/2011 23:42:00', '9/07/2011 23:43:00', '9/07/2011 23:44:00', '9/07/2011 23:45:00', '9/07/2011 23:46:00', '9/07/2011 23:47:00', '9/07/2011 23:48:00', '9/07/2011 23:49:00', '9/07/2011 23:50:00', '9/07/2011 23:51:00', '9/07/2011 23:52:00', '9/07/2011 23:53:00', '9/07/2011 23:54:00', '9/07/2011 23:55:00', 
'9/07/2011 23:56:00', '9/07/2011 23:57:00', '9/07/2011 23:58:00', '9/07/2011 23:59:00', '10/07/2011 0:00:00', '10/07/2011 0:01:00', '10/07/2011 0:02:00', '10/07/2011 0:03:00', '10/07/2011 0:04:00', '10/07/2011 0:05:00', '10/07/2011 0:06:00', '10/07/2011 0:07:00', '10/07/2011 0:08:00', '10/07/2011 0:09:00', '10/07/2011 0:10:00', '10/07/2011 0:11:00', '10/07/2011 0:12:00', '10/07/2011 0:13:00', '10/07/2011 0:14:00', '10/07/2011 0:15:00', '10/07/2011 0:16:00', '10/07/2011 0:17:00', '10/07/2011 0:18:00', '10/07/2011 0:19:00', '10/07/2011 0:20:00', '10/07/2011 0:21:00', '10/07/2011 0:22:00', '10/07/2011 0:23:00', '10/07/2011 0:24:00', '10/07/2011 0:25:00', '10/07/2011 0:26:00', '10/07/2011 0:27:00', '10/07/2011 0:28:00', '10/07/2011 0:29:00', '10/07/2011 0:30:00', '10/07/2011 0:31:00', '10/07/2011 0:32:00', '10/07/2011 0:33:00', '10/07/2011 0:34:00', '10/07/2011 0:35:00', '10/07/2011 0:36:00', '10/07/2011 0:37:00', '10/07/2011 0:38:00', '10/07/2011 0:39:00', '10/07/2011 0:40:00', '10/07/2011 0:41:00', '10/07/2011 0:42:00', '10/07/2011 0:43:00', '10/07/2011 0:44:00', '10/07/2011 0:45:00', '10/07/2011 0:46:00', '10/07/2011 0:47:00', '10/07/2011 0:48:00', '10/07/2011 0:49:00', '10/07/2011 0:50:00', '10/07/2011 0:51:00', '10/07/2011 0:52:00', '10/07/2011 0:53:00', '10/07/2011 0:54:00', '10/07/2011 0:55:00', '10/07/2011 0:56:00', '10/07/2011 0:57:00', '10/07/2011 0:58:00', '10/07/2011 0:59:00', '10/07/2011 1:00:00', '10/07/2011 1:01:00', '10/07/2011 1:02:00', '10/07/2011 1:03:00', '10/07/2011 1:04:00', '10/07/2011 1:05:00', '10/07/2011 1:06:00', '10/07/2011 1:07:00', '10/07/2011 1:08:00', '10/07/2011 1:09:00', '10/07/2011 1:10:00', '10/07/2011 1:11:00', '10/07/2011 1:12:00', '10/07/2011 1:13:00', '10/07/2011 1:14:00', '10/07/2011 1:15:00', '10/07/2011 1:16:00', '10/07/2011 1:17:00', '10/07/2011 1:18:00', '10/07/2011 1:19:00', '10/07/2011 1:20:00', '10/07/2011 1:21:00', '10/07/2011 1:22:00', '10/07/2011 1:23:00', '10/07/2011 1:24:00', '10/07/2011 1:25:00', '10/07/2011 
1:26:00', '10/07/2011 1:27:00', '10/07/2011 1:28:00', '10/07/2011 1:29:00', '10/07/2011 1:30:00', '10/07/2011 1:31:00', '10/07/2011 1:32:00', '10/07/2011 1:33:00', '10/07/2011 1:34:00', '10/07/2011 1:35:00', '10/07/2011 1:36:00', '10/07/2011 1:37:00', '10/07/2011 1:38:00', '10/07/2011 1:39:00', '10/07/2011 1:40:00', '10/07/2011 1:41:00', '10/07/2011 1:42:00', '10/07/2011 1:43:00', '10/07/2011 1:44:00', '10/07/2011 1:45:00', '10/07/2011 1:46:00', '10/07/2011 1:47:00', '10/07/2011 1:48:00', '10/07/2011 1:49:00', '10/07/2011 1:50:00', '10/07/2011 1:51:00', '10/07/2011 1:52:00', '10/07/2011 1:53:00', '10/07/2011 1:54:00', '10/07/2011 1:55:00', '10/07/2011 1:56:00', '10/07/2011 1:57:00', '10/07/2011 1:58:00', '10/07/2011 1:59:00', '10/07/2011 2:00:00', '10/07/2011 2:01:00', '10/07/2011 2:02:00', '10/07/2011 2:03:00', '10/07/2011 2:04:00', '10/07/2011 2:05:00', '10/07/2011 2:06:00', '10/07/2011 2:07:00', '10/07/2011 2:08:00', '10/07/2011 2:09:00', '10/07/2011 2:10:00', '10/07/2011 2:11:00', '10/07/2011 2:12:00', '10/07/2011 2:13:00', '10/07/2011 2:14:00', '10/07/2011 2:15:00', '10/07/2011 2:16:00', '10/07/2011 2:17:00', '10/07/2011 2:18:00', '10/07/2011 2:19:00', '10/07/2011 2:20:00', '10/07/2011 2:21:00', '10/07/2011 2:22:00', '10/07/2011 2:23:00', '10/07/2011 2:24:00', '10/07/2011 2:25:00', '10/07/2011 2:26:00', '10/07/2011 2:27:00', '10/07/2011 2:28:00', '10/07/2011 2:29:00', '10/07/2011 2:30:00', '10/07/2011 2:31:00', '10/07/2011 2:32:00', '10/07/2011 2:33:00', '10/07/2011 2:34:00', '10/07/2011 2:35:00', '10/07/2011 2:36:00', '10/07/2011 2:37:00', '10/07/2011 2:38:00', '10/07/2011 2:39:00', '10/07/2011 2:40:00', '10/07/2011 2:41:00', '10/07/2011 2:42:00', '10/07/2011 2:43:00', '10/07/2011 2:44:00', '10/07/2011 2:45:00', '10/07/2011 2:46:00', '10/07/2011 2:47:00', '10/07/2011 2:48:00', '10/07/2011 2:49:00', '10/07/2011 2:50:00', '10/07/2011 2:51:00', '10/07/2011 2:52:00', '10/07/2011 2:53:00', '10/07/2011 2:54:00', '10/07/2011 2:55:00', '10/07/2011 2:56:00', 
'10/07/2011 2:57:00', '10/07/2011 2:58:00', '10/07/2011 2:59:00', '10/07/2011 3:00:00', '10/07/2011 3:01:00', '10/07/2011 3:02:00', '10/07/2011 3:03:00', '10/07/2011 3:04:00', '10/07/2011 3:05:00', '10/07/2011 3:06:00', '10/07/2011 3:07:00', '10/07/2011 3:08:00', '10/07/2011 3:09:00', '10/07/2011 3:10:00', '10/07/2011 3:11:00', '10/07/2011 3:12:00', '10/07/2011 3:13:00', '10/07/2011 3:14:00', '10/07/2011 3:15:00', '10/07/2011 3:16:00', '10/07/2011 3:17:00', '10/07/2011 3:18:00', '10/07/2011 3:19:00', '10/07/2011 3:20:00', '10/07/2011 3:21:00', '10/07/2011 3:22:00', '10/07/2011 3:23:00', '10/07/2011 3:24:00', '10/07/2011 3:25:00', '10/07/2011 3:26:00', '10/07/2011 3:27:00', '10/07/2011 3:28:00', '10/07/2011 3:29:00', '10/07/2011 3:30:00', '10/07/2011 3:31:00', '10/07/2011 3:32:00', '10/07/2011 3:33:00', '10/07/2011 3:34:00', '10/07/2011 3:35:00', '10/07/2011 3:36:00', '10/07/2011 3:37:00', '10/07/2011 3:38:00', '10/07/2011 3:39:00', '10/07/2011 3:40:00', '10/07/2011 3:41:00', '10/07/2011 3:42:00', '10/07/2011 3:43:00', '10/07/2011 3:44:00', '10/07/2011 3:45:00', '10/07/2011 3:46:00', '10/07/2011 3:47:00', '10/07/2011 3:48:00', '10/07/2011 3:49:00', '10/07/2011 3:50:00', '10/07/2011 3:51:00', '10/07/2011 3:52:00', '10/07/2011 3:53:00', '10/07/2011 3:54:00', '10/07/2011 3:55:00', '10/07/2011 3:56:00', '10/07/2011 3:57:00', '10/07/2011 3:58:00', '10/07/2011 3:59:00', '10/07/2011 4:00:00', '10/07/2011 4:01:00', '10/07/2011 4:02:00', '10/07/2011 4:03:00', '10/07/2011 4:04:00', '10/07/2011 4:05:00', '10/07/2011 4:06:00', '10/07/2011 4:07:00', '10/07/2011 4:08:00', '10/07/2011 4:09:00', '10/07/2011 4:10:00', '10/07/2011 4:11:00', '10/07/2011 4:12:00', '10/07/2011 4:13:00', '10/07/2011 4:14:00', '10/07/2011 4:15:00', '10/07/2011 4:16:00', '10/07/2011 4:17:00', '10/07/2011 4:18:00', '10/07/2011 4:19:00', '10/07/2011 4:20:00', '10/07/2011 4:21:00', '10/07/2011 4:22:00', '10/07/2011 4:23:00', '10/07/2011 4:24:00', '10/07/2011 4:25:00', '10/07/2011 4:26:00', '10/07/2011 
4:27:00', '10/07/2011 4:28:00', '10/07/2011 4:29:00', '10/07/2011 4:30:00', '10/07/2011 4:31:00', '10/07/2011 4:32:00', '10/07/2011 4:33:00', '10/07/2011 4:34:00', '10/07/2011 4:35:00', '10/07/2011 4:36:00', '10/07/2011 4:37:00', '10/07/2011 4:38:00', '10/07/2011 4:39:00', '10/07/2011 4:40:00', '10/07/2011 4:41:00', '10/07/2011 4:42:00', '10/07/2011 4:43:00', '10/07/2011 4:44:00', '10/07/2011 4:45:00', '10/07/2011 4:46:00', '10/07/2011 4:47:00', '10/07/2011 4:48:00', '10/07/2011 4:49:00', '10/07/2011 4:50:00', '10/07/2011 4:51:00', '10/07/2011 4:52:00', '10/07/2011 4:53:00', '10/07/2011 4:54:00', '10/07/2011 4:55:00', '10/07/2011 4:56:00', '10/07/2011 4:57:00', '10/07/2011 4:58:00', '10/07/2011 4:59:00', '10/07/2011 5:00:00', '10/07/2011 5:01:00', '10/07/2011 5:02:00', '10/07/2011 5:03:00', '10/07/2011 5:04:00', '10/07/2011 5:05:00', '10/07/2011 5:06:00', '10/07/2011 5:07:00', '10/07/2011 5:08:00', '10/07/2011 5:09:00', '10/07/2011 5:10:00', '10/07/2011 5:11:00', '10/07/2011 5:12:00', '10/07/2011 5:13:00', '10/07/2011 5:14:00', '10/07/2011 5:15:00', '10/07/2011 5:16:00', '10/07/2011 5:17:00', '10/07/2011 5:18:00', '10/07/2011 5:19:00', '10/07/2011 5:20:00', '10/07/2011 5:21:00', '10/07/2011 5:22:00', '10/07/2011 5:23:00', '10/07/2011 5:24:00', '10/07/2011 5:25:00', '10/07/2011 5:26:00', '10/07/2011 5:27:00', '10/07/2011 5:28:00', '10/07/2011 5:29:00', '10/07/2011 5:30:00', '10/07/2011 5:31:00', '10/07/2011 5:32:00', '10/07/2011 5:33:00', '10/07/2011 5:34:00', '10/07/2011 5:35:00', '10/07/2011 5:36:00', '10/07/2011 5:37:00', '10/07/2011 5:38:00', '10/07/2011 5:39:00', '10/07/2011 5:40:00', '10/07/2011 5:41:00', '10/07/2011 5:42:00', '10/07/2011 5:43:00', '10/07/2011 5:44:00', '10/07/2011 5:45:00', '10/07/2011 5:46:00', '10/07/2011 5:47:00', '10/07/2011 5:48:00', '10/07/2011 5:49:00', '10/07/2011 5:50:00', '10/07/2011 5:51:00', '10/07/2011 5:52:00', '10/07/2011 5:53:00', '10/07/2011 5:54:00', '10/07/2011 5:55:00', '10/07/2011 5:56:00', '10/07/2011 5:57:00', 
'10/07/2011 5:58:00', '10/07/2011 5:59:00', '10/07/2011 6:00:00', '10/07/2011 6:01:00', '10/07/2011 6:02:00', '10/07/2011 6:03:00', '10/07/2011 6:04:00', '10/07/2011 6:05:00', '10/07/2011 6:06:00', '10/07/2011 6:07:00', '10/07/2011 6:08:00', '10/07/2011 6:09:00', '10/07/2011 6:10:00', '10/07/2011 6:11:00', '10/07/2011 6:12:00', '10/07/2011 6:13:00', '10/07/2011 6:14:00', '10/07/2011 6:15:00', '10/07/2011 6:16:00', '10/07/2011 6:17:00', '10/07/2011 6:18:00', '10/07/2011 6:19:00', '10/07/2011 6:20:00', '10/07/2011 6:21:00', '10/07/2011 6:22:00', '10/07/2011 6:23:00', '10/07/2011 6:24:00', '10/07/2011 6:25:00', '10/07/2011 6:26:00', '10/07/2011 6:27:00', '10/07/2011 6:28:00', '10/07/2011 6:29:00', '10/07/2011 6:30:00', '10/07/2011 6:31:00', '10/07/2011 6:32:00', '10/07/2011 6:33:00', '10/07/2011 6:34:00', '10/07/2011 6:35:00', '10/07/2011 6:36:00', '10/07/2011 6:37:00', '10/07/2011 6:38:00', '10/07/2011 6:39:00', '10/07/2011 6:40:00', '10/07/2011 6:41:00', '10/07/2011 6:42:00', '10/07/2011 6:43:00', '10/07/2011 6:44:00', '10/07/2011 6:45:00', '10/07/2011 6:46:00', '10/07/2011 6:47:00', '10/07/2011 6:48:00', '10/07/2011 6:49:00', '10/07/2011 6:50:00', '10/07/2011 6:51:00', '10/07/2011 6:52:00', '10/07/2011 6:53:00', '10/07/2011 6:54:00', '10/07/2011 6:55:00', '10/07/2011 6:56:00', '10/07/2011 6:57:00', '10/07/2011 6:58:00', '10/07/2011 6:59:00', '10/07/2011 7:00:00', '10/07/2011 7:01:00', '10/07/2011 7:02:00', '10/07/2011 7:03:00', '10/07/2011 7:04:00', '10/07/2011 7:05:00', '10/07/2011 7:06:00', '10/07/2011 7:07:00', '10/07/2011 7:08:00', '10/07/2011 7:09:00', '10/07/2011 7:10:00', '10/07/2011 7:11:00', '10/07/2011 7:12:00', '10/07/2011 7:13:00', '10/07/2011 7:14:00', '10/07/2011 7:15:00', '10/07/2011 7:16:00', '10/07/2011 7:17:00', '10/07/2011 7:18:00', '10/07/2011 7:19:00', '10/07/2011 7:20:00', '10/07/2011 7:21:00', '10/07/2011 7:22:00', '10/07/2011 7:23:00', '10/07/2011 7:24:00', '10/07/2011 7:25:00', '10/07/2011 7:26:00', '10/07/2011 7:27:00', '10/07/2011 
7:28:00', '10/07/2011 7:29:00', '10/07/2011 7:30:00', '10/07/2011 7:31:00', '10/07/2011 7:32:00', '10/07/2011 7:33:00', '10/07/2011 7:34:00', '10/07/2011 7:35:00', '10/07/2011 7:36:00', '10/07/2011 7:37:00', '10/07/2011 7:38:00', '10/07/2011 7:39:00', '10/07/2011 7:40:00', '10/07/2011 7:41:00', '10/07/2011 7:42:00', '10/07/2011 7:43:00', '10/07/2011 7:44:00', '10/07/2011 7:45:00', '10/07/2011 7:46:00', '10/07/2011 7:47:00', '10/07/2011 7:48:00', '10/07/2011 7:49:00', '10/07/2011 7:50:00', '10/07/2011 7:51:00', '10/07/2011 7:52:00', '10/07/2011 7:53:00', '10/07/2011 7:54:00', '10/07/2011 7:55:00', '10/07/2011 7:56:00', '10/07/2011 7:57:00', '10/07/2011 7:58:00', '10/07/2011 7:59:00', '10/07/2011 8:00:00', '10/07/2011 8:01:00', '10/07/2011 8:02:00', '10/07/2011 8:03:00', '10/07/2011 8:04:00', '10/07/2011 8:05:00', '10/07/2011 8:06:00', '10/07/2011 8:07:00', '10/07/2011 8:08:00', '10/07/2011 8:09:00', '10/07/2011 8:10:00', '10/07/2011 8:11:00', '10/07/2011 8:12:00', '10/07/2011 8:13:00', '10/07/2011 8:14:00', '10/07/2011 8:15:00', '10/07/2011 8:16:00', '10/07/2011 8:17:00', '10/07/2011 8:18:00', '10/07/2011 8:19:00', '10/07/2011 8:20:00', '10/07/2011 8:21:00', '10/07/2011 8:22:00', '10/07/2011 8:23:00', '10/07/2011 8:24:00', '10/07/2011 8:25:00', '10/07/2011 8:26:00', '10/07/2011 8:27:00', '10/07/2011 8:28:00', '10/07/2011 8:29:00', '10/07/2011 8:30:00', '10/07/2011 8:31:00', '10/07/2011 8:32:00', '10/07/2011 8:33:00', '10/07/2011 8:34:00', '10/07/2011 8:35:00', '10/07/2011 8:36:00', '10/07/2011 8:37:00', '10/07/2011 8:38:00', '10/07/2011 8:39:00', '10/07/2011 8:40:00', '10/07/2011 8:41:00', '10/07/2011 8:42:00', '10/07/2011 8:43:00', '10/07/2011 8:44:00', '10/07/2011 8:45:00', '10/07/2011 8:46:00', '10/07/2011 8:47:00', '10/07/2011 8:48:00', '10/07/2011 8:49:00', '10/07/2011 8:50:00', '10/07/2011 8:51:00', '10/07/2011 8:52:00', '10/07/2011 8:53:00', '10/07/2011 8:54:00', '10/07/2011 8:55:00', '10/07/2011 8:56:00', '10/07/2011 8:57:00', '10/07/2011 8:58:00', 
'10/07/2011 8:59:00', '10/07/2011 9:00:00', '10/07/2011 9:01:00', '10/07/2011 9:02:00', '10/07/2011 9:03:00', '10/07/2011 9:04:00', '10/07/2011 9:05:00', '10/07/2011 9:06:00', '10/07/2011 9:07:00', '10/07/2011 9:08:00', '10/07/2011 9:09:00', '10/07/2011 9:10:00', '10/07/2011 9:11:00', '10/07/2011 9:12:00', '10/07/2011 9:13:00', '10/07/2011 9:14:00', '10/07/2011 9:15:00', '10/07/2011 9:16:00', '10/07/2011 9:17:00', '10/07/2011 9:18:00', '10/07/2011 9:19:00', '10/07/2011 9:20:00', '10/07/2011 9:21:00', '10/07/2011 9:22:00', '10/07/2011 9:23:00', '10/07/2011 9:24:00', '10/07/2011 9:25:00', '10/07/2011 9:26:00', '10/07/2011 9:27:00', '10/07/2011 9:28:00', '10/07/2011 9:29:00', '10/07/2011 9:30:00', '10/07/2011 9:31:00', '10/07/2011 9:32:00', '10/07/2011 9:33:00', '10/07/2011 9:34:00', '10/07/2011 9:35:00', '10/07/2011 9:36:00', '10/07/2011 9:37:00', '10/07/2011 9:38:00', '10/07/2011 9:39:00', '10/07/2011 9:40:00', '10/07/2011 9:41:00', '10/07/2011 9:42:00', '10/07/2011 9:43:00', '10/07/2011 9:44:00', '10/07/2011 9:45:00', '10/07/2011 9:46:00', '10/07/2011 9:47:00', '10/07/2011 9:48:00', '10/07/2011 9:49:00', '10/07/2011 9:50:00', '10/07/2011 9:51:00', '10/07/2011 9:52:00', '10/07/2011 9:53:00', '10/07/2011 9:54:00', '10/07/2011 9:55:00', '10/07/2011 9:56:00', '10/07/2011 9:57:00', '10/07/2011 9:58:00', '10/07/2011 9:59:00', '10/07/2011 10:00:00', '10/07/2011 10:01:00', '10/07/2011 10:02:00', '10/07/2011 10:03:00', '10/07/2011 10:04:00', '10/07/2011 10:05:00', '10/07/2011 10:06:00', '10/07/2011 10:07:00', '10/07/2011 10:08:00', '10/07/2011 10:09:00', '10/07/2011 10:10:00', '10/07/2011 10:11:00', '10/07/2011 10:12:00', '10/07/2011 10:13:00', '10/07/2011 10:14:00', '10/07/2011 10:15:00', '10/07/2011 10:16:00', '10/07/2011 10:17:00', '10/07/2011 10:18:00', '10/07/2011 10:19:00', '10/07/2011 10:20:00', '10/07/2011 10:21:00', '10/07/2011 10:22:00', '10/07/2011 10:23:00', '10/07/2011 10:24:00', '10/07/2011 10:25:00', '10/07/2011 10:26:00', '10/07/2011 10:27:00', '10/07/2011 
10:28:00', '10/07/2011 10:29:00', '10/07/2011 10:30:00', '10/07/2011 10:31:00', '10/07/2011 10:32:00', '10/07/2011 10:33:00', '10/07/2011 10:34:00', '10/07/2011 10:35:00', '10/07/2011 10:36:00', '10/07/2011 10:37:00', '10/07/2011 10:38:00', '10/07/2011 10:39:00', '10/07/2011 10:40:00', '10/07/2011 10:41:00', '10/07/2011 10:42:00', '10/07/2011 10:43:00', '10/07/2011 10:44:00', '10/07/2011 10:45:00', '10/07/2011 10:46:00', '10/07/2011 10:47:00', '10/07/2011 10:48:00', '10/07/2011 10:49:00', '10/07/2011 10:50:00', '10/07/2011 10:51:00', '10/07/2011 10:52:00', '10/07/2011 10:53:00', '10/07/2011 10:54:00', '10/07/2011 10:55:00', '10/07/2011 10:56:00', '10/07/2011 10:57:00', '10/07/2011 10:58:00', '10/07/2011 10:59:00', '10/07/2011 11:00:00', '10/07/2011 11:01:00', '10/07/2011 11:02:00', '10/07/2011 11:03:00', '10/07/2011 11:04:00', '10/07/2011 11:05:00', '10/07/2011 11:06:00', '10/07/2011 11:07:00', '10/07/2011 11:08:00', '10/07/2011 11:09:00', '10/07/2011 11:10:00', '10/07/2011 11:11:00', '10/07/2011 11:12:00', '10/07/2011 11:13:00', '10/07/2011 11:14:00', '10/07/2011 11:15:00', '10/07/2011 11:16:00', '10/07/2011 11:17:00', '10/07/2011 11:18:00', '10/07/2011 11:19:00', '10/07/2011 11:20:00', '10/07/2011 11:21:00', '10/07/2011 11:22:00', '10/07/2011 11:23:00', '10/07/2011 11:24:00', '10/07/2011 11:25:00', '10/07/2011 11:26:00', '10/07/2011 11:27:00', '10/07/2011 11:28:00', '10/07/2011 11:29:00', '10/07/2011 11:30:00', '10/07/2011 11:31:00', '10/07/2011 11:32:00', '10/07/2011 11:33:00', '10/07/2011 11:34:00', '10/07/2011 11:35:00', '10/07/2011 11:36:00', '10/07/2011 11:37:00', '10/07/2011 11:38:00', '10/07/2011 11:39:00', '10/07/2011 11:40:00', '10/07/2011 11:41:00', '10/07/2011 11:42:00', '10/07/2011 11:43:00', '10/07/2011 11:44:00', '10/07/2011 11:45:00', '10/07/2011 11:46:00', '10/07/2011 11:47:00', '10/07/2011 11:48:00', '10/07/2011 11:49:00', '10/07/2011 11:50:00', '10/07/2011 11:51:00', '10/07/2011 11:52:00', '10/07/2011 11:53:00', '10/07/2011 11:54:00', 
'10/07/2011 11:55:00', '10/07/2011 11:56:00', '10/07/2011 11:57:00', '10/07/2011 11:58:00', '10/07/2011 11:59:00', '10/07/2011 12:00:00', '10/07/2011 12:01:00', '10/07/2011 12:02:00', '10/07/2011 12:03:00', '10/07/2011 12:04:00', '10/07/2011 12:05:00', '10/07/2011 12:06:00', '10/07/2011 12:07:00', '10/07/2011 12:08:00', '10/07/2011 12:09:00', '10/07/2011 12:10:00', '10/07/2011 12:11:00', '10/07/2011 12:12:00', '10/07/2011 12:13:00', '10/07/2011 12:14:00', '10/07/2011 12:15:00', '10/07/2011 12:16:00', '10/07/2011 12:17:00', '10/07/2011 12:18:00', '10/07/2011 12:19:00', '10/07/2011 12:20:00', '10/07/2011 12:21:00', '10/07/2011 12:22:00', '10/07/2011 12:23:00', '10/07/2011 12:24:00', '10/07/2011 12:25:00', '10/07/2011 12:26:00', '10/07/2011 12:27:00', '10/07/2011 12:28:00', '10/07/2011 12:29:00', '10/07/2011 12:30:00', '10/07/2011 12:31:00', '10/07/2011 12:32:00', '10/07/2011 12:33:00', '10/07/2011 12:34:00', '10/07/2011 12:35:00', '10/07/2011 12:36:00', '10/07/2011 12:37:00', '10/07/2011 12:38:00', '10/07/2011 12:39:00', '10/07/2011 12:40:00', '10/07/2011 12:41:00', '10/07/2011 12:42:00', '10/07/2011 12:43:00', '10/07/2011 12:44:00', '10/07/2011 12:45:00', '10/07/2011 12:46:00', '10/07/2011 12:47:00', '10/07/2011 12:48:00', '10/07/2011 12:49:00', '10/07/2011 12:50:00', '10/07/2011 12:51:00', '10/07/2011 12:52:00', '10/07/2011 12:53:00', '10/07/2011 12:54:00', '10/07/2011 12:55:00', '10/07/2011 12:56:00', '10/07/2011 12:57:00', '10/07/2011 12:58:00', '10/07/2011 12:59:00', '10/07/2011 13:00:00', '10/07/2011 13:01:00', '10/07/2011 13:02:00', '10/07/2011 13:03:00', '10/07/2011 13:04:00', '10/07/2011 13:05:00', '10/07/2011 13:06:00', '10/07/2011 13:07:00', '10/07/2011 13:08:00', '10/07/2011 13:09:00', '10/07/2011 13:10:00', '10/07/2011 13:11:00', '10/07/2011 13:12:00', '10/07/2011 13:13:00', '10/07/2011 13:14:00', '10/07/2011 13:15:00', '10/07/2011 13:16:00', '10/07/2011 13:17:00', '10/07/2011 13:18:00', '10/07/2011 13:19:00', '10/07/2011 13:20:00', '10/07/2011 
13:21:00', '10/07/2011 13:22:00', '10/07/2011 13:23:00', '10/07/2011 13:24:00', '10/07/2011 13:25:00', '10/07/2011 13:26:00', '10/07/2011 13:27:00', '10/07/2011 13:28:00', '10/07/2011 13:29:00', '10/07/2011 13:30:00', '10/07/2011 13:31:00', '10/07/2011 13:32:00', '10/07/2011 13:33:00', '10/07/2011 13:34:00', '10/07/2011 13:35:00', '10/07/2011 13:36:00', '10/07/2011 13:37:00', '10/07/2011 13:38:00', '10/07/2011 13:39:00', '10/07/2011 13:40:00', '10/07/2011 13:41:00', '10/07/2011 13:42:00', '10/07/2011 13:43:00', '10/07/2011 13:44:00', '10/07/2011 13:45:00', '10/07/2011 13:46:00', '10/07/2011 13:47:00', '10/07/2011 13:48:00', '10/07/2011 13:49:00', '10/07/2011 13:50:00', '10/07/2011 13:51:00', '10/07/2011 13:52:00', '10/07/2011 13:53:00', '10/07/2011 13:54:00', '10/07/2011 13:55:00', '10/07/2011 13:56:00', '10/07/2011 13:57:00', '10/07/2011 13:58:00', '10/07/2011 13:59:00', '10/07/2011 14:00:00', '10/07/2011 14:01:00', '10/07/2011 14:02:00', '10/07/2011 14:03:00', '10/07/2011 14:04:00', '10/07/2011 14:05:00', '10/07/2011 14:06:00', '10/07/2011 14:07:00', '10/07/2011 14:08:00', '10/07/2011 14:09:00', '10/07/2011 14:10:00', '10/07/2011 14:11:00', '10/07/2011 14:12:00', '10/07/2011 14:13:00', '10/07/2011 14:14:00', '10/07/2011 14:15:00', '10/07/2011 14:16:00', '10/07/2011 14:17:00', '10/07/2011 14:18:00', '10/07/2011 14:19:00', '10/07/2011 14:20:00', '10/07/2011 14:21:00', '10/07/2011 14:22:00', '10/07/2011 14:23:00', '10/07/2011 14:24:00', '10/07/2011 14:25:00', '10/07/2011 14:26:00', '10/07/2011 14:27:00', '10/07/2011 14:28:00', '10/07/2011 14:29:00', '10/07/2011 14:30:00', '10/07/2011 14:31:00', '10/07/2011 14:32:00', '10/07/2011 14:33:00', '10/07/2011 14:34:00', '10/07/2011 14:35:00', '10/07/2011 14:36:00', '10/07/2011 14:37:00', '10/07/2011 14:38:00', '10/07/2011 14:39:00', '10/07/2011 14:40:00', '10/07/2011 14:41:00', '10/07/2011 14:42:00', '10/07/2011 14:43:00', '10/07/2011 14:44:00', '10/07/2011 14:45:00', '10/07/2011 14:46:00', '10/07/2011 14:47:00', 
'10/07/2011 14:48:00', '10/07/2011 14:49:00', '10/07/2011 14:50:00', '10/07/2011 14:51:00', '10/07/2011 14:52:00', '10/07/2011 14:53:00', '10/07/2011 14:54:00', '10/07/2011 14:55:00', '10/07/2011 14:56:00', '10/07/2011 14:57:00', '10/07/2011 14:58:00', '10/07/2011 14:59:00', '10/07/2011 15:00:00', '10/07/2011 15:01:00', '10/07/2011 15:02:00', '10/07/2011 15:03:00', '10/07/2011 15:04:00', '10/07/2011 15:05:00', '10/07/2011 15:06:00', '10/07/2011 15:07:00', '10/07/2011 15:08:00', '10/07/2011 15:09:00', '10/07/2011 15:10:00', '10/07/2011 15:11:00', '10/07/2011 15:12:00', '10/07/2011 15:13:00', '10/07/2011 15:14:00', '10/07/2011 15:15:00', '10/07/2011 15:16:00', '10/07/2011 15:17:00', '10/07/2011 15:18:00', '10/07/2011 15:19:00', '10/07/2011 15:20:00', '10/07/2011 15:21:00', '10/07/2011 15:22:00', '10/07/2011 15:23:00', '10/07/2011 15:24:00', '10/07/2011 15:25:00', '10/07/2011 15:26:00', '10/07/2011 15:27:00', '10/07/2011 15:28:00', '10/07/2011 15:29:00', '10/07/2011 15:30:00', '10/07/2011 15:31:00', '10/07/2011 15:32:00', '10/07/2011 15:33:00', '10/07/2011 15:34:00', '10/07/2011 15:35:00', '10/07/2011 15:36:00', '10/07/2011 15:37:00', '10/07/2011 15:38:00', '10/07/2011 15:39:00', '10/07/2011 15:40:00', '10/07/2011 15:41:00', '10/07/2011 15:42:00', '10/07/2011 15:43:00', '10/07/2011 15:44:00', '10/07/2011 15:45:00', '10/07/2011 15:46:00', '10/07/2011 15:47:00', '10/07/2011 15:48:00', '10/07/2011 15:49:00', '10/07/2011 15:50:00', '10/07/2011 15:51:00', '10/07/2011 15:52:00', '10/07/2011 15:53:00', '10/07/2011 15:54:00', '10/07/2011 15:55:00', '10/07/2011 15:56:00', '10/07/2011 15:57:00', '10/07/2011 15:58:00', '10/07/2011 15:59:00', '10/07/2011 16:00:00', '10/07/2011 16:01:00', '10/07/2011 16:02:00', '10/07/2011 16:03:00', '10/07/2011 16:04:00', '10/07/2011 16:05:00', '10/07/2011 16:06:00', '10/07/2011 16:07:00', '10/07/2011 16:08:00', '10/07/2011 16:09:00', '10/07/2011 16:10:00', '10/07/2011 16:11:00', '10/07/2011 16:12:00', '10/07/2011 16:13:00', '10/07/2011 
16:14:00', '10/07/2011 16:15:00', '10/07/2011 16:16:00', '10/07/2011 16:17:00', '10/07/2011 16:18:00', '10/07/2011 16:19:00', '10/07/2011 16:20:00', '10/07/2011 16:21:00', '10/07/2011 16:22:00', '10/07/2011 16:23:00', '10/07/2011 16:24:00', '10/07/2011 16:25:00', '10/07/2011 16:26:00', '10/07/2011 16:27:00', '10/07/2011 16:28:00', '10/07/2011 16:29:00', '10/07/2011 16:30:00', '10/07/2011 16:31:00', '10/07/2011 16:32:00', '10/07/2011 16:33:00', '10/07/2011 16:34:00', '10/07/2011 16:35:00', '10/07/2011 16:36:00', '10/07/2011 16:37:00', '10/07/2011 16:38:00', '10/07/2011 16:39:00', '10/07/2011 16:40:00', '10/07/2011 16:41:00', '10/07/2011 16:42:00', '10/07/2011 16:43:00', '10/07/2011 16:44:00', '10/07/2011 16:45:00', '10/07/2011 16:46:00', '10/07/2011 16:47:00', '10/07/2011 16:48:00', '10/07/2011 16:49:00', '10/07/2011 16:50:00', '10/07/2011 16:51:00', '10/07/2011 16:52:00', '10/07/2011 16:53:00', '10/07/2011 16:54:00', '10/07/2011 16:55:00', '10/07/2011 16:56:00', '10/07/2011 16:57:00', '10/07/2011 16:58:00', '10/07/2011 16:59:00', '10/07/2011 17:00:00', '10/07/2011 17:01:00', '10/07/2011 17:02:00', '10/07/2011 17:03:00', '10/07/2011 17:04:00', '10/07/2011 17:05:00', '10/07/2011 17:06:00', '10/07/2011 17:07:00', '10/07/2011 17:08:00', '10/07/2011 17:09:00', '10/07/2011 17:10:00', '10/07/2011 17:11:00', '10/07/2011 17:12:00', '10/07/2011 17:13:00', '10/07/2011 17:14:00', '10/07/2011 17:15:00', '10/07/2011 17:16:00', '10/07/2011 17:17:00', '10/07/2011 17:18:00', '10/07/2011 17:19:00', '10/07/2011 17:20:00', '10/07/2011 17:21:00', '10/07/2011 17:22:00', '10/07/2011 17:23:00', '10/07/2011 17:24:00', '10/07/2011 17:25:00', '10/07/2011 17:26:00', '10/07/2011 17:27:00', '10/07/2011 17:28:00', '10/07/2011 17:29:00', '10/07/2011 17:30:00', '10/07/2011 17:31:00', '10/07/2011 17:32:00', '10/07/2011 17:33:00', '10/07/2011 17:34:00', '10/07/2011 17:35:00', '10/07/2011 17:36:00', '10/07/2011 17:37:00', '10/07/2011 17:38:00', '10/07/2011 17:39:00', '10/07/2011 17:40:00', 
'10/07/2011 17:41:00', '10/07/2011 17:42:00', '10/07/2011 17:43:00', '10/07/2011 17:44:00', '10/07/2011 17:45:00', '10/07/2011 17:46:00', '10/07/2011 17:47:00', '10/07/2011 17:48:00', '10/07/2011 17:49:00', '10/07/2011 17:50:00', '10/07/2011 17:51:00', '10/07/2011 17:52:00', '10/07/2011 17:53:00', '10/07/2011 17:54:00', '10/07/2011 17:55:00', '10/07/2011 17:56:00', '10/07/2011 17:57:00', '10/07/2011 17:58:00', '10/07/2011 17:59:00', '10/07/2011 18:00:00', '10/07/2011 18:01:00', '10/07/2011 18:02:00', '10/07/2011 18:03:00', '10/07/2011 18:04:00', '10/07/2011 18:05:00', '10/07/2011 18:06:00', '10/07/2011 18:07:00', '10/07/2011 18:08:00', '10/07/2011 18:09:00', '10/07/2011 18:10:00', '10/07/2011 18:11:00', '10/07/2011 18:12:00', '10/07/2011 18:13:00', '10/07/2011 18:14:00', '10/07/2011 18:15:00', '10/07/2011 18:16:00', '10/07/2011 18:17:00', '10/07/2011 18:18:00', '10/07/2011 18:19:00', '10/07/2011 18:20:00', '10/07/2011 18:21:00', '10/07/2011 18:22:00', '10/07/2011 18:23:00', '10/07/2011 18:24:00', '10/07/2011 18:25:00', '10/07/2011 18:26:00', '10/07/2011 18:27:00', '10/07/2011 18:28:00', '10/07/2011 18:29:00', '10/07/2011 18:30:00', '10/07/2011 18:31:00', '10/07/2011 18:32:00', '10/07/2011 18:33:00', '10/07/2011 18:34:00', '10/07/2011 18:35:00', '10/07/2011 18:36:00', '10/07/2011 18:37:00', '10/07/2011 18:38:00', '10/07/2011 18:39:00', '10/07/2011 18:40:00', '10/07/2011 18:41:00', '10/07/2011 18:42:00', '10/07/2011 18:43:00', '10/07/2011 18:44:00', '10/07/2011 18:45:00', '10/07/2011 18:46:00', '10/07/2011 18:47:00', '10/07/2011 18:48:00', '10/07/2011 18:49:00', '10/07/2011 18:50:00', '10/07/2011 18:51:00', '10/07/2011 18:52:00', '10/07/2011 18:53:00', '10/07/2011 18:54:00', '10/07/2011 18:55:00', '10/07/2011 18:56:00', '10/07/2011 18:57:00', '10/07/2011 18:58:00', '10/07/2011 18:59:00', '10/07/2011 19:00:00', '10/07/2011 19:01:00', '10/07/2011 19:02:00', '10/07/2011 19:03:00', '10/07/2011 19:04:00', '10/07/2011 19:05:00', '10/07/2011 19:06:00', '10/07/2011 
19:07:00', '10/07/2011 19:08:00', '10/07/2011 19:09:00', '10/07/2011 19:10:00', '10/07/2011 19:11:00', '10/07/2011 19:12:00', '10/07/2011 19:13:00', '10/07/2011 19:14:00', '10/07/2011 19:15:00', '10/07/2011 19:16:00', '10/07/2011 19:17:00', '10/07/2011 19:18:00', '10/07/2011 19:19:00', '10/07/2011 19:20:00', '10/07/2011 19:21:00', '10/07/2011 19:22:00', '10/07/2011 19:23:00', '10/07/2011 19:24:00', '10/07/2011 19:25:00', '10/07/2011 19:26:00', '10/07/2011 19:27:00', '10/07/2011 19:28:00', '10/07/2011 19:29:00', '10/07/2011 19:30:00', '10/07/2011 19:31:00', '10/07/2011 19:32:00', '10/07/2011 19:33:00', '10/07/2011 19:34:00', '10/07/2011 19:35:00', '10/07/2011 19:36:00', '10/07/2011 19:37:00', '10/07/2011 19:38:00', '10/07/2011 19:39:00', '10/07/2011 19:40:00', '10/07/2011 19:41:00', '10/07/2011 19:42:00', '10/07/2011 19:43:00', '10/07/2011 19:44:00', '10/07/2011 19:45:00', '10/07/2011 19:46:00', '10/07/2011 19:47:00', '10/07/2011 19:48:00', '10/07/2011 19:49:00', '10/07/2011 19:50:00', '10/07/2011 19:51:00', '10/07/2011 19:52:00', '10/07/2011 19:53:00', '10/07/2011 19:54:00', '10/07/2011 19:55:00', '10/07/2011 19:56:00', '10/07/2011 19:57:00', '10/07/2011 19:58:00', '10/07/2011 19:59:00', '10/07/2011 20:00:00', '10/07/2011 20:01:00', '10/07/2011 20:02:00', '10/07/2011 20:03:00', '10/07/2011 20:04:00', '10/07/2011 20:05:00', '10/07/2011 20:06:00', '10/07/2011 20:07:00', '10/07/2011 20:08:00', '10/07/2011 20:09:00', '10/07/2011 20:10:00', '10/07/2011 20:11:00', '10/07/2011 20:12:00', '10/07/2011 20:13:00', '10/07/2011 20:14:00', '10/07/2011 20:15:00', '10/07/2011 20:16:00', '10/07/2011 20:17:00', '10/07/2011 20:18:00', '10/07/2011 20:19:00', '10/07/2011 20:20:00', '10/07/2011 20:21:00', '10/07/2011 20:22:00', '10/07/2011 20:23:00', '10/07/2011 20:24:00', '10/07/2011 20:25:00', '10/07/2011 20:26:00', '10/07/2011 20:27:00', '10/07/2011 20:28:00', '10/07/2011 20:29:00', '10/07/2011 20:30:00', '10/07/2011 20:31:00', '10/07/2011 20:32:00', '10/07/2011 20:33:00', 
'10/07/2011 20:34:00', '10/07/2011 20:35:00', '10/07/2011 20:36:00', '10/07/2011 20:37:00', '10/07/2011 20:38:00', '10/07/2011 20:39:00', '10/07/2011 20:40:00', '10/07/2011 20:41:00', '10/07/2011 20:42:00', '10/07/2011 20:43:00', '10/07/2011 20:44:00', '10/07/2011 20:45:00', '10/07/2011 20:46:00', '10/07/2011 20:47:00', '10/07/2011 20:48:00', '10/07/2011 20:49:00', '10/07/2011 20:50:00', '10/07/2011 20:51:00', '10/07/2011 20:52:00', '10/07/2011 20:53:00', '10/07/2011 20:54:00', '10/07/2011 20:55:00', '10/07/2011 20:56:00', '10/07/2011 20:57:00', '10/07/2011 20:58:00', '10/07/2011 20:59:00', '10/07/2011 21:00:00', '10/07/2011 21:01:00', '10/07/2011 21:02:00', '10/07/2011 21:03:00', '10/07/2011 21:04:00', '10/07/2011 21:05:00', '10/07/2011 21:06:00', '10/07/2011 21:07:00', '10/07/2011 21:08:00', '10/07/2011 21:09:00', '10/07/2011 21:10:00', '10/07/2011 21:11:00', '10/07/2011 21:12:00', '10/07/2011 21:13:00', '10/07/2011 21:14:00', '10/07/2011 21:15:00', '10/07/2011 21:16:00', '10/07/2011 21:17:00', '10/07/2011 21:18:00', '10/07/2011 21:19:00', '10/07/2011 21:20:00', '10/07/2011 21:21:00', '10/07/2011 21:22:00', '10/07/2011 21:23:00', '10/07/2011 21:24:00', '10/07/2011 21:25:00', '10/07/2011 21:26:00', '10/07/2011 21:27:00', '10/07/2011 21:28:00', '10/07/2011 21:29:00', '10/07/2011 21:30:00', '10/07/2011 21:31:00', '10/07/2011 21:32:00', '10/07/2011 21:33:00', '10/07/2011 21:34:00', '10/07/2011 21:35:00', '10/07/2011 21:36:00', '10/07/2011 21:37:00', '10/07/2011 21:38:00', '10/07/2011 21:39:00', '10/07/2011 21:40:00', '10/07/2011 21:41:00', '10/07/2011 21:42:00', '10/07/2011 21:43:00', '10/07/2011 21:44:00', '10/07/2011 21:45:00', '10/07/2011 21:46:00', '10/07/2011 21:47:00', '10/07/2011 21:48:00', '10/07/2011 21:49:00', '10/07/2011 21:50:00', '10/07/2011 21:51:00', '10/07/2011 21:52:00', '10/07/2011 21:53:00', '10/07/2011 21:54:00', '10/07/2011 21:55:00', '10/07/2011 21:56:00', '10/07/2011 21:57:00', '10/07/2011 21:58:00', '10/07/2011 21:59:00', '10/07/2011 
22:00:00', '10/07/2011 22:01:00', '10/07/2011 22:02:00', '10/07/2011 22:03:00', '10/07/2011 22:04:00', '10/07/2011 22:05:00', '10/07/2011 22:06:00', '10/07/2011 22:07:00', '10/07/2011 22:08:00', '10/07/2011 22:09:00', '10/07/2011 22:10:00', '10/07/2011 22:11:00', '10/07/2011 22:12:00', '10/07/2011 22:13:00', '10/07/2011 22:14:00', '10/07/2011 22:15:00', '10/07/2011 22:16:00', '10/07/2011 22:17:00', '10/07/2011 22:18:00', '10/07/2011 22:19:00', '10/07/2011 22:20:00', '10/07/2011 22:21:00', '10/07/2011 22:22:00', '10/07/2011 22:23:00', '10/07/2011 22:24:00', '10/07/2011 22:25:00', '10/07/2011 22:26:00', '10/07/2011 22:27:00', '10/07/2011 22:28:00', '10/07/2011 22:29:00', '10/07/2011 22:30:00', '10/07/2011 22:31:00', '10/07/2011 22:32:00', '10/07/2011 22:33:00', '10/07/2011 22:34:00', '10/07/2011 22:35:00', '10/07/2011 22:36:00', '10/07/2011 22:37:00', '10/07/2011 22:38:00', '10/07/2011 22:39:00', '10/07/2011 22:40:00', '10/07/2011 22:41:00', '10/07/2011 22:42:00', '10/07/2011 22:43:00', '10/07/2011 22:44:00', '10/07/2011 22:45:00', '10/07/2011 22:46:00', '10/07/2011 22:47:00', '10/07/2011 22:48:00', '10/07/2011 22:49:00', '10/07/2011 22:50:00', '10/07/2011 22:51:00', '10/07/2011 22:52:00', '10/07/2011 22:53:00', '10/07/2011 22:54:00', '10/07/2011 22:55:00', '10/07/2011 22:56:00', '10/07/2011 22:57:00', '10/07/2011 22:58:00', '10/07/2011 22:59:00', '10/07/2011 23:00:00', '10/07/2011 23:01:00', '10/07/2011 23:02:00', '10/07/2011 23:03:00', '10/07/2011 23:04:00', '10/07/2011 23:05:00', '10/07/2011 23:06:00', '10/07/2011 23:07:00', '10/07/2011 23:08:00', '10/07/2011 23:09:00', '10/07/2011 23:10:00', '10/07/2011 23:11:00', '10/07/2011 23:12:00', '10/07/2011 23:13:00', '10/07/2011 23:14:00', '10/07/2011 23:15:00', '10/07/2011 23:16:00', '10/07/2011 23:17:00', '10/07/2011 23:18:00', '10/07/2011 23:19:00', '10/07/2011 23:20:00', '10/07/2011 23:21:00', '10/07/2011 23:22:00', '10/07/2011 23:23:00', '10/07/2011 23:24:00', '10/07/2011 23:25:00', '10/07/2011 23:26:00', 
'10/07/2011 23:27:00', '10/07/2011 23:28:00', '10/07/2011 23:29:00', '10/07/2011 23:30:00', '10/07/2011 23:31:00', '10/07/2011 23:32:00', '10/07/2011 23:33:00', '10/07/2011 23:34:00', '10/07/2011 23:35:00', '10/07/2011 23:36:00', '10/07/2011 23:37:00', '10/07/2011 23:38:00', '10/07/2011 23:39:00', '10/07/2011 23:40:00', '10/07/2011 23:41:00', '10/07/2011 23:42:00', '10/07/2011 23:43:00', '10/07/2011 23:44:00', '10/07/2011 23:45:00', '10/07/2011 23:46:00', '10/07/2011 23:47:00', '10/07/2011 23:48:00', '10/07/2011 23:49:00', '10/07/2011 23:50:00', '10/07/2011 23:51:00', '10/07/2011 23:52:00', '10/07/2011 23:53:00', '10/07/2011 23:54:00', '10/07/2011 23:55:00', '10/07/2011 23:56:00', '10/07/2011 23:57:00', '10/07/2011 23:58:00', '10/07/2011 23:59:00', '11/07/2011 0:00:00', '11/07/2011 0:01:00', '11/07/2011 0:02:00', '11/07/2011 0:03:00', '11/07/2011 0:04:00', '11/07/2011 0:05:00', '11/07/2011 0:06:00', '11/07/2011 0:07:00', '11/07/2011 0:08:00', '11/07/2011 0:09:00', '11/07/2011 0:10:00', '11/07/2011 0:11:00', '11/07/2011 0:12:00', '11/07/2011 0:13:00', '11/07/2011 0:14:00', '11/07/2011 0:15:00', '11/07/2011 0:16:00', '11/07/2011 0:17:00', '11/07/2011 0:18:00', '11/07/2011 0:19:00', '11/07/2011 0:20:00', '11/07/2011 0:21:00', '11/07/2011 0:22:00', '11/07/2011 0:23:00', '11/07/2011 0:24:00', '11/07/2011 0:25:00', '11/07/2011 0:26:00', '11/07/2011 0:27:00', '11/07/2011 0:28:00', '11/07/2011 0:29:00', '11/07/2011 0:30:00', '11/07/2011 0:31:00', '11/07/2011 0:32:00', '11/07/2011 0:33:00', '11/07/2011 0:34:00', '11/07/2011 0:35:00', '11/07/2011 0:36:00', '11/07/2011 0:37:00', '11/07/2011 0:38:00', '11/07/2011 0:39:00', '11/07/2011 0:40:00', '11/07/2011 0:41:00', '11/07/2011 0:42:00', '11/07/2011 0:43:00', '11/07/2011 0:44:00', '11/07/2011 0:45:00', '11/07/2011 0:46:00', '11/07/2011 0:47:00', '11/07/2011 0:48:00', '11/07/2011 0:49:00', '11/07/2011 0:50:00', '11/07/2011 0:51:00', '11/07/2011 0:52:00', '11/07/2011 0:53:00', '11/07/2011 0:54:00', '11/07/2011 0:55:00', 
'11/07/2011 0:56:00', '11/07/2011 0:57:00', '11/07/2011 0:58:00', '11/07/2011 0:59:00', '11/07/2011 1:00:00', '11/07/2011 1:01:00', '11/07/2011 1:02:00', '11/07/2011 1:03:00', '11/07/2011 1:04:00', '11/07/2011 1:05:00', '11/07/2011 1:06:00', '11/07/2011 1:07:00', '11/07/2011 1:08:00', '11/07/2011 1:09:00', '11/07/2011 1:10:00', '11/07/2011 1:11:00', '11/07/2011 1:12:00', '11/07/2011 1:13:00', '11/07/2011 1:14:00', '11/07/2011 1:15:00', '11/07/2011 1:16:00', '11/07/2011 1:17:00', '11/07/2011 1:18:00', '11/07/2011 1:19:00', '11/07/2011 1:20:00', '11/07/2011 1:21:00', '11/07/2011 1:22:00', '11/07/2011 1:23:00', '11/07/2011 1:24:00', '11/07/2011 1:25:00', '11/07/2011 1:26:00', '11/07/2011 1:27:00', '11/07/2011 1:28:00', '11/07/2011 1:29:00', '11/07/2011 1:30:00', '11/07/2011 1:31:00', '11/07/2011 1:32:00', '11/07/2011 1:33:00', '11/07/2011 1:34:00', '11/07/2011 1:35:00', '11/07/2011 1:36:00', '11/07/2011 1:37:00', '11/07/2011 1:38:00', '11/07/2011 1:39:00', '11/07/2011 1:40:00', '11/07/2011 1:41:00', '11/07/2011 1:42:00', '11/07/2011 1:43:00', '11/07/2011 1:44:00', '11/07/2011 1:45:00', '11/07/2011 1:46:00', '11/07/2011 1:47:00', '11/07/2011 1:48:00', '11/07/2011 1:49:00', '11/07/2011 1:50:00', '11/07/2011 1:51:00', '11/07/2011 1:52:00', '11/07/2011 1:53:00', '11/07/2011 1:54:00', '11/07/2011 1:55:00', '11/07/2011 1:56:00', '11/07/2011 1:57:00', '11/07/2011 1:58:00', '11/07/2011 1:59:00', '11/07/2011 2:00:00', '11/07/2011 2:01:00', '11/07/2011 2:02:00', '11/07/2011 2:03:00', '11/07/2011 2:04:00', '11/07/2011 2:05:00', '11/07/2011 2:06:00', '11/07/2011 2:07:00', '11/07/2011 2:08:00', '11/07/2011 2:09:00', '11/07/2011 2:10:00', '11/07/2011 2:11:00', '11/07/2011 2:12:00', '11/07/2011 2:13:00', '11/07/2011 2:14:00', '11/07/2011 2:15:00', '11/07/2011 2:16:00', '11/07/2011 2:17:00', '11/07/2011 2:18:00', '11/07/2011 2:19:00', '11/07/2011 2:20:00', '11/07/2011 2:21:00', '11/07/2011 2:22:00', '11/07/2011 2:23:00', '11/07/2011 2:24:00', '11/07/2011 2:25:00', '11/07/2011 
2:26:00', '11/07/2011 2:27:00', '11/07/2011 2:28:00', '11/07/2011 2:29:00', '11/07/2011 2:30:00', '11/07/2011 2:31:00', '11/07/2011 2:32:00', '11/07/2011 2:33:00', '11/07/2011 2:34:00', '11/07/2011 2:35:00', '11/07/2011 2:36:00', '11/07/2011 2:37:00', '11/07/2011 2:38:00', '11/07/2011 2:39:00', '11/07/2011 2:40:00', '11/07/2011 2:41:00', '11/07/2011 2:42:00', '11/07/2011 2:43:00', '11/07/2011 2:44:00', '11/07/2011 2:45:00', '11/07/2011 2:46:00', '11/07/2011 2:47:00', '11/07/2011 2:48:00', '11/07/2011 2:49:00', '11/07/2011 2:50:00', '11/07/2011 2:51:00', '11/07/2011 2:52:00', '11/07/2011 2:53:00', '11/07/2011 2:54:00', '11/07/2011 2:55:00', '11/07/2011 2:56:00', '11/07/2011 2:57:00', '11/07/2011 2:58:00', '11/07/2011 2:59:00', '11/07/2011 3:00:00', '11/07/2011 3:01:00', '11/07/2011 3:02:00', '11/07/2011 3:03:00', '11/07/2011 3:04:00', '11/07/2011 3:05:00', '11/07/2011 3:06:00', '11/07/2011 3:07:00', '11/07/2011 3:08:00', '11/07/2011 3:09:00', '11/07/2011 3:10:00', '11/07/2011 3:11:00', '11/07/2011 3:12:00', '11/07/2011 3:13:00', '11/07/2011 3:14:00', '11/07/2011 3:15:00', '11/07/2011 3:16:00', '11/07/2011 3:17:00', '11/07/2011 3:18:00', '11/07/2011 3:19:00', '11/07/2011 3:20:00', '11/07/2011 3:21:00', '11/07/2011 3:22:00', '11/07/2011 3:23:00', '11/07/2011 3:24:00', '11/07/2011 3:25:00', '11/07/2011 3:26:00', '11/07/2011 3:27:00', '11/07/2011 3:28:00', '11/07/2011 3:29:00', '11/07/2011 3:30:00', '11/07/2011 3:31:00', '11/07/2011 3:32:00', '11/07/2011 3:33:00', '11/07/2011 3:34:00', '11/07/2011 3:35:00', '11/07/2011 3:36:00', '11/07/2011 3:37:00', '11/07/2011 3:38:00', '11/07/2011 3:39:00', '11/07/2011 3:40:00', '11/07/2011 3:41:00', '11/07/2011 3:42:00', '11/07/2011 3:43:00', '11/07/2011 3:44:00', '11/07/2011 3:45:00', '11/07/2011 3:46:00', '11/07/2011 3:47:00', '11/07/2011 3:48:00', '11/07/2011 3:49:00', '11/07/2011 3:50:00', '11/07/2011 3:51:00', '11/07/2011 3:52:00', '11/07/2011 3:53:00', '11/07/2011 3:54:00', '11/07/2011 3:55:00', '11/07/2011 3:56:00', 
'11/07/2011 3:57:00', '11/07/2011 3:58:00', '11/07/2011 3:59:00', '11/07/2011 4:00:00', '11/07/2011 4:01:00', '11/07/2011 4:02:00', '11/07/2011 4:03:00', '11/07/2011 4:04:00', '11/07/2011 4:05:00', '11/07/2011 4:06:00', '11/07/2011 4:07:00', '11/07/2011 4:08:00', '11/07/2011 4:09:00', '11/07/2011 4:10:00', '11/07/2011 4:11:00', '11/07/2011 4:12:00', '11/07/2011 4:13:00', '11/07/2011 4:14:00', '11/07/2011 4:15:00', '11/07/2011 4:16:00', '11/07/2011 4:17:00', '11/07/2011 4:18:00', '11/07/2011 4:19:00', '11/07/2011 4:20:00', '11/07/2011 4:21:00', '11/07/2011 4:22:00', '11/07/2011 4:23:00', '11/07/2011 4:24:00', '11/07/2011 4:25:00', '11/07/2011 4:26:00', '11/07/2011 4:27:00', '11/07/2011 4:28:00', '11/07/2011 4:29:00', '11/07/2011 4:30:00', '11/07/2011 4:31:00', '11/07/2011 4:32:00', '11/07/2011 4:33:00', '11/07/2011 4:34:00', '11/07/2011 4:35:00', '11/07/2011 4:36:00', '11/07/2011 4:37:00', '11/07/2011 4:38:00', '11/07/2011 4:39:00', '11/07/2011 4:40:00', '11/07/2011 4:41:00', '11/07/2011 4:42:00', '11/07/2011 4:43:00', '11/07/2011 4:44:00', '11/07/2011 4:45:00', '11/07/2011 4:46:00', '11/07/2011 4:47:00', '11/07/2011 4:48:00', '11/07/2011 4:49:00', '11/07/2011 4:50:00', '11/07/2011 4:51:00', '11/07/2011 4:52:00', '11/07/2011 4:53:00', '11/07/2011 4:54:00', '11/07/2011 4:55:00', '11/07/2011 4:56:00', '11/07/2011 4:57:00', '11/07/2011 4:58:00', '11/07/2011 4:59:00', '11/07/2011 5:00:00', '11/07/2011 5:01:00', '11/07/2011 5:02:00', '11/07/2011 5:03:00', '11/07/2011 5:04:00', '11/07/2011 5:05:00', '11/07/2011 5:06:00', '11/07/2011 5:07:00', '11/07/2011 5:08:00', '11/07/2011 5:09:00', '11/07/2011 5:10:00', '11/07/2011 5:11:00', '11/07/2011 5:12:00', '11/07/2011 5:13:00', '11/07/2011 5:14:00', '11/07/2011 5:15:00', '11/07/2011 5:16:00', '11/07/2011 5:17:00', '11/07/2011 5:18:00', '11/07/2011 5:19:00', '11/07/2011 5:20:00', '11/07/2011 5:21:00', '11/07/2011 5:22:00', '11/07/2011 5:23:00', '11/07/2011 5:24:00', '11/07/2011 5:25:00', '11/07/2011 5:26:00', '11/07/2011 
5:27:00', '11/07/2011 5:28:00', '11/07/2011 5:29:00', '11/07/2011 5:30:00', '11/07/2011 5:31:00', '11/07/2011 5:32:00', '11/07/2011 5:33:00', '11/07/2011 5:34:00', '11/07/2011 5:35:00', '11/07/2011 5:36:00', '11/07/2011 5:37:00', '11/07/2011 5:38:00', '11/07/2011 5:39:00', '11/07/2011 5:40:00', '11/07/2011 5:41:00', '11/07/2011 5:42:00', '11/07/2011 5:43:00', '11/07/2011 5:44:00', '11/07/2011 5:45:00', '11/07/2011 5:46:00', '11/07/2011 5:47:00', '11/07/2011 5:48:00', '11/07/2011 5:49:00', '11/07/2011 5:50:00', '11/07/2011 5:51:00', '11/07/2011 5:52:00', '11/07/2011 5:53:00', '11/07/2011 5:54:00', '11/07/2011 5:55:00', '11/07/2011 5:56:00', '11/07/2011 5:57:00', '11/07/2011 5:58:00', '11/07/2011 5:59:00', '11/07/2011 6:00:00', '11/07/2011 6:01:00', '11/07/2011 6:02:00', '11/07/2011 6:03:00', '11/07/2011 6:04:00', '11/07/2011 6:05:00', '11/07/2011 6:06:00', '11/07/2011 6:07:00', '11/07/2011 6:08:00', '11/07/2011 6:09:00', '11/07/2011 6:10:00', '11/07/2011 6:11:00', '11/07/2011 6:12:00', '11/07/2011 6:13:00', '11/07/2011 6:14:00', '11/07/2011 6:15:00', '11/07/2011 6:16:00', '11/07/2011 6:17:00', '11/07/2011 6:18:00', '11/07/2011 6:19:00', '11/07/2011 6:20:00', '11/07/2011 6:21:00', '11/07/2011 6:22:00', '11/07/2011 6:23:00', '11/07/2011 6:24:00', '11/07/2011 6:25:00', '11/07/2011 6:26:00', '11/07/2011 6:27:00', '11/07/2011 6:28:00', '11/07/2011 6:29:00', '11/07/2011 6:30:00', '11/07/2011 6:31:00', '11/07/2011 6:32:00', '11/07/2011 6:33:00', '11/07/2011 6:34:00', '11/07/2011 6:35:00', '11/07/2011 6:36:00', '11/07/2011 6:37:00', '11/07/2011 6:38:00', '11/07/2011 6:39:00', '11/07/2011 6:40:00', '11/07/2011 6:41:00', '11/07/2011 6:42:00', '11/07/2011 6:43:00', '11/07/2011 6:44:00', '11/07/2011 6:45:00', '11/07/2011 6:46:00', '11/07/2011 6:47:00', '11/07/2011 6:48:00', '11/07/2011 6:49:00', '11/07/2011 6:50:00', '11/07/2011 6:51:00', '11/07/2011 6:52:00', '11/07/2011 6:53:00', '11/07/2011 6:54:00', '11/07/2011 6:55:00', '11/07/2011 6:56:00', '11/07/2011 6:57:00', 
'11/07/2011 6:58:00', '11/07/2011 6:59:00', '11/07/2011 7:00:00', '11/07/2011 7:01:00', '11/07/2011 7:02:00', '11/07/2011 7:03:00', '11/07/2011 7:04:00', '11/07/2011 7:05:00', '11/07/2011 7:06:00', '11/07/2011 7:07:00', '11/07/2011 7:08:00', '11/07/2011 7:09:00', '11/07/2011 7:10:00', '11/07/2011 7:11:00', '11/07/2011 7:12:00', '11/07/2011 7:13:00', '11/07/2011 7:14:00', '11/07/2011 7:15:00', '11/07/2011 7:16:00', '11/07/2011 7:17:00', '11/07/2011 7:18:00', '11/07/2011 7:19:00', '11/07/2011 7:20:00', '11/07/2011 7:21:00', '11/07/2011 7:22:00', '11/07/2011 7:23:00', '11/07/2011 7:24:00', '11/07/2011 7:25:00', '11/07/2011 7:26:00', '11/07/2011 7:27:00', '11/07/2011 7:28:00', '11/07/2011 7:29:00', '11/07/2011 7:30:00', '11/07/2011 7:31:00', '11/07/2011 7:32:00', '11/07/2011 7:33:00', '11/07/2011 7:34:00', '11/07/2011 7:35:00', '11/07/2011 7:36:00', '11/07/2011 7:37:00', '11/07/2011 7:38:00', '11/07/2011 7:39:00', '11/07/2011 7:40:00', '11/07/2011 7:41:00', '11/07/2011 7:42:00', '11/07/2011 7:43:00', '11/07/2011 7:44:00', '11/07/2011 7:45:00', '11/07/2011 7:46:00', '11/07/2011 7:47:00', '11/07/2011 7:48:00', '11/07/2011 7:49:00', '11/07/2011 7:50:00', '11/07/2011 7:51:00', '11/07/2011 7:52:00', '11/07/2011 7:53:00', '11/07/2011 7:54:00', '11/07/2011 7:55:00', '11/07/2011 7:56:00', '11/07/2011 7:57:00', '11/07/2011 7:58:00', '11/07/2011 7:59:00', '11/07/2011 8:00:00', '11/07/2011 8:01:00', '11/07/2011 8:02:00', '11/07/2011 8:03:00', '11/07/2011 8:04:00', '11/07/2011 8:05:00', '11/07/2011 8:06:00', '11/07/2011 8:07:00', '11/07/2011 8:08:00', '11/07/2011 8:09:00', '11/07/2011 8:10:00', '11/07/2011 8:11:00', '11/07/2011 8:12:00', '11/07/2011 8:13:00', '11/07/2011 8:14:00', '11/07/2011 8:15:00', '11/07/2011 8:16:00', '11/07/2011 8:17:00', '11/07/2011 8:18:00', '11/07/2011 8:19:00', '11/07/2011 8:20:00', '11/07/2011 8:21:00', '11/07/2011 8:22:00', '11/07/2011 8:23:00', '11/07/2011 8:24:00', '11/07/2011 8:25:00', '11/07/2011 8:26:00', '11/07/2011 8:27:00', '11/07/2011 
8:28:00', '11/07/2011 8:29:00', '11/07/2011 8:30:00', '11/07/2011 8:31:00', '11/07/2011 8:32:00', '11/07/2011 8:33:00', '11/07/2011 8:34:00', '11/07/2011 8:35:00', '11/07/2011 8:36:00', '11/07/2011 8:37:00', '11/07/2011 8:38:00', '11/07/2011 8:39:00', '11/07/2011 8:40:00', '11/07/2011 8:41:00', '11/07/2011 8:42:00', '11/07/2011 8:43:00', '11/07/2011 8:44:00', '11/07/2011 8:45:00', '11/07/2011 8:46:00', '11/07/2011 8:47:00', '11/07/2011 8:48:00', '11/07/2011 8:49:00', '11/07/2011 8:50:00', '11/07/2011 8:51:00', '11/07/2011 8:52:00', '11/07/2011 8:53:00', '11/07/2011 8:54:00', '11/07/2011 8:55:00', '11/07/2011 8:56:00', '11/07/2011 8:57:00', '11/07/2011 8:58:00', '11/07/2011 8:59:00', '11/07/2011 9:00:00', '11/07/2011 9:01:00', '11/07/2011 9:02:00', '11/07/2011 9:03:00', '11/07/2011 9:04:00', '11/07/2011 9:05:00', '11/07/2011 9:06:00', '11/07/2011 9:07:00', '11/07/2011 9:08:00', '11/07/2011 9:09:00', '11/07/2011 9:10:00', '11/07/2011 9:11:00', '11/07/2011 9:12:00', '11/07/2011 9:13:00', '11/07/2011 9:14:00', '11/07/2011 9:15:00', '11/07/2011 9:16:00', '11/07/2011 9:17:00', '11/07/2011 9:18:00', '11/07/2011 9:19:00', '11/07/2011 9:20:00', '11/07/2011 9:21:00', '11/07/2011 9:22:00', '11/07/2011 9:23:00', '11/07/2011 9:24:00', '11/07/2011 9:25:00', '11/07/2011 9:26:00', '11/07/2011 9:27:00', '11/07/2011 9:28:00', '11/07/2011 9:29:00', '11/07/2011 9:30:00', '11/07/2011 9:31:00', '11/07/2011 9:32:00', '11/07/2011 9:33:00', '11/07/2011 9:34:00', '11/07/2011 9:35:00', '11/07/2011 9:36:00', '11/07/2011 9:37:00', '11/07/2011 9:38:00', '11/07/2011 9:39:00', '11/07/2011 9:40:00', '11/07/2011 9:41:00', '11/07/2011 9:42:00', '11/07/2011 9:43:00', '11/07/2011 9:44:00', '11/07/2011 9:45:00', '11/07/2011 9:46:00', '11/07/2011 9:47:00', '11/07/2011 9:48:00', '11/07/2011 9:49:00', '11/07/2011 9:50:00', '11/07/2011 9:51:00', '11/07/2011 9:52:00', '11/07/2011 9:53:00', '11/07/2011 9:54:00', '11/07/2011 9:55:00', '11/07/2011 9:56:00', '11/07/2011 9:57:00', '11/07/2011 9:58:00', 
'11/07/2011 9:59:00', '11/07/2011 10:00:00', '11/07/2011 10:01:00', '11/07/2011 10:02:00', '11/07/2011 10:03:00', '11/07/2011 10:04:00', '11/07/2011 10:05:00', '11/07/2011 10:06:00', '11/07/2011 10:07:00', '11/07/2011 10:08:00', '11/07/2011 10:09:00', '11/07/2011 10:10:00', '11/07/2011 10:11:00', '11/07/2011 10:12:00', '11/07/2011 10:13:00', '11/07/2011 10:14:00', '11/07/2011 10:15:00', '11/07/2011 10:16:00', '11/07/2011 10:17:00', '11/07/2011 10:18:00', '11/07/2011 10:19:00', '11/07/2011 10:20:00', '11/07/2011 10:21:00', '11/07/2011 10:22:00', '11/07/2011 10:23:00', '11/07/2011 10:24:00', '11/07/2011 10:25:00', '11/07/2011 10:26:00', '11/07/2011 10:27:00', '11/07/2011 10:28:00', '11/07/2011 10:29:00', '11/07/2011 10:30:00', '11/07/2011 10:31:00', '11/07/2011 10:32:00', '11/07/2011 10:33:00', '11/07/2011 10:34:00', '11/07/2011 10:35:00', '11/07/2011 10:36:00', '11/07/2011 10:37:00', '11/07/2011 10:38:00', '11/07/2011 10:39:00', '11/07/2011 10:40:00', '11/07/2011 10:41:00', '11/07/2011 10:42:00', '11/07/2011 10:43:00', '11/07/2011 10:44:00', '11/07/2011 10:45:00', '11/07/2011 10:46:00', '11/07/2011 10:47:00', '11/07/2011 10:48:00', '11/07/2011 10:49:00', '11/07/2011 10:50:00', '11/07/2011 10:51:00', '11/07/2011 10:52:00', '11/07/2011 10:53:00', '11/07/2011 10:54:00', '11/07/2011 10:55:00', '11/07/2011 10:56:00', '11/07/2011 10:57:00', '11/07/2011 10:58:00', '11/07/2011 10:59:00', '11/07/2011 11:00:00', '11/07/2011 11:01:00', '11/07/2011 11:02:00', '11/07/2011 11:03:00', '11/07/2011 11:04:00', '11/07/2011 11:05:00', '11/07/2011 11:06:00', '11/07/2011 11:07:00', '11/07/2011 11:08:00', '11/07/2011 11:09:00', '11/07/2011 11:10:00', '11/07/2011 11:11:00', '11/07/2011 11:12:00', '11/07/2011 11:13:00', '11/07/2011 11:14:00', '11/07/2011 11:15:00', '11/07/2011 11:16:00', '11/07/2011 11:17:00', '11/07/2011 11:18:00', '11/07/2011 11:19:00', '11/07/2011 11:20:00', '11/07/2011 11:21:00', '11/07/2011 11:22:00', '11/07/2011 11:23:00', '11/07/2011 11:24:00', '11/07/2011 11:25:00', 
'11/07/2011 11:26:00', '11/07/2011 11:27:00', '11/07/2011 11:28:00', '11/07/2011 11:29:00', '11/07/2011 11:30:00', '11/07/2011 11:31:00', '11/07/2011 11:32:00', '11/07/2011 11:33:00', '11/07/2011 11:34:00', '11/07/2011 11:35:00', '11/07/2011 11:36:00', '11/07/2011 11:37:00', '11/07/2011 11:38:00', '11/07/2011 11:39:00', '11/07/2011 11:40:00', '11/07/2011 11:41:00', '11/07/2011 11:42:00', '11/07/2011 11:43:00', '11/07/2011 11:44:00', '11/07/2011 11:45:00', '11/07/2011 11:46:00', '11/07/2011 11:47:00', '11/07/2011 11:48:00', '11/07/2011 11:49:00', '11/07/2011 11:50:00', '11/07/2011 11:51:00', '11/07/2011 11:52:00', '11/07/2011 11:53:00', '11/07/2011 11:54:00', '11/07/2011 11:55:00', '11/07/2011 11:56:00', '11/07/2011 11:57:00', '11/07/2011 11:58:00', '11/07/2011 11:59:00', '11/07/2011 12:00:00', '11/07/2011 12:01:00', '11/07/2011 12:02:00', '11/07/2011 12:03:00', '11/07/2011 12:04:00', '11/07/2011 12:05:00', '11/07/2011 12:06:00', '11/07/2011 12:07:00', '11/07/2011 12:08:00', '11/07/2011 12:09:00', '11/07/2011 12:10:00', '11/07/2011 12:11:00', '11/07/2011 12:12:00', '11/07/2011 12:13:00', '11/07/2011 12:14:00', '11/07/2011 12:15:00', '11/07/2011 12:16:00', '11/07/2011 12:17:00', '11/07/2011 12:18:00', '11/07/2011 12:19:00', '11/07/2011 12:20:00', '11/07/2011 12:21:00', '11/07/2011 12:22:00', '11/07/2011 12:23:00', '11/07/2011 12:24:00', '11/07/2011 12:25:00', '11/07/2011 12:26:00', '11/07/2011 12:27:00', '11/07/2011 12:28:00', '11/07/2011 12:29:00', '11/07/2011 12:30:00', '11/07/2011 12:31:00', '11/07/2011 12:32:00', '11/07/2011 12:33:00', '11/07/2011 12:34:00', '11/07/2011 12:35:00', '11/07/2011 12:36:00', '11/07/2011 12:37:00', '11/07/2011 12:38:00', '11/07/2011 12:39:00', '11/07/2011 12:40:00', '11/07/2011 12:41:00', '11/07/2011 12:42:00', '11/07/2011 12:43:00', '11/07/2011 12:44:00', '11/07/2011 12:45:00', '11/07/2011 12:46:00', '11/07/2011 12:47:00', '11/07/2011 12:48:00', '11/07/2011 12:49:00', '11/07/2011 12:50:00', '11/07/2011 12:51:00', '11/07/2011 
12:52:00', '11/07/2011 12:53:00', '11/07/2011 12:54:00', '11/07/2011 12:55:00', '11/07/2011 12:56:00', '11/07/2011 12:57:00', '11/07/2011 12:58:00', '11/07/2011 12:59:00', '11/07/2011 13:00:00', '11/07/2011 13:01:00', '11/07/2011 13:02:00', '11/07/2011 13:03:00', '11/07/2011 13:04:00', '11/07/2011 13:05:00', '11/07/2011 13:06:00', '11/07/2011 13:07:00', '11/07/2011 13:08:00', '11/07/2011 13:09:00', '11/07/2011 13:10:00', '11/07/2011 13:11:00', '11/07/2011 13:12:00', '11/07/2011 13:13:00', '11/07/2011 13:14:00', '11/07/2011 13:15:00', '11/07/2011 13:16:00', '11/07/2011 13:17:00', '11/07/2011 13:18:00', '11/07/2011 13:19:00', '11/07/2011 13:20:00', '11/07/2011 13:21:00', '11/07/2011 13:22:00', '11/07/2011 13:23:00', '11/07/2011 13:24:00', '11/07/2011 13:25:00', '11/07/2011 13:26:00', '11/07/2011 13:27:00', '11/07/2011 13:28:00', '11/07/2011 13:29:00', '11/07/2011 13:30:00', '11/07/2011 13:31:00', '11/07/2011 13:32:00', '11/07/2011 13:33:00', '11/07/2011 13:34:00', '11/07/2011 13:35:00', '11/07/2011 13:36:00', '11/07/2011 13:37:00', '11/07/2011 13:38:00', '11/07/2011 13:39:00', '11/07/2011 13:40:00', '11/07/2011 13:41:00', '11/07/2011 13:42:00', '11/07/2011 13:43:00', '11/07/2011 13:44:00', '11/07/2011 13:45:00', '11/07/2011 13:46:00', '11/07/2011 13:47:00', '11/07/2011 13:48:00', '11/07/2011 13:49:00', '11/07/2011 13:50:00', '11/07/2011 13:51:00', '11/07/2011 13:52:00', '11/07/2011 13:53:00', '11/07/2011 13:54:00', '11/07/2011 13:55:00', '11/07/2011 13:56:00', '11/07/2011 13:57:00', '11/07/2011 13:58:00', '11/07/2011 13:59:00', '11/07/2011 14:00:00', '11/07/2011 14:01:00', '11/07/2011 14:02:00', '11/07/2011 14:03:00', '11/07/2011 14:04:00', '11/07/2011 14:05:00', '11/07/2011 14:06:00', '11/07/2011 14:07:00', '11/07/2011 14:08:00', '11/07/2011 14:09:00', '11/07/2011 14:10:00', '11/07/2011 14:11:00', '11/07/2011 14:12:00', '11/07/2011 14:13:00', '11/07/2011 14:14:00', '11/07/2011 14:15:00', '11/07/2011 14:16:00', '11/07/2011 14:17:00', '11/07/2011 14:18:00', 
'11/07/2011 14:19:00', '11/07/2011 14:20:00', '11/07/2011 14:21:00', '11/07/2011 14:22:00', '11/07/2011 14:23:00', '11/07/2011 14:24:00', '11/07/2011 14:25:00', '11/07/2011 14:26:00', '11/07/2011 14:27:00', '11/07/2011 14:28:00', '11/07/2011 14:29:00', '11/07/2011 14:30:00', '11/07/2011 14:31:00', '11/07/2011 14:32:00', '11/07/2011 14:33:00', '11/07/2011 14:34:00', '11/07/2011 14:35:00', '11/07/2011 14:36:00', '11/07/2011 14:37:00', '11/07/2011 14:38:00', '11/07/2011 14:39:00', '11/07/2011 14:40:00', '11/07/2011 14:41:00', '11/07/2011 14:42:00', '11/07/2011 14:43:00', '11/07/2011 14:44:00', '11/07/2011 14:45:00', '11/07/2011 14:46:00', '11/07/2011 14:47:00', '11/07/2011 14:48:00', '11/07/2011 14:49:00', '11/07/2011 14:50:00', '11/07/2011 14:51:00', '11/07/2011 14:52:00', '11/07/2011 14:53:00', '11/07/2011 14:54:00', '11/07/2011 14:55:00', '11/07/2011 14:56:00', '11/07/2011 14:57:00', '11/07/2011 14:58:00', '11/07/2011 14:59:00', '11/07/2011 15:00:00', '11/07/2011 15:01:00', '11/07/2011 15:02:00', '11/07/2011 15:03:00', '11/07/2011 15:04:00', '11/07/2011 15:05:00', '11/07/2011 15:06:00', '11/07/2011 15:07:00', '11/07/2011 15:08:00', '11/07/2011 15:09:00', '11/07/2011 15:10:00', '11/07/2011 15:11:00', '11/07/2011 15:12:00', '11/07/2011 15:13:00', '11/07/2011 15:14:00', '11/07/2011 15:15:00', '11/07/2011 15:16:00', '11/07/2011 15:17:00', '11/07/2011 15:18:00', '11/07/2011 15:19:00', '11/07/2011 15:20:00', '11/07/2011 15:21:00', '11/07/2011 15:22:00', '11/07/2011 15:23:00', '11/07/2011 15:24:00', '11/07/2011 15:25:00', '11/07/2011 15:26:00', '11/07/2011 15:27:00', '11/07/2011 15:28:00', '11/07/2011 15:29:00', '11/07/2011 15:30:00', '11/07/2011 15:31:00', '11/07/2011 15:32:00', '11/07/2011 15:33:00', '11/07/2011 15:34:00', '11/07/2011 15:35:00', '11/07/2011 15:36:00', '11/07/2011 15:37:00', '11/07/2011 15:38:00', '11/07/2011 15:39:00', '11/07/2011 15:40:00', '11/07/2011 15:41:00', '11/07/2011 15:42:00', '11/07/2011 15:43:00', '11/07/2011 15:44:00', '11/07/2011 
15:45:00', '11/07/2011 15:46:00', '11/07/2011 15:47:00', '11/07/2011 15:48:00', '11/07/2011 15:49:00', '11/07/2011 15:50:00', '11/07/2011 15:51:00', '11/07/2011 15:52:00', '11/07/2011 15:53:00', '11/07/2011 15:54:00', '11/07/2011 15:55:00', '11/07/2011 15:56:00', '11/07/2011 15:57:00', '11/07/2011 15:58:00', '11/07/2011 15:59:00', '11/07/2011 16:00:00', '11/07/2011 16:01:00', '11/07/2011 16:02:00', '11/07/2011 16:03:00', '11/07/2011 16:04:00', '11/07/2011 16:05:00', '11/07/2011 16:06:00', '11/07/2011 16:07:00', '11/07/2011 16:08:00', '11/07/2011 16:09:00', '11/07/2011 16:10:00', '11/07/2011 16:11:00', '11/07/2011 16:12:00', '11/07/2011 16:13:00', '11/07/2011 16:14:00', '11/07/2011 16:15:00', '11/07/2011 16:16:00', '11/07/2011 16:17:00', '11/07/2011 16:18:00', '11/07/2011 16:19:00', '11/07/2011 16:20:00', '11/07/2011 16:21:00', '11/07/2011 16:22:00', '11/07/2011 16:23:00', '11/07/2011 16:24:00', '11/07/2011 16:25:00', '11/07/2011 16:26:00', '11/07/2011 16:27:00', '11/07/2011 16:28:00', '11/07/2011 16:29:00', '11/07/2011 16:30:00', '11/07/2011 16:31:00', '11/07/2011 16:32:00', '11/07/2011 16:33:00', '11/07/2011 16:34:00', '11/07/2011 16:35:00', '11/07/2011 16:36:00', '11/07/2011 16:37:00', '11/07/2011 16:38:00', '11/07/2011 16:39:00', '11/07/2011 16:40:00', '11/07/2011 16:41:00', '11/07/2011 16:42:00', '11/07/2011 16:43:00', '11/07/2011 16:44:00', '11/07/2011 16:45:00', '11/07/2011 16:46:00', '11/07/2011 16:47:00', '11/07/2011 16:48:00', '11/07/2011 16:49:00', '11/07/2011 16:50:00', '11/07/2011 16:51:00', '11/07/2011 16:52:00', '11/07/2011 16:53:00', '11/07/2011 16:54:00', '11/07/2011 16:55:00', '11/07/2011 16:56:00', '11/07/2011 16:57:00', '11/07/2011 16:58:00', '11/07/2011 16:59:00', '11/07/2011 17:00:00', '11/07/2011 17:01:00', '11/07/2011 17:02:00', '11/07/2011 17:03:00', '11/07/2011 17:04:00', '11/07/2011 17:05:00', '11/07/2011 17:06:00', '11/07/2011 17:07:00', '11/07/2011 17:08:00', '11/07/2011 17:09:00', '11/07/2011 17:10:00', '11/07/2011 17:11:00', 
'11/07/2011 17:12:00', '11/07/2011 17:13:00', '11/07/2011 17:14:00', '11/07/2011 17:15:00', '11/07/2011 17:16:00', '11/07/2011 17:17:00', '11/07/2011 17:18:00', '11/07/2011 17:19:00', '11/07/2011 17:20:00', '11/07/2011 17:21:00', '11/07/2011 17:22:00', '11/07/2011 17:23:00', '11/07/2011 17:24:00', '11/07/2011 17:25:00', '11/07/2011 17:26:00', '11/07/2011 17:27:00', '11/07/2011 17:28:00', '11/07/2011 17:29:00', '11/07/2011 17:30:00', '11/07/2011 17:31:00', '11/07/2011 17:32:00', '11/07/2011 17:33:00', '11/07/2011 17:34:00', '11/07/2011 17:35:00', '11/07/2011 17:36:00', '11/07/2011 17:37:00', '11/07/2011 17:38:00', '11/07/2011 17:39:00', '11/07/2011 17:40:00', '11/07/2011 17:41:00', '11/07/2011 17:42:00', '11/07/2011 17:43:00', '11/07/2011 17:44:00', '11/07/2011 17:45:00', '11/07/2011 17:46:00', '11/07/2011 17:47:00', '11/07/2011 17:48:00', '11/07/2011 17:49:00', '11/07/2011 17:50:00', '11/07/2011 17:51:00', '11/07/2011 17:52:00', '11/07/2011 17:53:00', '11/07/2011 17:54:00', '11/07/2011 17:55:00', '11/07/2011 17:56:00', '11/07/2011 17:57:00', '11/07/2011 17:58:00', '11/07/2011 17:59:00', '11/07/2011 18:00:00', '11/07/2011 18:01:00', '11/07/2011 18:02:00', '11/07/2011 18:03:00', '11/07/2011 18:04:00', '11/07/2011 18:05:00', '11/07/2011 18:06:00', '11/07/2011 18:07:00', '11/07/2011 18:08:00', '11/07/2011 18:09:00', '11/07/2011 18:10:00', '11/07/2011 18:11:00', '11/07/2011 18:12:00', '11/07/2011 18:13:00', '11/07/2011 18:14:00', '11/07/2011 18:15:00', '11/07/2011 18:16:00', '11/07/2011 18:17:00', '11/07/2011 18:18:00', '11/07/2011 18:19:00', '11/07/2011 18:20:00', '11/07/2011 18:21:00', '11/07/2011 18:22:00', '11/07/2011 18:23:00', '11/07/2011 18:24:00', '11/07/2011 18:25:00', '11/07/2011 18:26:00', '11/07/2011 18:27:00', '11/07/2011 18:28:00', '11/07/2011 18:29:00', '11/07/2011 18:30:00', '11/07/2011 18:31:00', '11/07/2011 18:32:00', '11/07/2011 18:33:00', '11/07/2011 18:34:00', '11/07/2011 18:35:00', '11/07/2011 18:36:00', '11/07/2011 18:37:00', '11/07/2011 
18:38:00', '11/07/2011 18:39:00', '11/07/2011 18:40:00', '11/07/2011 18:41:00', '11/07/2011 18:42:00', '11/07/2011 18:43:00', '11/07/2011 18:44:00', '11/07/2011 18:45:00', '11/07/2011 18:46:00', '11/07/2011 18:47:00', '11/07/2011 18:48:00', '11/07/2011 18:49:00', '11/07/2011 18:50:00', '11/07/2011 18:51:00', '11/07/2011 18:52:00', '11/07/2011 18:53:00', '11/07/2011 18:54:00', '11/07/2011 18:55:00', '11/07/2011 18:56:00', '11/07/2011 18:57:00', '11/07/2011 18:58:00', '11/07/2011 18:59:00', '11/07/2011 19:00:00', '11/07/2011 19:01:00', '11/07/2011 19:02:00', '11/07/2011 19:03:00', '11/07/2011 19:04:00', '11/07/2011 19:05:00', '11/07/2011 19:06:00', '11/07/2011 19:07:00', '11/07/2011 19:08:00', '11/07/2011 19:09:00', '11/07/2011 19:10:00', '11/07/2011 19:11:00', '11/07/2011 19:12:00', '11/07/2011 19:13:00', '11/07/2011 19:14:00', '11/07/2011 19:15:00', '11/07/2011 19:16:00', '11/07/2011 19:17:00', '11/07/2011 19:18:00', '11/07/2011 19:19:00', '11/07/2011 19:20:00', '11/07/2011 19:21:00', '11/07/2011 19:22:00', '11/07/2011 19:23:00', '11/07/2011 19:24:00', '11/07/2011 19:25:00', '11/07/2011 19:26:00', '11/07/2011 19:27:00', '11/07/2011 19:28:00', '11/07/2011 19:29:00', '11/07/2011 19:30:00', '11/07/2011 19:31:00', '11/07/2011 19:32:00', '11/07/2011 19:33:00', '11/07/2011 19:34:00', '11/07/2011 19:35:00', '11/07/2011 19:36:00', '11/07/2011 19:37:00', '11/07/2011 19:38:00', '11/07/2011 19:39:00', '11/07/2011 19:40:00', '11/07/2011 19:41:00', '11/07/2011 19:42:00', '11/07/2011 19:43:00', '11/07/2011 19:44:00', '11/07/2011 19:45:00', '11/07/2011 19:46:00', '11/07/2011 19:47:00', '11/07/2011 19:48:00', '11/07/2011 19:49:00', '11/07/2011 19:50:00', '11/07/2011 19:51:00', '11/07/2011 19:52:00', '11/07/2011 19:53:00', '11/07/2011 19:54:00', '11/07/2011 19:55:00', '11/07/2011 19:56:00', '11/07/2011 19:57:00', '11/07/2011 19:58:00', '11/07/2011 19:59:00', '11/07/2011 20:00:00', '11/07/2011 20:01:00', '11/07/2011 20:02:00', '11/07/2011 20:03:00', '11/07/2011 20:04:00', 
'11/07/2011 20:05:00', '11/07/2011 20:06:00', '11/07/2011 20:07:00', '11/07/2011 20:08:00', '11/07/2011 20:09:00', '11/07/2011 20:10:00', '11/07/2011 20:11:00', '11/07/2011 20:12:00', '11/07/2011 20:13:00', '11/07/2011 20:14:00', '11/07/2011 20:15:00', '11/07/2011 20:16:00', '11/07/2011 20:17:00', '11/07/2011 20:18:00', '11/07/2011 20:19:00', '11/07/2011 20:20:00', '11/07/2011 20:21:00', '11/07/2011 20:22:00', '11/07/2011 20:23:00', '11/07/2011 20:24:00', '11/07/2011 20:25:00', '11/07/2011 20:26:00', '11/07/2011 20:27:00', '11/07/2011 20:28:00', '11/07/2011 20:29:00', '11/07/2011 20:30:00', '11/07/2011 20:31:00', '11/07/2011 20:32:00', '11/07/2011 20:33:00', '11/07/2011 20:34:00', '11/07/2011 20:35:00', '11/07/2011 20:36:00', '11/07/2011 20:37:00', '11/07/2011 20:38:00', '11/07/2011 20:39:00', '11/07/2011 20:40:00', '11/07/2011 20:41:00', '11/07/2011 20:42:00', '11/07/2011 20:43:00', '11/07/2011 20:44:00', '11/07/2011 20:45:00', '11/07/2011 20:46:00', '11/07/2011 20:47:00', '11/07/2011 20:48:00', '11/07/2011 20:49:00', '11/07/2011 20:50:00', '11/07/2011 20:51:00', '11/07/2011 20:52:00', '11/07/2011 20:53:00', '11/07/2011 20:54:00', '11/07/2011 20:55:00', '11/07/2011 20:56:00', '11/07/2011 20:57:00', '11/07/2011 20:58:00', '11/07/2011 20:59:00', '11/07/2011 21:00:00', '11/07/2011 21:01:00', '11/07/2011 21:02:00', '11/07/2011 21:03:00', '11/07/2011 21:04:00', '11/07/2011 21:05:00', '11/07/2011 21:06:00', '11/07/2011 21:07:00', '11/07/2011 21:08:00', '11/07/2011 21:09:00', '11/07/2011 21:10:00', '11/07/2011 21:11:00', '11/07/2011 21:12:00', '11/07/2011 21:13:00', '11/07/2011 21:14:00', '11/07/2011 21:15:00', '11/07/2011 21:16:00', '11/07/2011 21:17:00', '11/07/2011 21:18:00', '11/07/2011 21:19:00', '11/07/2011 21:20:00', '11/07/2011 21:21:00', '11/07/2011 21:22:00', '11/07/2011 21:23:00', '11/07/2011 21:24:00', '11/07/2011 21:25:00', '11/07/2011 21:26:00', '11/07/2011 21:27:00', '11/07/2011 21:28:00', '11/07/2011 21:29:00', '11/07/2011 21:30:00', '11/07/2011 
21:31:00', '11/07/2011 21:32:00', '11/07/2011 21:33:00', '11/07/2011 21:34:00', '11/07/2011 21:35:00', '11/07/2011 21:36:00', '11/07/2011 21:37:00', '11/07/2011 21:38:00', '11/07/2011 21:39:00', '11/07/2011 21:40:00', '11/07/2011 21:41:00', '11/07/2011 21:42:00', '11/07/2011 21:43:00', '11/07/2011 21:44:00', '11/07/2011 21:45:00', '11/07/2011 21:46:00', '11/07/2011 21:47:00', '11/07/2011 21:48:00', '11/07/2011 21:49:00', '11/07/2011 21:50:00', '11/07/2011 21:51:00', '11/07/2011 21:52:00', '11/07/2011 21:53:00', '11/07/2011 21:54:00', '11/07/2011 21:55:00', '11/07/2011 21:56:00', '11/07/2011 21:57:00', '11/07/2011 21:58:00', '11/07/2011 21:59:00', '11/07/2011 22:00:00', '11/07/2011 22:01:00', '11/07/2011 22:02:00', '11/07/2011 22:03:00', '11/07/2011 22:04:00', '11/07/2011 22:05:00', '11/07/2011 22:06:00', '11/07/2011 22:07:00', '11/07/2011 22:08:00', '11/07/2011 22:09:00', '11/07/2011 22:10:00', '11/07/2011 22:11:00', '11/07/2011 22:12:00', '11/07/2011 22:13:00', '11/07/2011 22:14:00', '11/07/2011 22:15:00', '11/07/2011 22:16:00', '11/07/2011 22:17:00', '11/07/2011 22:18:00', '11/07/2011 22:19:00', '11/07/2011 22:20:00', '11/07/2011 22:21:00', '11/07/2011 22:22:00', '11/07/2011 22:23:00', '11/07/2011 22:24:00', '11/07/2011 22:25:00', '11/07/2011 22:26:00', '11/07/2011 22:27:00', '11/07/2011 22:28:00', '11/07/2011 22:29:00', '11/07/2011 22:30:00', '11/07/2011 22:31:00', '11/07/2011 22:32:00', '11/07/2011 22:33:00', '11/07/2011 22:34:00', '11/07/2011 22:35:00', '11/07/2011 22:36:00', '11/07/2011 22:37:00', '11/07/2011 22:38:00', '11/07/2011 22:39:00', '11/07/2011 22:40:00', '11/07/2011 22:41:00', '11/07/2011 22:42:00', '11/07/2011 22:43:00', '11/07/2011 22:44:00', '11/07/2011 22:45:00', '11/07/2011 22:46:00', '11/07/2011 22:47:00', '11/07/2011 22:48:00', '11/07/2011 22:49:00', '11/07/2011 22:50:00', '11/07/2011 22:51:00', '11/07/2011 22:52:00', '11/07/2011 22:53:00', '11/07/2011 22:54:00', '11/07/2011 22:55:00', '11/07/2011 22:56:00', '11/07/2011 22:57:00', 
'11/07/2011 22:58:00', '11/07/2011 22:59:00', '11/07/2011 23:00:00', '11/07/2011 23:01:00', '11/07/2011 23:02:00', '11/07/2011 23:03:00', '11/07/2011 23:04:00', '11/07/2011 23:05:00', '11/07/2011 23:06:00', '11/07/2011 23:07:00', '11/07/2011 23:08:00', '11/07/2011 23:09:00', '11/07/2011 23:10:00', '11/07/2011 23:11:00', '11/07/2011 23:12:00', '11/07/2011 23:13:00', '11/07/2011 23:14:00', '11/07/2011 23:15:00', '11/07/2011 23:16:00', '11/07/2011 23:17:00', '11/07/2011 23:18:00', '11/07/2011 23:19:00', '11/07/2011 23:20:00', '11/07/2011 23:21:00', '11/07/2011 23:22:00', '11/07/2011 23:23:00', '11/07/2011 23:24:00', '11/07/2011 23:25:00', '11/07/2011 23:26:00', '11/07/2011 23:27:00', '11/07/2011 23:28:00', '11/07/2011 23:29:00', '11/07/2011 23:30:00', '11/07/2011 23:31:00', '11/07/2011 23:32:00', '11/07/2011 23:33:00', '11/07/2011 23:34:00', '11/07/2011 23:35:00', '11/07/2011 23:36:00', '11/07/2011 23:37:00', '11/07/2011 23:38:00', '11/07/2011 23:39:00', '11/07/2011 23:40:00', '11/07/2011 23:41:00', '11/07/2011 23:42:00', '11/07/2011 23:43:00', '11/07/2011 23:44:00', '11/07/2011 23:45:00', '11/07/2011 23:46:00', '11/07/2011 23:47:00', '11/07/2011 23:48:00', '11/07/2011 23:49:00', '11/07/2011 23:50:00', '11/07/2011 23:51:00', '11/07/2011 23:52:00', '11/07/2011 23:53:00', '11/07/2011 23:54:00', '11/07/2011 23:55:00', '11/07/2011 23:56:00', '11/07/2011 23:57:00', '11/07/2011 23:58:00', '11/07/2011 23:59:00', '12/07/2011 0:00:00', '12/07/2011 0:01:00', '12/07/2011 0:02:00', '12/07/2011 0:03:00', '12/07/2011 0:04:00', '12/07/2011 0:05:00', '12/07/2011 0:06:00', '12/07/2011 0:07:00', '12/07/2011 0:08:00', '12/07/2011 0:09:00', '12/07/2011 0:10:00', '12/07/2011 0:11:00', '12/07/2011 0:12:00', '12/07/2011 0:13:00', '12/07/2011 0:14:00', '12/07/2011 0:15:00', '12/07/2011 0:16:00', '12/07/2011 0:17:00', '12/07/2011 0:18:00', '12/07/2011 0:19:00', '12/07/2011 0:20:00', '12/07/2011 0:21:00', '12/07/2011 0:22:00', '12/07/2011 0:23:00', '12/07/2011 0:24:00', '12/07/2011 0:25:00', 
'12/07/2011 0:26:00', '12/07/2011 0:27:00', '12/07/2011 0:28:00', '12/07/2011 0:29:00', '12/07/2011 0:30:00', '12/07/2011 0:31:00', '12/07/2011 0:32:00', '12/07/2011 0:33:00', '12/07/2011 0:34:00', '12/07/2011 0:35:00', '12/07/2011 0:36:00', '12/07/2011 0:37:00', '12/07/2011 0:38:00', '12/07/2011 0:39:00', '12/07/2011 0:40:00', '12/07/2011 0:41:00', '12/07/2011 0:42:00', '12/07/2011 0:43:00', '12/07/2011 0:44:00', '12/07/2011 0:45:00', '12/07/2011 0:46:00', '12/07/2011 0:47:00', '12/07/2011 0:48:00', '12/07/2011 0:49:00', '12/07/2011 0:50:00', '12/07/2011 0:51:00', '12/07/2011 0:52:00', '12/07/2011 0:53:00', '12/07/2011 0:54:00', '12/07/2011 0:55:00', '12/07/2011 0:56:00', '12/07/2011 0:57:00', '12/07/2011 0:58:00', '12/07/2011 0:59:00', '12/07/2011 1:00:00', '12/07/2011 1:01:00', '12/07/2011 1:02:00', '12/07/2011 1:03:00', '12/07/2011 1:04:00', '12/07/2011 1:05:00', '12/07/2011 1:06:00', '12/07/2011 1:07:00', '12/07/2011 1:08:00', '12/07/2011 1:09:00', '12/07/2011 1:10:00', '12/07/2011 1:11:00', '12/07/2011 1:12:00', '12/07/2011 1:13:00', '12/07/2011 1:14:00', '12/07/2011 1:15:00', '12/07/2011 1:16:00', '12/07/2011 1:17:00', '12/07/2011 1:18:00', '12/07/2011 1:19:00', '12/07/2011 1:20:00', '12/07/2011 1:21:00', '12/07/2011 1:22:00', '12/07/2011 1:23:00', '12/07/2011 1:24:00', '12/07/2011 1:25:00', '12/07/2011 1:26:00', '12/07/2011 1:27:00', '12/07/2011 1:28:00', '12/07/2011 1:29:00', '12/07/2011 1:30:00', '12/07/2011 1:31:00', '12/07/2011 1:32:00', '12/07/2011 1:33:00', '12/07/2011 1:34:00', '12/07/2011 1:35:00', '12/07/2011 1:36:00', '12/07/2011 1:37:00', '12/07/2011 1:38:00', '12/07/2011 1:39:00', '12/07/2011 1:40:00', '12/07/2011 1:41:00', '12/07/2011 1:42:00', '12/07/2011 1:43:00', '12/07/2011 1:44:00', '12/07/2011 1:45:00', '12/07/2011 1:46:00', '12/07/2011 1:47:00', '12/07/2011 1:48:00', '12/07/2011 1:49:00', '12/07/2011 1:50:00', '12/07/2011 1:51:00', '12/07/2011 1:52:00', '12/07/2011 1:53:00', '12/07/2011 1:54:00', '12/07/2011 1:55:00', '12/07/2011 
1:56:00', '12/07/2011 1:57:00', '12/07/2011 1:58:00', '12/07/2011 1:59:00', '12/07/2011 2:00:00', '12/07/2011 2:01:00', '12/07/2011 2:02:00', '12/07/2011 2:03:00', '12/07/2011 2:04:00', '12/07/2011 2:05:00', '12/07/2011 2:06:00', '12/07/2011 2:07:00', '12/07/2011 2:08:00', '12/07/2011 2:09:00', '12/07/2011 2:10:00', '12/07/2011 2:11:00', '12/07/2011 2:12:00', '12/07/2011 2:13:00', '12/07/2011 2:14:00', '12/07/2011 2:15:00', '12/07/2011 2:16:00', '12/07/2011 2:17:00', '12/07/2011 2:18:00', '12/07/2011 2:19:00', '12/07/2011 2:20:00', '12/07/2011 2:21:00', '12/07/2011 2:22:00', '12/07/2011 2:23:00', '12/07/2011 2:24:00', '12/07/2011 2:25:00', '12/07/2011 2:26:00', '12/07/2011 2:27:00', '12/07/2011 2:28:00', '12/07/2011 2:29:00', '12/07/2011 2:30:00', '12/07/2011 2:31:00', '12/07/2011 2:32:00', '12/07/2011 2:33:00', '12/07/2011 2:34:00', '12/07/2011 2:35:00', '12/07/2011 2:36:00', '12/07/2011 2:37:00', '12/07/2011 2:38:00', '12/07/2011 2:39:00', '12/07/2011 2:40:00', '12/07/2011 2:41:00', '12/07/2011 2:42:00', '12/07/2011 2:43:00', '12/07/2011 2:44:00', '12/07/2011 2:45:00', '12/07/2011 2:46:00', '12/07/2011 2:47:00', '12/07/2011 2:48:00', '12/07/2011 2:49:00', '12/07/2011 2:50:00', '12/07/2011 2:51:00', '12/07/2011 2:52:00', '12/07/2011 2:53:00', '12/07/2011 2:54:00', '12/07/2011 2:55:00', '12/07/2011 2:56:00', '12/07/2011 2:57:00', '12/07/2011 2:58:00', '12/07/2011 2:59:00', '12/07/2011 3:00:00', '12/07/2011 3:01:00', '12/07/2011 3:02:00', '12/07/2011 3:03:00', '12/07/2011 3:04:00', '12/07/2011 3:05:00', '12/07/2011 3:06:00', '12/07/2011 3:07:00', '12/07/2011 3:08:00', '12/07/2011 3:09:00', '12/07/2011 3:10:00', '12/07/2011 3:11:00', '12/07/2011 3:12:00', '12/07/2011 3:13:00', '12/07/2011 3:14:00', '12/07/2011 3:15:00', '12/07/2011 3:16:00', '12/07/2011 3:17:00', '12/07/2011 3:18:00', '12/07/2011 3:19:00', '12/07/2011 3:20:00', '12/07/2011 3:21:00', '12/07/2011 3:22:00', '12/07/2011 3:23:00', '12/07/2011 3:24:00', '12/07/2011 3:25:00', '12/07/2011 3:26:00', 
'12/07/2011 3:27:00', '12/07/2011 3:28:00', '12/07/2011 3:29:00', '12/07/2011 3:30:00', '12/07/2011 3:31:00', '12/07/2011 3:32:00', '12/07/2011 3:33:00', '12/07/2011 3:34:00', '12/07/2011 3:35:00', '12/07/2011 3:36:00', '12/07/2011 3:37:00', '12/07/2011 3:38:00', '12/07/2011 3:39:00', '12/07/2011 3:40:00', '12/07/2011 3:41:00', '12/07/2011 3:42:00', '12/07/2011 3:43:00', '12/07/2011 3:44:00', '12/07/2011 3:45:00', '12/07/2011 3:46:00', '12/07/2011 3:47:00', '12/07/2011 3:48:00', '12/07/2011 3:49:00', '12/07/2011 3:50:00', '12/07/2011 3:51:00', '12/07/2011 3:52:00', '12/07/2011 3:53:00', '12/07/2011 3:54:00', '12/07/2011 3:55:00', '12/07/2011 3:56:00', '12/07/2011 3:57:00', '12/07/2011 3:58:00', '12/07/2011 3:59:00', '12/07/2011 4:00:00', '12/07/2011 4:01:00', '12/07/2011 4:02:00', '12/07/2011 4:03:00', '12/07/2011 4:04:00', '12/07/2011 4:05:00', '12/07/2011 4:06:00', '12/07/2011 4:07:00', '12/07/2011 4:08:00', '12/07/2011 4:09:00', '12/07/2011 4:10:00', '12/07/2011 4:11:00', '12/07/2011 4:12:00', '12/07/2011 4:13:00', '12/07/2011 4:14:00', '12/07/2011 4:15:00', '12/07/2011 4:16:00', '12/07/2011 4:17:00', '12/07/2011 4:18:00', '12/07/2011 4:19:00', '12/07/2011 4:20:00', '12/07/2011 4:21:00', '12/07/2011 4:22:00', '12/07/2011 4:23:00', '12/07/2011 4:24:00', '12/07/2011 4:25:00', '12/07/2011 4:26:00', '12/07/2011 4:27:00', '12/07/2011 4:28:00', '12/07/2011 4:29:00', '12/07/2011 4:30:00', '12/07/2011 4:31:00', '12/07/2011 4:32:00', '12/07/2011 4:33:00', '12/07/2011 4:34:00', '12/07/2011 4:35:00', '12/07/2011 4:36:00', '12/07/2011 4:37:00', '12/07/2011 4:38:00', '12/07/2011 4:39:00', '12/07/2011 4:40:00', '12/07/2011 4:41:00', '12/07/2011 4:42:00', '12/07/2011 4:43:00', '12/07/2011 4:44:00', '12/07/2011 4:45:00', '12/07/2011 4:46:00', '12/07/2011 4:47:00', '12/07/2011 4:48:00', '12/07/2011 4:49:00', '12/07/2011 4:50:00', '12/07/2011 4:51:00', '12/07/2011 4:52:00', '12/07/2011 4:53:00', '12/07/2011 4:54:00', '12/07/2011 4:55:00', '12/07/2011 4:56:00', '12/07/2011 
4:57:00', '12/07/2011 4:58:00', '12/07/2011 4:59:00', '12/07/2011 5:00:00', '12/07/2011 5:01:00', '12/07/2011 5:02:00', '12/07/2011 5:03:00', '12/07/2011 5:04:00', '12/07/2011 5:05:00', '12/07/2011 5:06:00', '12/07/2011 5:07:00', '12/07/2011 5:08:00', '12/07/2011 5:09:00', '12/07/2011 5:10:00', '12/07/2011 5:11:00', '12/07/2011 5:12:00', '12/07/2011 5:13:00', '12/07/2011 5:14:00', '12/07/2011 5:15:00', '12/07/2011 5:16:00', '12/07/2011 5:17:00', '12/07/2011 5:18:00', '12/07/2011 5:19:00', '12/07/2011 5:20:00', '12/07/2011 5:21:00', '12/07/2011 5:22:00', '12/07/2011 5:23:00', '12/07/2011 5:24:00', '12/07/2011 5:25:00', '12/07/2011 5:26:00', '12/07/2011 5:27:00', '12/07/2011 5:28:00', '12/07/2011 5:29:00', '12/07/2011 5:30:00', '12/07/2011 5:31:00', '12/07/2011 5:32:00', '12/07/2011 5:33:00', '12/07/2011 5:34:00', '12/07/2011 5:35:00', '12/07/2011 5:36:00', '12/07/2011 5:37:00', '12/07/2011 5:38:00', '12/07/2011 5:39:00', '12/07/2011 5:40:00', '12/07/2011 5:41:00', '12/07/2011 5:42:00', '12/07/2011 5:43:00', '12/07/2011 5:44:00', '12/07/2011 5:45:00', '12/07/2011 5:46:00', '12/07/2011 5:47:00', '12/07/2011 5:48:00', '12/07/2011 5:49:00', '12/07/2011 5:50:00', '12/07/2011 5:51:00', '12/07/2011 5:52:00', '12/07/2011 5:53:00', '12/07/2011 5:54:00', '12/07/2011 5:55:00', '12/07/2011 5:56:00', '12/07/2011 5:57:00', '12/07/2011 5:58:00', '12/07/2011 5:59:00', '12/07/2011 6:00:00', '12/07/2011 6:01:00', '12/07/2011 6:02:00', '12/07/2011 6:03:00', '12/07/2011 6:04:00', '12/07/2011 6:05:00', '12/07/2011 6:06:00', '12/07/2011 6:07:00', '12/07/2011 6:08:00', '12/07/2011 6:09:00', '12/07/2011 6:10:00', '12/07/2011 6:11:00', '12/07/2011 6:12:00', '12/07/2011 6:13:00', '12/07/2011 6:14:00', '12/07/2011 6:15:00', '12/07/2011 6:16:00', '12/07/2011 6:17:00', '12/07/2011 6:18:00', '12/07/2011 6:19:00', '12/07/2011 6:20:00', '12/07/2011 6:21:00', '12/07/2011 6:22:00', '12/07/2011 6:23:00', '12/07/2011 6:24:00', '12/07/2011 6:25:00', '12/07/2011 6:26:00', '12/07/2011 6:27:00', 
'12/07/2011 6:28:00', '12/07/2011 6:29:00', '12/07/2011 6:30:00', '12/07/2011 6:31:00', '12/07/2011 6:32:00', '12/07/2011 6:33:00', '12/07/2011 6:34:00', '12/07/2011 6:35:00', '12/07/2011 6:36:00', '12/07/2011 6:37:00', '12/07/2011 6:38:00', '12/07/2011 6:39:00', '12/07/2011 6:40:00', '12/07/2011 6:41:00', '12/07/2011 6:42:00', '12/07/2011 6:43:00', '12/07/2011 6:44:00', '12/07/2011 6:45:00', '12/07/2011 6:46:00', '12/07/2011 6:47:00', '12/07/2011 6:48:00', '12/07/2011 6:49:00', '12/07/2011 6:50:00', '12/07/2011 6:51:00', '12/07/2011 6:52:00', '12/07/2011 6:53:00', '12/07/2011 6:54:00', '12/07/2011 6:55:00', '12/07/2011 6:56:00', '12/07/2011 6:57:00', '12/07/2011 6:58:00', '12/07/2011 6:59:00', '12/07/2011 7:00:00', '12/07/2011 7:01:00', '12/07/2011 7:02:00', '12/07/2011 7:03:00', '12/07/2011 7:04:00', '12/07/2011 7:05:00', '12/07/2011 7:06:00', '12/07/2011 7:07:00', '12/07/2011 7:08:00', '12/07/2011 7:09:00', '12/07/2011 7:10:00', '12/07/2011 7:11:00', '12/07/2011 7:12:00', '12/07/2011 7:13:00', '12/07/2011 7:14:00', '12/07/2011 7:15:00', '12/07/2011 7:16:00', '12/07/2011 7:17:00', '12/07/2011 7:18:00', '12/07/2011 7:19:00', '12/07/2011 7:20:00', '12/07/2011 7:21:00', '12/07/2011 7:22:00', '12/07/2011 7:23:00', '12/07/2011 7:24:00', '12/07/2011 7:25:00', '12/07/2011 7:26:00', '12/07/2011 7:27:00', '12/07/2011 7:28:00', '12/07/2011 7:29:00', '12/07/2011 7:30:00', '12/07/2011 7:31:00', '12/07/2011 7:32:00', '12/07/2011 7:33:00', '12/07/2011 7:34:00', '12/07/2011 7:35:00', '12/07/2011 7:36:00', '12/07/2011 7:37:00', '12/07/2011 7:38:00', '12/07/2011 7:39:00', '12/07/2011 7:40:00', '12/07/2011 7:41:00', '12/07/2011 7:42:00', '12/07/2011 7:43:00', '12/07/2011 7:44:00', '12/07/2011 7:45:00', '12/07/2011 7:46:00', '12/07/2011 7:47:00', '12/07/2011 7:48:00', '12/07/2011 7:49:00', '12/07/2011 7:50:00', '12/07/2011 7:51:00', '12/07/2011 7:52:00', '12/07/2011 7:53:00', '12/07/2011 7:54:00', '12/07/2011 7:55:00', '12/07/2011 7:56:00', '12/07/2011 7:57:00', '12/07/2011 
7:58:00', '12/07/2011 7:59:00', '12/07/2011 8:00:00', '12/07/2011 8:01:00', '12/07/2011 8:02:00', '12/07/2011 8:03:00', '12/07/2011 8:04:00', '12/07/2011 8:05:00', '12/07/2011 8:06:00', '12/07/2011 8:07:00', '12/07/2011 8:08:00', '12/07/2011 8:09:00', '12/07/2011 8:10:00', '12/07/2011 8:11:00', '12/07/2011 8:12:00', '12/07/2011 8:13:00', '12/07/2011 8:14:00', '12/07/2011 8:15:00', '12/07/2011 8:16:00', '12/07/2011 8:17:00', '12/07/2011 8:18:00', '12/07/2011 8:19:00', '12/07/2011 8:20:00', '12/07/2011 8:21:00', '12/07/2011 8:22:00', '12/07/2011 8:23:00', '12/07/2011 8:24:00', '12/07/2011 8:25:00', '12/07/2011 8:26:00', '12/07/2011 8:27:00', '12/07/2011 8:28:00', '12/07/2011 8:29:00', '12/07/2011 8:30:00', '12/07/2011 8:31:00', '12/07/2011 8:32:00', '12/07/2011 8:33:00', '12/07/2011 8:34:00', '12/07/2011 8:35:00', '12/07/2011 8:36:00', '12/07/2011 8:37:00', '12/07/2011 8:38:00', '12/07/2011 8:39:00', '12/07/2011 8:40:00', '12/07/2011 8:41:00', '12/07/2011 8:42:00', '12/07/2011 8:43:00', '12/07/2011 8:44:00', '12/07/2011 8:45:00', '12/07/2011 8:46:00', '12/07/2011 8:47:00', '12/07/2011 8:48:00', '12/07/2011 8:49:00', '12/07/2011 8:50:00', '12/07/2011 8:51:00', '12/07/2011 8:52:00', '12/07/2011 8:53:00', '12/07/2011 8:54:00', '12/07/2011 8:55:00', '12/07/2011 8:56:00', '12/07/2011 8:57:00', '12/07/2011 8:58:00', '12/07/2011 8:59:00', '12/07/2011 9:00:00', '12/07/2011 9:01:00', '12/07/2011 9:02:00', '12/07/2011 9:03:00', '12/07/2011 9:04:00', '12/07/2011 9:05:00', '12/07/2011 9:06:00', '12/07/2011 9:07:00', '12/07/2011 9:08:00', '12/07/2011 9:09:00', '12/07/2011 9:10:00', '12/07/2011 9:11:00', '12/07/2011 9:12:00', '12/07/2011 9:13:00', '12/07/2011 9:14:00', '12/07/2011 9:15:00', '12/07/2011 9:16:00', '12/07/2011 9:17:00', '12/07/2011 9:18:00', '12/07/2011 9:19:00', '12/07/2011 9:20:00', '12/07/2011 9:21:00', '12/07/2011 9:22:00', '12/07/2011 9:23:00', '12/07/2011 9:24:00', '12/07/2011 9:25:00', '12/07/2011 9:26:00', '12/07/2011 9:27:00', '12/07/2011 9:28:00', 
'12/07/2011 9:29:00', '12/07/2011 9:30:00', '12/07/2011 9:31:00', '12/07/2011 9:32:00', '12/07/2011 9:33:00', '12/07/2011 9:34:00', '12/07/2011 9:35:00', '12/07/2011 9:36:00', '12/07/2011 9:37:00', '12/07/2011 9:38:00', '12/07/2011 9:39:00', '12/07/2011 9:40:00', '12/07/2011 9:41:00', '12/07/2011 9:42:00', '12/07/2011 9:43:00', '12/07/2011 9:44:00', '12/07/2011 9:45:00', '12/07/2011 9:46:00', '12/07/2011 9:47:00', '12/07/2011 9:48:00', '12/07/2011 9:49:00', '12/07/2011 9:50:00', '12/07/2011 9:51:00', '12/07/2011 9:52:00', '12/07/2011 9:53:00', '12/07/2011 9:54:00', '12/07/2011 9:55:00', '12/07/2011 9:56:00', '12/07/2011 9:57:00', '12/07/2011 9:58:00', '12/07/2011 9:59:00', '12/07/2011 10:00:00', '12/07/2011 10:01:00', '12/07/2011 10:02:00', '12/07/2011 10:03:00', '12/07/2011 10:04:00', '12/07/2011 10:05:00', '12/07/2011 10:06:00', '12/07/2011 10:07:00', '12/07/2011 10:08:00', '12/07/2011 10:09:00', '12/07/2011 10:10:00', '12/07/2011 10:11:00', '12/07/2011 10:12:00', '12/07/2011 10:13:00', '12/07/2011 10:14:00', '12/07/2011 10:15:00', '12/07/2011 10:16:00', '12/07/2011 10:17:00', '12/07/2011 10:18:00', '12/07/2011 10:19:00', '12/07/2011 10:20:00', '12/07/2011 10:21:00', '12/07/2011 10:22:00', '12/07/2011 10:23:00', '12/07/2011 10:24:00', '12/07/2011 10:25:00', '12/07/2011 10:26:00', '12/07/2011 10:27:00', '12/07/2011 10:28:00', '12/07/2011 10:29:00', '12/07/2011 10:30:00', '12/07/2011 10:31:00', '12/07/2011 10:32:00', '12/07/2011 10:33:00', '12/07/2011 10:34:00', '12/07/2011 10:35:00', '12/07/2011 10:36:00', '12/07/2011 10:37:00', '12/07/2011 10:38:00', '12/07/2011 10:39:00', '12/07/2011 10:40:00', '12/07/2011 10:41:00', '12/07/2011 10:42:00', '12/07/2011 10:43:00', '12/07/2011 10:44:00', '12/07/2011 10:45:00', '12/07/2011 10:46:00', '12/07/2011 10:47:00', '12/07/2011 10:48:00', '12/07/2011 10:49:00', '12/07/2011 10:50:00', '12/07/2011 10:51:00', '12/07/2011 10:52:00', '12/07/2011 10:53:00', '12/07/2011 10:54:00', '12/07/2011 10:55:00', '12/07/2011 10:56:00', 
'12/07/2011 10:57:00', '12/07/2011 10:58:00', '12/07/2011 10:59:00', '12/07/2011 11:00:00', '12/07/2011 11:01:00', '12/07/2011 11:02:00', '12/07/2011 11:03:00', '12/07/2011 11:04:00', '12/07/2011 11:05:00', '12/07/2011 11:06:00', '12/07/2011 11:07:00', '12/07/2011 11:08:00', '12/07/2011 11:09:00', '12/07/2011 11:10:00', '12/07/2011 11:11:00', '12/07/2011 11:12:00', '12/07/2011 11:13:00', '12/07/2011 11:14:00', '12/07/2011 11:15:00', '12/07/2011 11:16:00', '12/07/2011 11:17:00', '12/07/2011 11:18:00', '12/07/2011 11:19:00', '12/07/2011 11:20:00', '12/07/2011 11:21:00', '12/07/2011 11:22:00', '12/07/2011 11:23:00', '12/07/2011 11:24:00', '12/07/2011 11:25:00', '12/07/2011 11:26:00', '12/07/2011 11:27:00', '12/07/2011 11:28:00', '12/07/2011 11:29:00', '12/07/2011 11:30:00', '12/07/2011 11:31:00', '12/07/2011 11:32:00', '12/07/2011 11:33:00', '12/07/2011 11:34:00', '12/07/2011 11:35:00', '12/07/2011 11:36:00', '12/07/2011 11:37:00', '12/07/2011 11:38:00', '12/07/2011 11:39:00', '12/07/2011 11:40:00', '12/07/2011 11:41:00', '12/07/2011 11:42:00', '12/07/2011 11:43:00', '12/07/2011 11:44:00', '12/07/2011 11:45:00', '12/07/2011 11:46:00', '12/07/2011 11:47:00', '12/07/2011 11:48:00', '12/07/2011 11:49:00', '12/07/2011 11:50:00', '12/07/2011 11:51:00', '12/07/2011 11:52:00', '12/07/2011 11:53:00', '12/07/2011 11:54:00', '12/07/2011 11:55:00', '12/07/2011 11:56:00', '12/07/2011 11:57:00', '12/07/2011 11:58:00', '12/07/2011 11:59:00', '12/07/2011 12:00:00', '12/07/2011 12:01:00', '12/07/2011 12:02:00', '12/07/2011 12:03:00', '12/07/2011 12:04:00', '12/07/2011 12:05:00', '12/07/2011 12:06:00', '12/07/2011 12:07:00', '12/07/2011 12:08:00', '12/07/2011 12:09:00', '12/07/2011 12:10:00', '12/07/2011 12:11:00', '12/07/2011 12:12:00', '12/07/2011 12:13:00', '12/07/2011 12:14:00', '12/07/2011 12:15:00', '12/07/2011 12:16:00', '12/07/2011 12:17:00', '12/07/2011 12:18:00', '12/07/2011 12:19:00', '12/07/2011 12:20:00', '12/07/2011 12:21:00', '12/07/2011 12:22:00', '12/07/2011 
12:23:00', '12/07/2011 12:24:00', '12/07/2011 12:25:00', '12/07/2011 12:26:00', '12/07/2011 12:27:00', '12/07/2011 12:28:00', '12/07/2011 12:29:00', '12/07/2011 12:30:00', '12/07/2011 12:31:00', '12/07/2011 12:32:00', '12/07/2011 12:33:00', '12/07/2011 12:34:00', '12/07/2011 12:35:00', '12/07/2011 12:36:00', '12/07/2011 12:37:00', '12/07/2011 12:38:00', '12/07/2011 12:39:00', '12/07/2011 12:40:00', '12/07/2011 12:41:00', '12/07/2011 12:42:00', '12/07/2011 12:43:00', '12/07/2011 12:44:00', '12/07/2011 12:45:00', '12/07/2011 12:46:00', '12/07/2011 12:47:00', '12/07/2011 12:48:00', '12/07/2011 12:49:00', '12/07/2011 12:50:00', '12/07/2011 12:51:00', '12/07/2011 12:52:00', '12/07/2011 12:53:00', '12/07/2011 12:54:00', '12/07/2011 12:55:00', '12/07/2011 12:56:00', '12/07/2011 12:57:00', '12/07/2011 12:58:00', '12/07/2011 12:59:00', '12/07/2011 13:00:00', '12/07/2011 13:01:00', '12/07/2011 13:02:00', '12/07/2011 13:03:00', '12/07/2011 13:04:00', '12/07/2011 13:05:00', '12/07/2011 13:06:00', '12/07/2011 13:07:00', '12/07/2011 13:08:00', '12/07/2011 13:09:00', '12/07/2011 13:10:00', '12/07/2011 13:11:00', '12/07/2011 13:12:00', '12/07/2011 13:13:00', '12/07/2011 13:14:00', '12/07/2011 13:15:00', '12/07/2011 13:16:00', '12/07/2011 13:17:00', '12/07/2011 13:18:00', '12/07/2011 13:19:00', '12/07/2011 13:20:00', '12/07/2011 13:21:00', '12/07/2011 13:22:00', '12/07/2011 13:23:00', '12/07/2011 13:24:00', '12/07/2011 13:25:00', '12/07/2011 13:26:00', '12/07/2011 13:27:00', '12/07/2011 13:28:00', '12/07/2011 13:29:00', '12/07/2011 13:30:00', '12/07/2011 13:31:00', '12/07/2011 13:32:00', '12/07/2011 13:33:00', '12/07/2011 13:34:00', '12/07/2011 13:35:00', '12/07/2011 13:36:00', '12/07/2011 13:37:00', '12/07/2011 13:38:00', '12/07/2011 13:39:00', '12/07/2011 13:40:00', '12/07/2011 13:41:00', '12/07/2011 13:42:00', '12/07/2011 13:43:00', '12/07/2011 13:44:00', '12/07/2011 13:45:00', '12/07/2011 13:46:00', '12/07/2011 13:47:00', '12/07/2011 13:48:00', '12/07/2011 13:49:00', 
'12/07/2011 13:50:00', '12/07/2011 13:51:00', '12/07/2011 13:52:00', '12/07/2011 13:53:00', '12/07/2011 13:54:00', '12/07/2011 13:55:00', '12/07/2011 13:56:00', '12/07/2011 13:57:00', '12/07/2011 13:58:00', '12/07/2011 13:59:00', '12/07/2011 14:00:00', '12/07/2011 14:01:00', '12/07/2011 14:02:00', '12/07/2011 14:03:00', '12/07/2011 14:04:00', '12/07/2011 14:05:00', '12/07/2011 14:06:00', '12/07/2011 14:07:00', '12/07/2011 14:08:00', '12/07/2011 14:09:00', '12/07/2011 14:10:00', '12/07/2011 14:11:00', '12/07/2011 14:12:00', '12/07/2011 14:13:00', '12/07/2011 14:14:00', '12/07/2011 14:15:00', '12/07/2011 14:16:00', '12/07/2011 14:17:00', '12/07/2011 14:18:00', '12/07/2011 14:19:00', '12/07/2011 14:20:00', '12/07/2011 14:21:00', '12/07/2011 14:22:00', '12/07/2011 14:23:00', '12/07/2011 14:24:00', '12/07/2011 14:25:00', '12/07/2011 14:26:00', '12/07/2011 14:27:00', '12/07/2011 14:28:00', '12/07/2011 14:29:00', '12/07/2011 14:30:00', '12/07/2011 14:31:00', '12/07/2011 14:32:00', '12/07/2011 14:33:00', '12/07/2011 14:34:00', '12/07/2011 14:35:00', '12/07/2011 14:36:00', '12/07/2011 14:37:00', '12/07/2011 14:38:00', '12/07/2011 14:39:00', '12/07/2011 14:40:00', '12/07/2011 14:41:00', '12/07/2011 14:42:00', '12/07/2011 14:43:00', '12/07/2011 14:44:00', '12/07/2011 14:45:00', '12/07/2011 14:46:00', '12/07/2011 14:47:00', '12/07/2011 14:48:00', '12/07/2011 14:49:00', '12/07/2011 14:50:00', '12/07/2011 14:51:00', '12/07/2011 14:52:00', '12/07/2011 14:53:00', '12/07/2011 14:54:00', '12/07/2011 14:55:00', '12/07/2011 14:56:00', '12/07/2011 14:57:00', '12/07/2011 14:58:00', '12/07/2011 14:59:00', '12/07/2011 15:00:00', '12/07/2011 15:01:00', '12/07/2011 15:02:00', '12/07/2011 15:03:00', '12/07/2011 15:04:00', '12/07/2011 15:05:00', '12/07/2011 15:06:00', '12/07/2011 15:07:00', '12/07/2011 15:08:00', '12/07/2011 15:09:00', '12/07/2011 15:10:00', '12/07/2011 15:11:00', '12/07/2011 15:12:00', '12/07/2011 15:13:00', '12/07/2011 15:14:00', '12/07/2011 15:15:00', '12/07/2011 
15:16:00', '12/07/2011 15:17:00', '12/07/2011 15:18:00', '12/07/2011 15:19:00', '12/07/2011 15:20:00', '12/07/2011 15:21:00', '12/07/2011 15:22:00', '12/07/2011 15:23:00', '12/07/2011 15:24:00', '12/07/2011 15:25:00', '12/07/2011 15:26:00', '12/07/2011 15:27:00', '12/07/2011 15:28:00', '12/07/2011 15:29:00', '12/07/2011 15:30:00', '12/07/2011 15:31:00', '12/07/2011 15:32:00', '12/07/2011 15:33:00', '12/07/2011 15:34:00', '12/07/2011 15:35:00', '12/07/2011 15:36:00', '12/07/2011 15:37:00', '12/07/2011 15:38:00', '12/07/2011 15:39:00', '12/07/2011 15:40:00', '12/07/2011 15:41:00', '12/07/2011 15:42:00', '12/07/2011 15:43:00', '12/07/2011 15:44:00', '12/07/2011 15:45:00', '12/07/2011 15:46:00', '12/07/2011 15:47:00', '12/07/2011 15:48:00', '12/07/2011 15:49:00', '12/07/2011 15:50:00', '12/07/2011 15:51:00', '12/07/2011 15:52:00', '12/07/2011 15:53:00', '12/07/2011 15:54:00', '12/07/2011 15:55:00', '12/07/2011 15:56:00', '12/07/2011 15:57:00', '12/07/2011 15:58:00', '12/07/2011 15:59:00', '12/07/2011 16:00:00', '12/07/2011 16:01:00', '12/07/2011 16:02:00', '12/07/2011 16:03:00', '12/07/2011 16:04:00', '12/07/2011 16:05:00', '12/07/2011 16:06:00', '12/07/2011 16:07:00', '12/07/2011 16:08:00', '12/07/2011 16:09:00', '12/07/2011 16:10:00', '12/07/2011 16:11:00', '12/07/2011 16:12:00', '12/07/2011 16:13:00', '12/07/2011 16:14:00', '12/07/2011 16:15:00', '12/07/2011 16:16:00', '12/07/2011 16:17:00', '12/07/2011 16:18:00', '12/07/2011 16:19:00', '12/07/2011 16:20:00', '12/07/2011 16:21:00', '12/07/2011 16:22:00', '12/07/2011 16:23:00', '12/07/2011 16:24:00', '12/07/2011 16:25:00', '12/07/2011 16:26:00', '12/07/2011 16:27:00', '12/07/2011 16:28:00', '12/07/2011 16:29:00', '12/07/2011 16:30:00', '12/07/2011 16:31:00', '12/07/2011 16:32:00', '12/07/2011 16:33:00', '12/07/2011 16:34:00', '12/07/2011 16:35:00', '12/07/2011 16:36:00', '12/07/2011 16:37:00', '12/07/2011 16:38:00', '12/07/2011 16:39:00', '12/07/2011 16:40:00', '12/07/2011 16:41:00', '12/07/2011 16:42:00', 
'12/07/2011 16:43:00', '12/07/2011 16:44:00', '12/07/2011 16:45:00', '12/07/2011 16:46:00', '12/07/2011 16:47:00', '12/07/2011 16:48:00', '12/07/2011 16:49:00', '12/07/2011 16:50:00', '12/07/2011 16:51:00', '12/07/2011 16:52:00', '12/07/2011 16:53:00', '12/07/2011 16:54:00', '12/07/2011 16:55:00', '12/07/2011 16:56:00', '12/07/2011 16:57:00', '12/07/2011 16:58:00', '12/07/2011 16:59:00', '12/07/2011 17:00:00', '12/07/2011 17:01:00', '12/07/2011 17:02:00', '12/07/2011 17:03:00', '12/07/2011 17:04:00', '12/07/2011 17:05:00', '12/07/2011 17:06:00', '12/07/2011 17:07:00', '12/07/2011 17:08:00', '12/07/2011 17:09:00', '12/07/2011 17:10:00', '12/07/2011 17:11:00', '12/07/2011 17:12:00', '12/07/2011 17:13:00', '12/07/2011 17:14:00', '12/07/2011 17:15:00', '12/07/2011 17:16:00', '12/07/2011 17:17:00', '12/07/2011 17:18:00', '12/07/2011 17:19:00', '12/07/2011 17:20:00', '12/07/2011 17:21:00', '12/07/2011 17:22:00', '12/07/2011 17:23:00', '12/07/2011 17:24:00', '12/07/2011 17:25:00', '12/07/2011 17:26:00', '12/07/2011 17:27:00', '12/07/2011 17:28:00', '12/07/2011 17:29:00', '12/07/2011 17:30:00', '12/07/2011 17:31:00', '12/07/2011 17:32:00', '12/07/2011 17:33:00', '12/07/2011 17:34:00', '12/07/2011 17:35:00', '12/07/2011 17:36:00', '12/07/2011 17:37:00', '12/07/2011 17:38:00', '12/07/2011 17:39:00', '12/07/2011 17:40:00', '12/07/2011 17:41:00', '12/07/2011 17:42:00', '12/07/2011 17:43:00', '12/07/2011 17:44:00', '12/07/2011 17:45:00', '12/07/2011 17:46:00', '12/07/2011 17:47:00', '12/07/2011 17:48:00', '12/07/2011 17:49:00', '12/07/2011 17:50:00', '12/07/2011 17:51:00', '12/07/2011 17:52:00', '12/07/2011 17:53:00', '12/07/2011 17:54:00', '12/07/2011 17:55:00', '12/07/2011 17:56:00', '12/07/2011 17:57:00', '12/07/2011 17:58:00', '12/07/2011 17:59:00', '12/07/2011 18:00:00', '12/07/2011 18:01:00', '12/07/2011 18:02:00', '12/07/2011 18:03:00', '12/07/2011 18:04:00', '12/07/2011 18:05:00', '12/07/2011 18:06:00', '12/07/2011 18:07:00', '12/07/2011 18:08:00', '12/07/2011 
18:09:00', '12/07/2011 18:10:00', '12/07/2011 18:11:00', '12/07/2011 18:12:00', '12/07/2011 18:13:00', '12/07/2011 18:14:00', '12/07/2011 18:15:00', '12/07/2011 18:16:00', '12/07/2011 18:17:00', '12/07/2011 18:18:00', '12/07/2011 18:19:00', '12/07/2011 18:20:00', '12/07/2011 18:21:00', '12/07/2011 18:22:00', '12/07/2011 18:23:00', '12/07/2011 18:24:00', '12/07/2011 18:25:00', '12/07/2011 18:26:00', '12/07/2011 18:27:00', '12/07/2011 18:28:00', '12/07/2011 18:29:00', '12/07/2011 18:30:00', '12/07/2011 18:31:00', '12/07/2011 18:32:00', '12/07/2011 18:33:00', '12/07/2011 18:34:00', '12/07/2011 18:35:00', '12/07/2011 18:36:00', '12/07/2011 18:37:00', '12/07/2011 18:38:00', '12/07/2011 18:39:00', '12/07/2011 18:40:00', '12/07/2011 18:41:00', '12/07/2011 18:42:00', '12/07/2011 18:43:00', '12/07/2011 18:44:00', '12/07/2011 18:45:00', '12/07/2011 18:46:00', '12/07/2011 18:47:00', '12/07/2011 18:48:00', '12/07/2011 18:49:00', '12/07/2011 18:50:00', '12/07/2011 18:51:00', '12/07/2011 18:52:00', '12/07/2011 18:53:00', '12/07/2011 18:54:00', '12/07/2011 18:55:00', '12/07/2011 18:56:00', '12/07/2011 18:57:00', '12/07/2011 18:58:00', '12/07/2011 18:59:00', '12/07/2011 19:00:00', '12/07/2011 19:01:00', '12/07/2011 19:02:00', '12/07/2011 19:03:00', '12/07/2011 19:04:00', '12/07/2011 19:05:00', '12/07/2011 19:06:00', '12/07/2011 19:07:00', '12/07/2011 19:08:00', '12/07/2011 19:09:00', '12/07/2011 19:10:00', '12/07/2011 19:11:00', '12/07/2011 19:12:00', '12/07/2011 19:13:00', '12/07/2011 19:14:00', '12/07/2011 19:15:00', '12/07/2011 19:16:00', '12/07/2011 19:17:00', '12/07/2011 19:18:00', '12/07/2011 19:19:00', '12/07/2011 19:20:00', '12/07/2011 19:21:00', '12/07/2011 19:22:00', '12/07/2011 19:23:00', '12/07/2011 19:24:00', '12/07/2011 19:25:00', '12/07/2011 19:26:00', '12/07/2011 19:27:00', '12/07/2011 19:28:00', '12/07/2011 19:29:00', '12/07/2011 19:30:00', '12/07/2011 19:31:00', '12/07/2011 19:32:00', '12/07/2011 19:33:00', '12/07/2011 19:34:00', '12/07/2011 19:35:00', 
'12/07/2011 19:36:00', '12/07/2011 19:37:00', '12/07/2011 19:38:00', '12/07/2011 19:39:00', '12/07/2011 19:40:00', '12/07/2011 19:41:00', '12/07/2011 19:42:00', '12/07/2011 19:43:00', '12/07/2011 19:44:00', '12/07/2011 19:45:00', '12/07/2011 19:46:00', '12/07/2011 19:47:00', '12/07/2011 19:48:00', '12/07/2011 19:49:00', '12/07/2011 19:50:00', '12/07/2011 19:51:00', '12/07/2011 19:52:00', '12/07/2011 19:53:00', '12/07/2011 19:54:00', '12/07/2011 19:55:00', '12/07/2011 19:56:00', '12/07/2011 19:57:00', '12/07/2011 19:58:00', '12/07/2011 19:59:00', '12/07/2011 20:00:00', '12/07/2011 20:01:00', '12/07/2011 20:02:00', '12/07/2011 20:03:00', '12/07/2011 20:04:00', '12/07/2011 20:05:00', '12/07/2011 20:06:00', '12/07/2011 20:07:00', '12/07/2011 20:08:00', '12/07/2011 20:09:00', '12/07/2011 20:10:00', '12/07/2011 20:11:00', '12/07/2011 20:12:00', '12/07/2011 20:13:00', '12/07/2011 20:14:00', '12/07/2011 20:15:00', '12/07/2011 20:16:00', '12/07/2011 20:17:00', '12/07/2011 20:18:00', '12/07/2011 20:19:00', '12/07/2011 20:20:00', '12/07/2011 20:21:00', '12/07/2011 20:22:00', '12/07/2011 20:23:00', '12/07/2011 20:24:00', '12/07/2011 20:25:00', '12/07/2011 20:26:00', '12/07/2011 20:27:00', '12/07/2011 20:28:00', '12/07/2011 20:29:00', '12/07/2011 20:30:00', '12/07/2011 20:31:00', '12/07/2011 20:32:00', '12/07/2011 20:33:00', '12/07/2011 20:34:00', '12/07/2011 20:35:00', '12/07/2011 20:36:00', '12/07/2011 20:37:00', '12/07/2011 20:38:00', '12/07/2011 20:39:00', '12/07/2011 20:40:00', '12/07/2011 20:41:00', '12/07/2011 20:42:00', '12/07/2011 20:43:00', '12/07/2011 20:44:00', '12/07/2011 20:45:00', '12/07/2011 20:46:00', '12/07/2011 20:47:00', '12/07/2011 20:48:00', '12/07/2011 20:49:00', '12/07/2011 20:50:00', '12/07/2011 20:51:00', '12/07/2011 20:52:00', '12/07/2011 20:53:00', '12/07/2011 20:54:00', '12/07/2011 20:55:00', '12/07/2011 20:56:00', '12/07/2011 20:57:00', '12/07/2011 20:58:00', '12/07/2011 20:59:00', '12/07/2011 21:00:00', '12/07/2011 21:01:00', '12/07/2011 
21:02:00', '12/07/2011 21:03:00', '12/07/2011 21:04:00', '12/07/2011 21:05:00', '12/07/2011 21:06:00', '12/07/2011 21:07:00', '12/07/2011 21:08:00', '12/07/2011 21:09:00', '12/07/2011 21:10:00', '12/07/2011 21:11:00', '12/07/2011 21:12:00', '12/07/2011 21:13:00', '12/07/2011 21:14:00', '12/07/2011 21:15:00', '12/07/2011 21:16:00', '12/07/2011 21:17:00', '12/07/2011 21:18:00', '12/07/2011 21:19:00', '12/07/2011 21:20:00', '12/07/2011 21:21:00', '12/07/2011 21:22:00', '12/07/2011 21:23:00', '12/07/2011 21:24:00', '12/07/2011 21:25:00', '12/07/2011 21:26:00', '12/07/2011 21:27:00', '12/07/2011 21:28:00', '12/07/2011 21:29:00', '12/07/2011 21:30:00', '12/07/2011 21:31:00', '12/07/2011 21:32:00', '12/07/2011 21:33:00', '12/07/2011 21:34:00', '12/07/2011 21:35:00', '12/07/2011 21:36:00', '12/07/2011 21:37:00', '12/07/2011 21:38:00', '12/07/2011 21:39:00', '12/07/2011 21:40:00', '12/07/2011 21:41:00', '12/07/2011 21:42:00', '12/07/2011 21:43:00', '12/07/2011 21:44:00', '12/07/2011 21:45:00', '12/07/2011 21:46:00', '12/07/2011 21:47:00', '12/07/2011 21:48:00', '12/07/2011 21:49:00', '12/07/2011 21:50:00', '12/07/2011 21:51:00', '12/07/2011 21:52:00', '12/07/2011 21:53:00', '12/07/2011 21:54:00', '12/07/2011 21:55:00', '12/07/2011 21:56:00', '12/07/2011 21:57:00', '12/07/2011 21:58:00', '12/07/2011 21:59:00', '12/07/2011 22:00:00', '12/07/2011 22:01:00', '12/07/2011 22:02:00', '12/07/2011 22:03:00', '12/07/2011 22:04:00', '12/07/2011 22:05:00', '12/07/2011 22:06:00', '12/07/2011 22:07:00', '12/07/2011 22:08:00', '12/07/2011 22:09:00', '12/07/2011 22:10:00', '12/07/2011 22:11:00', '12/07/2011 22:12:00', '12/07/2011 22:13:00', '12/07/2011 22:14:00', '12/07/2011 22:15:00', '12/07/2011 22:16:00', '12/07/2011 22:17:00', '12/07/2011 22:18:00', '12/07/2011 22:19:00', '12/07/2011 22:20:00', '12/07/2011 22:21:00', '12/07/2011 22:22:00', '12/07/2011 22:23:00', '12/07/2011 22:24:00', '12/07/2011 22:25:00', '12/07/2011 22:26:00', '12/07/2011 22:27:00', '12/07/2011 22:28:00', 
'12/07/2011 22:29:00', '12/07/2011 22:30:00', '12/07/2011 22:31:00', '12/07/2011 22:32:00', '12/07/2011 22:33:00', '12/07/2011 22:34:00', '12/07/2011 22:35:00', '12/07/2011 22:36:00', '12/07/2011 22:37:00', '12/07/2011 22:38:00', '12/07/2011 22:39:00', '12/07/2011 22:40:00', '12/07/2011 22:41:00', '12/07/2011 22:42:00', '12/07/2011 22:43:00', '12/07/2011 22:44:00', '12/07/2011 22:45:00', '12/07/2011 22:46:00', '12/07/2011 22:47:00', '12/07/2011 22:48:00', '12/07/2011 22:49:00', '12/07/2011 22:50:00', '12/07/2011 22:51:00', '12/07/2011 22:52:00', '12/07/2011 22:53:00', '12/07/2011 22:54:00', '12/07/2011 22:55:00', '12/07/2011 22:56:00', '12/07/2011 22:57:00', '12/07/2011 22:58:00', '12/07/2011 22:59:00', '12/07/2011 23:00:00', '12/07/2011 23:01:00', '12/07/2011 23:02:00', '12/07/2011 23:03:00', '12/07/2011 23:04:00', '12/07/2011 23:05:00', '12/07/2011 23:06:00', '12/07/2011 23:07:00', '12/07/2011 23:08:00', '12/07/2011 23:09:00', '12/07/2011 23:10:00', '12/07/2011 23:11:00', '12/07/2011 23:12:00', '12/07/2011 23:13:00', '12/07/2011 23:14:00', '12/07/2011 23:15:00', '12/07/2011 23:16:00', '12/07/2011 23:17:00', '12/07/2011 23:18:00', '12/07/2011 23:19:00', '12/07/2011 23:20:00', '12/07/2011 23:21:00', '12/07/2011 23:22:00', '12/07/2011 23:23:00', '12/07/2011 23:24:00', '12/07/2011 23:25:00', '12/07/2011 23:26:00', '12/07/2011 23:27:00', '12/07/2011 23:28:00', '12/07/2011 23:29:00', '12/07/2011 23:30:00', '12/07/2011 23:31:00', '12/07/2011 23:32:00', '12/07/2011 23:33:00', '12/07/2011 23:34:00', '12/07/2011 23:35:00', '12/07/2011 23:36:00', '12/07/2011 23:37:00', '12/07/2011 23:38:00', '12/07/2011 23:39:00', '12/07/2011 23:40:00', '12/07/2011 23:41:00', '12/07/2011 23:42:00', '12/07/2011 23:43:00', '12/07/2011 23:44:00', '12/07/2011 23:45:00', '12/07/2011 23:46:00', '12/07/2011 23:47:00', '12/07/2011 23:48:00', '12/07/2011 23:49:00', '12/07/2011 23:50:00', '12/07/2011 23:51:00', '12/07/2011 23:52:00', '12/07/2011 23:53:00', '12/07/2011 23:54:00', '12/07/2011 
23:55:00', '12/07/2011 23:56:00', '12/07/2011 23:57:00', '12/07/2011 23:58:00', '12/07/2011 23:59:00', '13/07/2011 0:00:00', '13/07/2011 0:01:00', '13/07/2011 0:02:00', '13/07/2011 0:03:00', '13/07/2011 0:04:00', '13/07/2011 0:05:00', '13/07/2011 0:06:00', '13/07/2011 0:07:00', '13/07/2011 0:08:00', '13/07/2011 0:09:00', '13/07/2011 0:10:00', '13/07/2011 0:11:00', '13/07/2011 0:12:00', '13/07/2011 0:13:00', '13/07/2011 0:14:00', '13/07/2011 0:15:00', '13/07/2011 0:16:00', '13/07/2011 0:17:00', '13/07/2011 0:18:00', '13/07/2011 0:19:00', '13/07/2011 0:20:00', '13/07/2011 0:21:00', '13/07/2011 0:22:00', '13/07/2011 0:23:00', '13/07/2011 0:24:00', '13/07/2011 0:25:00', '13/07/2011 0:26:00', '13/07/2011 0:27:00', '13/07/2011 0:28:00', '13/07/2011 0:29:00', '13/07/2011 0:30:00', '13/07/2011 0:31:00', '13/07/2011 0:32:00', '13/07/2011 0:33:00', '13/07/2011 0:34:00', '13/07/2011 0:35:00', '13/07/2011 0:36:00', '13/07/2011 0:37:00', '13/07/2011 0:38:00', '13/07/2011 0:39:00', '13/07/2011 0:40:00', '13/07/2011 0:41:00', '13/07/2011 0:42:00', '13/07/2011 0:43:00', '13/07/2011 0:44:00', '13/07/2011 0:45:00', '13/07/2011 0:46:00', '13/07/2011 0:47:00', '13/07/2011 0:48:00', '13/07/2011 0:49:00', '13/07/2011 0:50:00', '13/07/2011 0:51:00', '13/07/2011 0:52:00', '13/07/2011 0:53:00', '13/07/2011 0:54:00', '13/07/2011 0:55:00', '13/07/2011 0:56:00', '13/07/2011 0:57:00', '13/07/2011 0:58:00', '13/07/2011 0:59:00', '13/07/2011 1:00:00', '13/07/2011 1:01:00', '13/07/2011 1:02:00', '13/07/2011 1:03:00', '13/07/2011 1:04:00', '13/07/2011 1:05:00', '13/07/2011 1:06:00', '13/07/2011 1:07:00', '13/07/2011 1:08:00', '13/07/2011 1:09:00', '13/07/2011 1:10:00', '13/07/2011 1:11:00', '13/07/2011 1:12:00', '13/07/2011 1:13:00', '13/07/2011 1:14:00', '13/07/2011 1:15:00', '13/07/2011 1:16:00', '13/07/2011 1:17:00', '13/07/2011 1:18:00', '13/07/2011 1:19:00', '13/07/2011 1:20:00', '13/07/2011 1:21:00', '13/07/2011 1:22:00', '13/07/2011 1:23:00', '13/07/2011 1:24:00', '13/07/2011 1:25:00', 
'13/07/2011 1:26:00', '13/07/2011 1:27:00', '13/07/2011 1:28:00', '13/07/2011 1:29:00', '13/07/2011 1:30:00', '13/07/2011 1:31:00', '13/07/2011 1:32:00', '13/07/2011 1:33:00', '13/07/2011 1:34:00', '13/07/2011 1:35:00', '13/07/2011 1:36:00', '13/07/2011 1:37:00', '13/07/2011 1:38:00', '13/07/2011 1:39:00', '13/07/2011 1:40:00', '13/07/2011 1:41:00', '13/07/2011 1:42:00', '13/07/2011 1:43:00', '13/07/2011 1:44:00', '13/07/2011 1:45:00', '13/07/2011 1:46:00', '13/07/2011 1:47:00', '13/07/2011 1:48:00', '13/07/2011 1:49:00', '13/07/2011 1:50:00', '13/07/2011 1:51:00', '13/07/2011 1:52:00', '13/07/2011 1:53:00', '13/07/2011 1:54:00', '13/07/2011 1:55:00', '13/07/2011 1:56:00', '13/07/2011 1:57:00', '13/07/2011 1:58:00', '13/07/2011 1:59:00', '13/07/2011 2:00:00', '13/07/2011 2:01:00', '13/07/2011 2:02:00', '13/07/2011 2:03:00', '13/07/2011 2:04:00', '13/07/2011 2:05:00', '13/07/2011 2:06:00', '13/07/2011 2:07:00', '13/07/2011 2:08:00', '13/07/2011 2:09:00', '13/07/2011 2:10:00', '13/07/2011 2:11:00', '13/07/2011 2:12:00', '13/07/2011 2:13:00', '13/07/2011 2:14:00', '13/07/2011 2:15:00', '13/07/2011 2:16:00', '13/07/2011 2:17:00', '13/07/2011 2:18:00', '13/07/2011 2:19:00', '13/07/2011 2:20:00', '13/07/2011 2:21:00', '13/07/2011 2:22:00', '13/07/2011 2:23:00', '13/07/2011 2:24:00', '13/07/2011 2:25:00', '13/07/2011 2:26:00', '13/07/2011 2:27:00', '13/07/2011 2:28:00', '13/07/2011 2:29:00', '13/07/2011 2:30:00', '13/07/2011 2:31:00', '13/07/2011 2:32:00', '13/07/2011 2:33:00', '13/07/2011 2:34:00', '13/07/2011 2:35:00', '13/07/2011 2:36:00', '13/07/2011 2:37:00', '13/07/2011 2:38:00', '13/07/2011 2:39:00', '13/07/2011 2:40:00', '13/07/2011 2:41:00', '13/07/2011 2:42:00', '13/07/2011 2:43:00', '13/07/2011 2:44:00', '13/07/2011 2:45:00', '13/07/2011 2:46:00', '13/07/2011 2:47:00', '13/07/2011 2:48:00', '13/07/2011 2:49:00', '13/07/2011 2:50:00', '13/07/2011 2:51:00', '13/07/2011 2:52:00', '13/07/2011 2:53:00', '13/07/2011 2:54:00', '13/07/2011 2:55:00', '13/07/2011 
2:56:00', '13/07/2011 2:57:00', '13/07/2011 2:58:00', '13/07/2011 2:59:00', '13/07/2011 3:00:00', '13/07/2011 3:01:00', '13/07/2011 3:02:00', '13/07/2011 3:03:00', '13/07/2011 3:04:00', '13/07/2011 3:05:00', '13/07/2011 3:06:00', '13/07/2011 3:07:00', '13/07/2011 3:08:00', '13/07/2011 3:09:00', '13/07/2011 3:10:00', '13/07/2011 3:11:00', '13/07/2011 3:12:00', '13/07/2011 3:13:00', '13/07/2011 3:14:00', '13/07/2011 3:15:00', '13/07/2011 3:16:00', '13/07/2011 3:17:00', '13/07/2011 3:18:00', '13/07/2011 3:19:00', '13/07/2011 3:20:00', '13/07/2011 3:21:00', '13/07/2011 3:22:00', '13/07/2011 3:23:00', '13/07/2011 3:24:00', '13/07/2011 3:25:00', '13/07/2011 3:26:00', '13/07/2011 3:27:00', '13/07/2011 3:28:00', '13/07/2011 3:29:00', '13/07/2011 3:30:00', '13/07/2011 3:31:00', '13/07/2011 3:32:00', '13/07/2011 3:33:00', '13/07/2011 3:34:00', '13/07/2011 3:35:00', '13/07/2011 3:36:00', '13/07/2011 3:37:00', '13/07/2011 3:38:00', '13/07/2011 3:39:00', '13/07/2011 3:40:00', '13/07/2011 3:41:00', '13/07/2011 3:42:00', '13/07/2011 3:43:00', '13/07/2011 3:44:00', '13/07/2011 3:45:00', '13/07/2011 3:46:00', '13/07/2011 3:47:00', '13/07/2011 3:48:00', '13/07/2011 3:49:00', '13/07/2011 3:50:00', '13/07/2011 3:51:00', '13/07/2011 3:52:00', '13/07/2011 3:53:00', '13/07/2011 3:54:00', '13/07/2011 3:55:00', '13/07/2011 3:56:00', '13/07/2011 3:57:00', '13/07/2011 3:58:00', '13/07/2011 3:59:00', '13/07/2011 4:00:00', '13/07/2011 4:01:00', '13/07/2011 4:02:00', '13/07/2011 4:03:00', '13/07/2011 4:04:00', '13/07/2011 4:05:00', '13/07/2011 4:06:00', '13/07/2011 4:07:00', '13/07/2011 4:08:00', '13/07/2011 4:09:00', '13/07/2011 4:10:00', '13/07/2011 4:11:00', '13/07/2011 4:12:00', '13/07/2011 4:13:00', '13/07/2011 4:14:00', '13/07/2011 4:15:00', '13/07/2011 4:16:00', '13/07/2011 4:17:00', '13/07/2011 4:18:00', '13/07/2011 4:19:00', '13/07/2011 4:20:00', '13/07/2011 4:21:00', '13/07/2011 4:22:00', '13/07/2011 4:23:00', '13/07/2011 4:24:00', '13/07/2011 4:25:00', '13/07/2011 4:26:00', 
'13/07/2011 4:27:00', '13/07/2011 4:28:00', '13/07/2011 4:29:00', '13/07/2011 4:30:00', '13/07/2011 4:31:00', '13/07/2011 4:32:00', '13/07/2011 4:33:00', '13/07/2011 4:34:00', '13/07/2011 4:35:00', '13/07/2011 4:36:00', '13/07/2011 4:37:00', '13/07/2011 4:38:00', '13/07/2011 4:39:00', '13/07/2011 4:40:00', '13/07/2011 4:41:00', '13/07/2011 4:42:00', '13/07/2011 4:43:00', '13/07/2011 4:44:00', '13/07/2011 4:45:00', '13/07/2011 4:46:00', '13/07/2011 4:47:00', '13/07/2011 4:48:00', '13/07/2011 4:49:00', '13/07/2011 4:50:00', '13/07/2011 4:51:00', '13/07/2011 4:52:00', '13/07/2011 4:53:00', '13/07/2011 4:54:00', '13/07/2011 4:55:00', '13/07/2011 4:56:00', '13/07/2011 4:57:00', '13/07/2011 4:58:00', '13/07/2011 4:59:00', '13/07/2011 5:00:00', '13/07/2011 5:01:00', '13/07/2011 5:02:00', '13/07/2011 5:03:00', '13/07/2011 5:04:00', '13/07/2011 5:05:00', '13/07/2011 5:06:00', '13/07/2011 5:07:00', '13/07/2011 5:08:00', '13/07/2011 5:09:00', '13/07/2011 5:10:00', '13/07/2011 5:11:00', '13/07/2011 5:12:00', '13/07/2011 5:13:00', '13/07/2011 5:14:00', '13/07/2011 5:15:00', '13/07/2011 5:16:00', '13/07/2011 5:17:00', '13/07/2011 5:18:00', '13/07/2011 5:19:00', '13/07/2011 5:20:00', '13/07/2011 5:21:00', '13/07/2011 5:22:00', '13/07/2011 5:23:00', '13/07/2011 5:24:00', '13/07/2011 5:25:00', '13/07/2011 5:26:00', '13/07/2011 5:27:00', '13/07/2011 5:28:00', '13/07/2011 5:29:00', '13/07/2011 5:30:00', '13/07/2011 5:31:00', '13/07/2011 5:32:00', '13/07/2011 5:33:00', '13/07/2011 5:34:00', '13/07/2011 5:35:00', '13/07/2011 5:36:00', '13/07/2011 5:37:00', '13/07/2011 5:38:00', '13/07/2011 5:39:00', '13/07/2011 5:40:00', '13/07/2011 5:41:00', '13/07/2011 5:42:00', '13/07/2011 5:43:00', '13/07/2011 5:44:00', '13/07/2011 5:45:00', '13/07/2011 5:46:00', '13/07/2011 5:47:00', '13/07/2011 5:48:00', '13/07/2011 5:49:00', '13/07/2011 5:50:00', '13/07/2011 5:51:00', '13/07/2011 5:52:00', '13/07/2011 5:53:00', '13/07/2011 5:54:00', '13/07/2011 5:55:00', '13/07/2011 5:56:00', '13/07/2011 
5:57:00', '13/07/2011 5:58:00', '13/07/2011 5:59:00', '13/07/2011 6:00:00', '13/07/2011 6:01:00', '13/07/2011 6:02:00', '13/07/2011 6:03:00', '13/07/2011 6:04:00', '13/07/2011 6:05:00', '13/07/2011 6:06:00', '13/07/2011 6:07:00', '13/07/2011 6:08:00', '13/07/2011 6:09:00', '13/07/2011 6:10:00', '13/07/2011 6:11:00', '13/07/2011 6:12:00', '13/07/2011 6:13:00', '13/07/2011 6:14:00', '13/07/2011 6:15:00', '13/07/2011 6:16:00', '13/07/2011 6:17:00', '13/07/2011 6:18:00', '13/07/2011 6:19:00', '13/07/2011 6:20:00', '13/07/2011 6:21:00', '13/07/2011 6:22:00', '13/07/2011 6:23:00', '13/07/2011 6:24:00', '13/07/2011 6:25:00', '13/07/2011 6:26:00', '13/07/2011 6:27:00', '13/07/2011 6:28:00', '13/07/2011 6:29:00', '13/07/2011 6:30:00', '13/07/2011 6:31:00', '13/07/2011 6:32:00', '13/07/2011 6:33:00', '13/07/2011 6:34:00', '13/07/2011 6:35:00', '13/07/2011 6:36:00', '13/07/2011 6:37:00', '13/07/2011 6:38:00', '13/07/2011 6:39:00', '13/07/2011 6:40:00', '13/07/2011 6:41:00', '13/07/2011 6:42:00', '13/07/2011 6:43:00', '13/07/2011 6:44:00', '13/07/2011 6:45:00', '13/07/2011 6:46:00', '13/07/2011 6:47:00', '13/07/2011 6:48:00', '13/07/2011 6:49:00', '13/07/2011 6:50:00', '13/07/2011 6:51:00', '13/07/2011 6:52:00', '13/07/2011 6:53:00', '13/07/2011 6:54:00', '13/07/2011 6:55:00', '13/07/2011 6:56:00', '13/07/2011 6:57:00', '13/07/2011 6:58:00', '13/07/2011 6:59:00', '13/07/2011 7:00:00', '13/07/2011 7:01:00', '13/07/2011 7:02:00', '13/07/2011 7:03:00', '13/07/2011 7:04:00', '13/07/2011 7:05:00', '13/07/2011 7:06:00', '13/07/2011 7:07:00', '13/07/2011 7:08:00', '13/07/2011 7:09:00', '13/07/2011 7:10:00', '13/07/2011 7:11:00', '13/07/2011 7:12:00', '13/07/2011 7:13:00', '13/07/2011 7:14:00', '13/07/2011 7:15:00', '13/07/2011 7:16:00', '13/07/2011 7:17:00', '13/07/2011 7:18:00', '13/07/2011 7:19:00', '13/07/2011 7:20:00', '13/07/2011 7:21:00', '13/07/2011 7:22:00', '13/07/2011 7:23:00', '13/07/2011 7:24:00', '13/07/2011 7:25:00', '13/07/2011 7:26:00', '13/07/2011 7:27:00', 
'13/07/2011 7:28:00', '13/07/2011 7:29:00', '13/07/2011 7:30:00', '13/07/2011 7:31:00', '13/07/2011 7:32:00', '13/07/2011 7:33:00', '13/07/2011 7:34:00', '13/07/2011 7:35:00', '13/07/2011 7:36:00', '13/07/2011 7:37:00', '13/07/2011 7:38:00', '13/07/2011 7:39:00', '13/07/2011 7:40:00', '13/07/2011 7:41:00', '13/07/2011 7:42:00', '13/07/2011 7:43:00', '13/07/2011 7:44:00', '13/07/2011 7:45:00', '13/07/2011 7:46:00', '13/07/2011 7:47:00', '13/07/2011 7:48:00', '13/07/2011 7:49:00', '13/07/2011 7:50:00', '13/07/2011 7:51:00', '13/07/2011 7:52:00', '13/07/2011 7:53:00', '13/07/2011 7:54:00', '13/07/2011 7:55:00', '13/07/2011 7:56:00', '13/07/2011 7:57:00', '13/07/2011 7:58:00', '13/07/2011 7:59:00', '13/07/2011 8:00:00', '13/07/2011 8:01:00', '13/07/2011 8:02:00', '13/07/2011 8:03:00', '13/07/2011 8:04:00', '13/07/2011 8:05:00', '13/07/2011 8:06:00', '13/07/2011 8:07:00', '13/07/2011 8:08:00', '13/07/2011 8:09:00', '13/07/2011 8:10:00', '13/07/2011 8:11:00', '13/07/2011 8:12:00', '13/07/2011 8:13:00', '13/07/2011 8:14:00', '13/07/2011 8:15:00', '13/07/2011 8:16:00', '13/07/2011 8:17:00', '13/07/2011 8:18:00', '13/07/2011 8:19:00', '13/07/2011 8:20:00', '13/07/2011 8:21:00', '13/07/2011 8:22:00', '13/07/2011 8:23:00', '13/07/2011 8:24:00', '13/07/2011 8:25:00', '13/07/2011 8:26:00', '13/07/2011 8:27:00', '13/07/2011 8:28:00', '13/07/2011 8:29:00', '13/07/2011 8:30:00', '13/07/2011 8:31:00', '13/07/2011 8:32:00', '13/07/2011 8:33:00', '13/07/2011 8:34:00', '13/07/2011 8:35:00', '13/07/2011 8:36:00', '13/07/2011 8:37:00', '13/07/2011 8:38:00', '13/07/2011 8:39:00', '13/07/2011 8:40:00', '13/07/2011 8:41:00', '13/07/2011 8:42:00', '13/07/2011 8:43:00', '13/07/2011 8:44:00', '13/07/2011 8:45:00', '13/07/2011 8:46:00', '13/07/2011 8:47:00', '13/07/2011 8:48:00', '13/07/2011 8:49:00', '13/07/2011 8:50:00', '13/07/2011 8:51:00', '13/07/2011 8:52:00', '13/07/2011 8:53:00', '13/07/2011 8:54:00', '13/07/2011 8:55:00', '13/07/2011 8:56:00', '13/07/2011 8:57:00', '13/07/2011 
8:58:00', '13/07/2011 8:59:00', '13/07/2011 9:00:00', '13/07/2011 9:01:00', '13/07/2011 9:02:00', '13/07/2011 9:03:00', '13/07/2011 9:04:00', '13/07/2011 9:05:00', '13/07/2011 9:06:00', '13/07/2011 9:07:00', '13/07/2011 9:08:00', '13/07/2011 9:09:00', '13/07/2011 9:10:00', '13/07/2011 9:11:00', '13/07/2011 9:12:00', '13/07/2011 9:13:00', '13/07/2011 9:14:00', '13/07/2011 9:15:00', '13/07/2011 9:16:00', '13/07/2011 9:17:00', '13/07/2011 9:18:00', '13/07/2011 9:19:00', '13/07/2011 9:20:00', '13/07/2011 9:21:00', '13/07/2011 9:22:00', '13/07/2011 9:23:00', '13/07/2011 9:24:00', '13/07/2011 9:25:00', '13/07/2011 9:26:00', '13/07/2011 9:27:00', '13/07/2011 9:28:00', '13/07/2011 9:29:00', '13/07/2011 9:30:00', '13/07/2011 9:31:00', '13/07/2011 9:32:00', '13/07/2011 9:33:00', '13/07/2011 9:34:00', '13/07/2011 9:35:00', '13/07/2011 9:36:00', '13/07/2011 9:37:00', '13/07/2011 9:38:00', '13/07/2011 9:39:00', '13/07/2011 9:40:00', '13/07/2011 9:41:00', '13/07/2011 9:42:00', '13/07/2011 9:43:00', '13/07/2011 9:44:00', '13/07/2011 9:45:00', '13/07/2011 9:46:00', '13/07/2011 9:47:00', '13/07/2011 9:48:00', '13/07/2011 9:49:00', '13/07/2011 9:50:00', '13/07/2011 9:51:00', '13/07/2011 9:52:00', '13/07/2011 9:53:00', '13/07/2011 9:54:00', '13/07/2011 9:55:00', '13/07/2011 9:56:00', '13/07/2011 9:57:00', '13/07/2011 9:58:00', '13/07/2011 9:59:00', '13/07/2011 10:00:00', '13/07/2011 10:01:00', '13/07/2011 10:02:00', '13/07/2011 10:03:00', '13/07/2011 10:04:00', '13/07/2011 10:05:00', '13/07/2011 10:06:00', '13/07/2011 10:07:00', '13/07/2011 10:08:00', '13/07/2011 10:09:00', '13/07/2011 10:10:00', '13/07/2011 10:11:00', '13/07/2011 10:12:00', '13/07/2011 10:13:00', '13/07/2011 10:14:00', '13/07/2011 10:15:00', '13/07/2011 10:16:00', '13/07/2011 10:17:00', '13/07/2011 10:18:00', '13/07/2011 10:19:00', '13/07/2011 10:20:00', '13/07/2011 10:21:00', '13/07/2011 10:22:00', '13/07/2011 10:23:00', '13/07/2011 10:24:00', '13/07/2011 10:25:00', '13/07/2011 10:26:00', '13/07/2011 10:27:00', 
'13/07/2011 10:28:00', '13/07/2011 10:29:00', '13/07/2011 10:30:00', '13/07/2011 10:31:00', '13/07/2011 10:32:00', '13/07/2011 10:33:00', '13/07/2011 10:34:00', '13/07/2011 10:35:00', '13/07/2011 10:36:00', '13/07/2011 10:37:00', '13/07/2011 10:38:00', '13/07/2011 10:39:00', '13/07/2011 10:40:00', '13/07/2011 10:41:00', '13/07/2011 10:42:00', '13/07/2011 10:43:00', '13/07/2011 10:44:00', '13/07/2011 10:45:00', '13/07/2011 10:46:00', '13/07/2011 10:47:00', '13/07/2011 10:48:00', '13/07/2011 10:49:00', '13/07/2011 10:50:00', '13/07/2011 10:51:00', '13/07/2011 10:52:00', '13/07/2011 10:53:00', '13/07/2011 10:54:00', '13/07/2011 10:55:00', '13/07/2011 10:56:00', '13/07/2011 10:57:00', '13/07/2011 10:58:00', '13/07/2011 10:59:00', '13/07/2011 11:00:00', '13/07/2011 11:01:00', '13/07/2011 11:02:00', '13/07/2011 11:03:00', '13/07/2011 11:04:00', '13/07/2011 11:05:00', '13/07/2011 11:06:00', '13/07/2011 11:07:00', '13/07/2011 11:08:00', '13/07/2011 11:09:00', '13/07/2011 11:10:00', '13/07/2011 11:11:00', '13/07/2011 11:12:00', '13/07/2011 11:13:00', '13/07/2011 11:14:00', '13/07/2011 11:15:00', '13/07/2011 11:16:00', '13/07/2011 11:17:00', '13/07/2011 11:18:00', '13/07/2011 11:19:00', '13/07/2011 11:20:00', '13/07/2011 11:21:00', '13/07/2011 11:22:00', '13/07/2011 11:23:00', '13/07/2011 11:24:00', '13/07/2011 11:25:00', '13/07/2011 11:26:00', '13/07/2011 11:27:00', '13/07/2011 11:28:00', '13/07/2011 11:29:00', '13/07/2011 11:30:00', '13/07/2011 11:31:00', '13/07/2011 11:32:00', '13/07/2011 11:33:00', '13/07/2011 11:34:00', '13/07/2011 11:35:00', '13/07/2011 11:36:00', '13/07/2011 11:37:00', '13/07/2011 11:38:00', '13/07/2011 11:39:00', '13/07/2011 11:40:00', '13/07/2011 11:41:00', '13/07/2011 11:42:00', '13/07/2011 11:43:00', '13/07/2011 11:44:00', '13/07/2011 11:45:00', '13/07/2011 11:46:00', '13/07/2011 11:47:00', '13/07/2011 11:48:00', '13/07/2011 11:49:00', '13/07/2011 11:50:00', '13/07/2011 11:51:00', '13/07/2011 11:52:00', '13/07/2011 11:53:00', '13/07/2011 
11:54:00', '13/07/2011 11:55:00', '13/07/2011 11:56:00', '13/07/2011 11:57:00', '13/07/2011 11:58:00', '13/07/2011 11:59:00', '13/07/2011 12:00:00', '13/07/2011 12:01:00', '13/07/2011 12:02:00', '13/07/2011 12:03:00', '13/07/2011 12:04:00', '13/07/2011 12:05:00', '13/07/2011 12:06:00', '13/07/2011 12:07:00', '13/07/2011 12:08:00', '13/07/2011 12:09:00', '13/07/2011 12:10:00', '13/07/2011 12:11:00', '13/07/2011 12:12:00', '13/07/2011 12:13:00', '13/07/2011 12:14:00', '13/07/2011 12:15:00', '13/07/2011 12:16:00', '13/07/2011 12:17:00', '13/07/2011 12:18:00', '13/07/2011 12:19:00', '13/07/2011 12:20:00', '13/07/2011 12:21:00', '13/07/2011 12:22:00', '13/07/2011 12:23:00', '13/07/2011 12:24:00', '13/07/2011 12:25:00', '13/07/2011 12:26:00', '13/07/2011 12:27:00', '13/07/2011 12:28:00', '13/07/2011 12:29:00', '13/07/2011 12:30:00', '13/07/2011 12:31:00', '13/07/2011 12:32:00', '13/07/2011 12:33:00', '13/07/2011 12:34:00', '13/07/2011 12:35:00', '13/07/2011 12:36:00', '13/07/2011 12:37:00', '13/07/2011 12:38:00', '13/07/2011 12:39:00', '13/07/2011 12:40:00', '13/07/2011 12:41:00', '13/07/2011 12:42:00', '13/07/2011 12:43:00', '13/07/2011 12:44:00', '13/07/2011 12:45:00', '13/07/2011 12:46:00', '13/07/2011 12:47:00', '13/07/2011 12:48:00', '13/07/2011 12:49:00', '13/07/2011 12:50:00', '13/07/2011 12:51:00', '13/07/2011 12:52:00', '13/07/2011 12:53:00', '13/07/2011 12:54:00', '13/07/2011 12:55:00', '13/07/2011 12:56:00', '13/07/2011 12:57:00', '13/07/2011 12:58:00', '13/07/2011 12:59:00', '13/07/2011 13:00:00', '13/07/2011 13:01:00', '13/07/2011 13:02:00', '13/07/2011 13:03:00', '13/07/2011 13:04:00', '13/07/2011 13:05:00', '13/07/2011 13:06:00', '13/07/2011 13:07:00', '13/07/2011 13:08:00', '13/07/2011 13:09:00', '13/07/2011 13:10:00', '13/07/2011 13:11:00', '13/07/2011 13:12:00', '13/07/2011 13:13:00', '13/07/2011 13:14:00', '13/07/2011 13:15:00', '13/07/2011 13:16:00', '13/07/2011 13:17:00', '13/07/2011 13:18:00', '13/07/2011 13:19:00', '13/07/2011 13:20:00', 
'13/07/2011 13:21:00', '13/07/2011 13:22:00', '13/07/2011 13:23:00', '13/07/2011 13:24:00', '13/07/2011 13:25:00', '13/07/2011 13:26:00', '13/07/2011 13:27:00', '13/07/2011 13:28:00', '13/07/2011 13:29:00', '13/07/2011 13:30:00', '13/07/2011 13:31:00', '13/07/2011 13:32:00', '13/07/2011 13:33:00', '13/07/2011 13:34:00', '13/07/2011 13:35:00', '13/07/2011 13:36:00', '13/07/2011 13:37:00', '13/07/2011 13:38:00', '13/07/2011 13:39:00', '13/07/2011 13:40:00', '13/07/2011 13:41:00', '13/07/2011 13:42:00', '13/07/2011 13:43:00', '13/07/2011 13:44:00', '13/07/2011 13:45:00', '13/07/2011 13:46:00', '13/07/2011 13:47:00', '13/07/2011 13:48:00', '13/07/2011 13:49:00', '13/07/2011 13:50:00', '13/07/2011 13:51:00', '13/07/2011 13:52:00', '13/07/2011 13:53:00', '13/07/2011 13:54:00', '13/07/2011 13:55:00', '13/07/2011 13:56:00', '13/07/2011 13:57:00', '13/07/2011 13:58:00', '13/07/2011 13:59:00', '13/07/2011 14:00:00', '13/07/2011 14:01:00', '13/07/2011 14:02:00', '13/07/2011 14:03:00', '13/07/2011 14:04:00', '13/07/2011 14:05:00', '13/07/2011 14:06:00', '13/07/2011 14:07:00', '13/07/2011 14:08:00', '13/07/2011 14:09:00', '13/07/2011 14:10:00', '13/07/2011 14:11:00', '13/07/2011 14:12:00', '13/07/2011 14:13:00', '13/07/2011 14:14:00', '13/07/2011 14:15:00', '13/07/2011 14:16:00', '13/07/2011 14:17:00', '13/07/2011 14:18:00', '13/07/2011 14:19:00', '13/07/2011 14:20:00', '13/07/2011 14:21:00', '13/07/2011 14:22:00', '13/07/2011 14:23:00', '13/07/2011 14:24:00', '13/07/2011 14:25:00', '13/07/2011 14:26:00', '13/07/2011 14:27:00', '13/07/2011 14:28:00', '13/07/2011 14:29:00', '13/07/2011 14:30:00', '13/07/2011 14:31:00', '13/07/2011 14:32:00', '13/07/2011 14:33:00', '13/07/2011 14:34:00', '13/07/2011 14:35:00', '13/07/2011 14:36:00', '13/07/2011 14:37:00', '13/07/2011 14:38:00', '13/07/2011 14:39:00', '13/07/2011 14:40:00', '13/07/2011 14:41:00', '13/07/2011 14:42:00', '13/07/2011 14:43:00', '13/07/2011 14:44:00', '13/07/2011 14:45:00', '13/07/2011 14:46:00', '13/07/2011 
14:47:00', '13/07/2011 14:48:00', '13/07/2011 14:49:00', '13/07/2011 14:50:00', '13/07/2011 14:51:00', '13/07/2011 14:52:00', '13/07/2011 14:53:00', '13/07/2011 14:54:00', '13/07/2011 14:55:00', '13/07/2011 14:56:00', '13/07/2011 14:57:00', '13/07/2011 14:58:00', '13/07/2011 14:59:00', '13/07/2011 15:00:00', '13/07/2011 15:01:00', '13/07/2011 15:02:00', '13/07/2011 15:03:00', '13/07/2011 15:04:00', '13/07/2011 15:05:00', '13/07/2011 15:06:00', '13/07/2011 15:07:00', '13/07/2011 15:08:00', '13/07/2011 15:09:00', '13/07/2011 15:10:00', '13/07/2011 15:11:00', '13/07/2011 15:12:00', '13/07/2011 15:13:00', '13/07/2011 15:14:00', '13/07/2011 15:15:00', '13/07/2011 15:16:00', '13/07/2011 15:17:00', '13/07/2011 15:18:00', '13/07/2011 15:19:00', '13/07/2011 15:20:00', '13/07/2011 15:21:00', '13/07/2011 15:22:00', '13/07/2011 15:23:00', '13/07/2011 15:24:00', '13/07/2011 15:25:00', '13/07/2011 15:26:00', '13/07/2011 15:27:00', '13/07/2011 15:28:00', '13/07/2011 15:29:00', '13/07/2011 15:30:00', '13/07/2011 15:31:00', '13/07/2011 15:32:00', '13/07/2011 15:33:00', '13/07/2011 15:34:00', '13/07/2011 15:35:00', '13/07/2011 15:36:00', '13/07/2011 15:37:00', '13/07/2011 15:38:00', '13/07/2011 15:39:00', '13/07/2011 15:40:00', '13/07/2011 15:41:00', '13/07/2011 15:42:00', '13/07/2011 15:43:00', '13/07/2011 15:44:00', '13/07/2011 15:45:00', '13/07/2011 15:46:00', '13/07/2011 15:47:00', '13/07/2011 15:48:00', '13/07/2011 15:49:00', '13/07/2011 15:50:00', '13/07/2011 15:51:00', '13/07/2011 15:52:00', '13/07/2011 15:53:00', '13/07/2011 15:54:00', '13/07/2011 15:55:00', '13/07/2011 15:56:00', '13/07/2011 15:57:00', '13/07/2011 15:58:00', '13/07/2011 15:59:00', '13/07/2011 16:00:00', '13/07/2011 16:01:00', '13/07/2011 16:02:00', '13/07/2011 16:03:00', '13/07/2011 16:04:00', '13/07/2011 16:05:00', '13/07/2011 16:06:00', '13/07/2011 16:07:00', '13/07/2011 16:08:00', '13/07/2011 16:09:00', '13/07/2011 16:10:00', '13/07/2011 16:11:00', '13/07/2011 16:12:00', '13/07/2011 16:13:00', 
'13/07/2011 16:14:00', '13/07/2011 16:15:00', '13/07/2011 16:16:00', '13/07/2011 16:17:00', '13/07/2011 16:18:00', '13/07/2011 16:19:00', '13/07/2011 16:20:00', '13/07/2011 16:21:00', '13/07/2011 16:22:00', '13/07/2011 16:23:00', '13/07/2011 16:24:00', '13/07/2011 16:25:00', '13/07/2011 16:26:00', '13/07/2011 16:27:00', '13/07/2011 16:28:00', '13/07/2011 16:29:00', '13/07/2011 16:30:00', '13/07/2011 16:31:00', '13/07/2011 16:32:00', '13/07/2011 16:33:00', '13/07/2011 16:34:00', '13/07/2011 16:35:00', '13/07/2011 16:36:00', '13/07/2011 16:37:00', '13/07/2011 16:38:00', '13/07/2011 16:39:00', '13/07/2011 16:40:00', '13/07/2011 16:41:00', '13/07/2011 16:42:00', '13/07/2011 16:43:00', '13/07/2011 16:44:00', '13/07/2011 16:45:00', '13/07/2011 16:46:00', '13/07/2011 16:47:00', '13/07/2011 16:48:00', '13/07/2011 16:49:00', '13/07/2011 16:50:00', '13/07/2011 16:51:00', '13/07/2011 16:52:00', '13/07/2011 16:53:00', '13/07/2011 16:54:00', '13/07/2011 16:55:00', '13/07/2011 16:56:00', '13/07/2011 16:57:00', '13/07/2011 16:58:00', '13/07/2011 16:59:00', '13/07/2011 17:00:00', '13/07/2011 17:01:00', '13/07/2011 17:02:00', '13/07/2011 17:03:00', '13/07/2011 17:04:00', '13/07/2011 17:05:00', '13/07/2011 17:06:00', '13/07/2011 17:07:00', '13/07/2011 17:08:00', '13/07/2011 17:09:00', '13/07/2011 17:10:00', '13/07/2011 17:11:00', '13/07/2011 17:12:00', '13/07/2011 17:13:00', '13/07/2011 17:14:00', '13/07/2011 17:15:00', '13/07/2011 17:16:00', '13/07/2011 17:17:00', '13/07/2011 17:18:00', '13/07/2011 17:19:00', '13/07/2011 17:20:00', '13/07/2011 17:21:00', '13/07/2011 17:22:00', '13/07/2011 17:23:00', '13/07/2011 17:24:00', '13/07/2011 17:25:00', '13/07/2011 17:26:00', '13/07/2011 17:27:00', '13/07/2011 17:28:00', '13/07/2011 17:29:00', '13/07/2011 17:30:00', '13/07/2011 17:31:00', '13/07/2011 17:32:00', '13/07/2011 17:33:00', '13/07/2011 17:34:00', '13/07/2011 17:35:00', '13/07/2011 17:36:00', '13/07/2011 17:37:00', '13/07/2011 17:38:00', '13/07/2011 17:39:00', '13/07/2011 
17:40:00', '13/07/2011 17:41:00', '13/07/2011 17:42:00', '13/07/2011 17:43:00', '13/07/2011 17:44:00', '13/07/2011 17:45:00', '13/07/2011 17:46:00', '13/07/2011 17:47:00', '13/07/2011 17:48:00', '13/07/2011 17:49:00', '13/07/2011 17:50:00', '13/07/2011 17:51:00', '13/07/2011 17:52:00', '13/07/2011 17:53:00', '13/07/2011 17:54:00', '13/07/2011 17:55:00', '13/07/2011 17:56:00', '13/07/2011 17:57:00', '13/07/2011 17:58:00', '13/07/2011 17:59:00', '13/07/2011 18:00:00', '13/07/2011 18:01:00', '13/07/2011 18:02:00', '13/07/2011 18:03:00', '13/07/2011 18:04:00', '13/07/2011 18:05:00', '13/07/2011 18:06:00', '13/07/2011 18:07:00', '13/07/2011 18:08:00', '13/07/2011 18:09:00', '13/07/2011 18:10:00', '13/07/2011 18:11:00', '13/07/2011 18:12:00', '13/07/2011 18:13:00', '13/07/2011 18:14:00', '13/07/2011 18:15:00', '13/07/2011 18:16:00', '13/07/2011 18:17:00', '13/07/2011 18:18:00', '13/07/2011 18:19:00', '13/07/2011 18:20:00', '13/07/2011 18:21:00', '13/07/2011 18:22:00', '13/07/2011 18:23:00', '13/07/2011 18:24:00', '13/07/2011 18:25:00', '13/07/2011 18:26:00', '13/07/2011 18:27:00', '13/07/2011 18:28:00', '13/07/2011 18:29:00', '13/07/2011 18:30:00', '13/07/2011 18:31:00', '13/07/2011 18:32:00', '13/07/2011 18:33:00', '13/07/2011 18:34:00', '13/07/2011 18:35:00', '13/07/2011 18:36:00', '13/07/2011 18:37:00', '13/07/2011 18:38:00', '13/07/2011 18:39:00', '13/07/2011 18:40:00', '13/07/2011 18:41:00', '13/07/2011 18:42:00', '13/07/2011 18:43:00', '13/07/2011 18:44:00', '13/07/2011 18:45:00', '13/07/2011 18:46:00', '13/07/2011 18:47:00', '13/07/2011 18:48:00', '13/07/2011 18:49:00', '13/07/2011 18:50:00', '13/07/2011 18:51:00', '13/07/2011 18:52:00', '13/07/2011 18:53:00', '13/07/2011 18:54:00', '13/07/2011 18:55:00', '13/07/2011 18:56:00', '13/07/2011 18:57:00', '13/07/2011 18:58:00', '13/07/2011 18:59:00', '13/07/2011 19:00:00', '13/07/2011 19:01:00', '13/07/2011 19:02:00', '13/07/2011 19:03:00', '13/07/2011 19:04:00', '13/07/2011 19:05:00', '13/07/2011 19:06:00', 
'13/07/2011 19:07:00', '13/07/2011 19:08:00', '13/07/2011 19:09:00', '13/07/2011 19:10:00', '13/07/2011 19:11:00', '13/07/2011 19:12:00', '13/07/2011 19:13:00', '13/07/2011 19:14:00', '13/07/2011 19:15:00', '13/07/2011 19:16:00', '13/07/2011 19:17:00', '13/07/2011 19:18:00', '13/07/2011 19:19:00', '13/07/2011 19:20:00', '13/07/2011 19:21:00', '13/07/2011 19:22:00', '13/07/2011 19:23:00', '13/07/2011 19:24:00', '13/07/2011 19:25:00', '13/07/2011 19:26:00', '13/07/2011 19:27:00', '13/07/2011 19:28:00', '13/07/2011 19:29:00', '13/07/2011 19:30:00', '13/07/2011 19:31:00', '13/07/2011 19:32:00', '13/07/2011 19:33:00', '13/07/2011 19:34:00', '13/07/2011 19:35:00', '13/07/2011 19:36:00', '13/07/2011 19:37:00', '13/07/2011 19:38:00', '13/07/2011 19:39:00', '13/07/2011 19:40:00', '13/07/2011 19:41:00', '13/07/2011 19:42:00', '13/07/2011 19:43:00', '13/07/2011 19:44:00', '13/07/2011 19:45:00', '13/07/2011 19:46:00', '13/07/2011 19:47:00', '13/07/2011 19:48:00', '13/07/2011 19:49:00', '13/07/2011 19:50:00', '13/07/2011 19:51:00', '13/07/2011 19:52:00', '13/07/2011 19:53:00', '13/07/2011 19:54:00', '13/07/2011 19:55:00', '13/07/2011 19:56:00', '13/07/2011 19:57:00', '13/07/2011 19:58:00', '13/07/2011 19:59:00', '13/07/2011 20:00:00', '13/07/2011 20:01:00', '13/07/2011 20:02:00', '13/07/2011 20:03:00', '13/07/2011 20:04:00', '13/07/2011 20:05:00', '13/07/2011 20:06:00', '13/07/2011 20:07:00', '13/07/2011 20:08:00', '13/07/2011 20:09:00', '13/07/2011 20:10:00', '13/07/2011 20:11:00', '13/07/2011 20:12:00', '13/07/2011 20:13:00', '13/07/2011 20:14:00', '13/07/2011 20:15:00', '13/07/2011 20:16:00', '13/07/2011 20:17:00', '13/07/2011 20:18:00', '13/07/2011 20:19:00', '13/07/2011 20:20:00', '13/07/2011 20:21:00', '13/07/2011 20:22:00', '13/07/2011 20:23:00', '13/07/2011 20:24:00', '13/07/2011 20:25:00', '13/07/2011 20:26:00', '13/07/2011 20:27:00', '13/07/2011 20:28:00', '13/07/2011 20:29:00', '13/07/2011 20:30:00', '13/07/2011 20:31:00', '13/07/2011 20:32:00', '13/07/2011 
20:33:00', '13/07/2011 20:34:00', '13/07/2011 20:35:00', '13/07/2011 20:36:00', '13/07/2011 20:37:00', '13/07/2011 20:38:00', '13/07/2011 20:39:00', '13/07/2011 20:40:00', '13/07/2011 20:41:00', '13/07/2011 20:42:00', '13/07/2011 20:43:00', '13/07/2011 20:44:00', '13/07/2011 20:45:00', '13/07/2011 20:46:00', '13/07/2011 20:47:00', '13/07/2011 20:48:00', '13/07/2011 20:49:00', '13/07/2011 20:50:00', '13/07/2011 20:51:00', '13/07/2011 20:52:00', '13/07/2011 20:53:00', '13/07/2011 20:54:00', '13/07/2011 20:55:00', '13/07/2011 20:56:00', '13/07/2011 20:57:00', '13/07/2011 20:58:00', '13/07/2011 20:59:00', '13/07/2011 21:00:00', '13/07/2011 21:01:00', '13/07/2011 21:02:00', '13/07/2011 21:03:00', '13/07/2011 21:04:00', '13/07/2011 21:05:00', '13/07/2011 21:06:00', '13/07/2011 21:07:00', '13/07/2011 21:08:00', '13/07/2011 21:09:00', '13/07/2011 21:10:00', '13/07/2011 21:11:00', '13/07/2011 21:12:00', '13/07/2011 21:13:00', '13/07/2011 21:14:00', '13/07/2011 21:15:00', '13/07/2011 21:16:00', '13/07/2011 21:17:00', '13/07/2011 21:18:00', '13/07/2011 21:19:00', '13/07/2011 21:20:00', '13/07/2011 21:21:00', '13/07/2011 21:22:00', '13/07/2011 21:23:00', '13/07/2011 21:24:00', '13/07/2011 21:25:00', '13/07/2011 21:26:00', '13/07/2011 21:27:00', '13/07/2011 21:28:00', '13/07/2011 21:29:00', '13/07/2011 21:30:00', '13/07/2011 21:31:00', '13/07/2011 21:32:00', '13/07/2011 21:33:00', '13/07/2011 21:34:00', '13/07/2011 21:35:00', '13/07/2011 21:36:00', '13/07/2011 21:37:00', '13/07/2011 21:38:00', '13/07/2011 21:39:00', '13/07/2011 21:40:00', '13/07/2011 21:41:00', '13/07/2011 21:42:00', '13/07/2011 21:43:00', '13/07/2011 21:44:00', '13/07/2011 21:45:00', '13/07/2011 21:46:00', '13/07/2011 21:47:00', '13/07/2011 21:48:00', '13/07/2011 21:49:00', '13/07/2011 21:50:00', '13/07/2011 21:51:00', '13/07/2011 21:52:00', '13/07/2011 21:53:00', '13/07/2011 21:54:00', '13/07/2011 21:55:00', '13/07/2011 21:56:00', '13/07/2011 21:57:00', '13/07/2011 21:58:00', '13/07/2011 21:59:00', 
'13/07/2011 22:00:00', '13/07/2011 22:01:00', '13/07/2011 22:02:00', '13/07/2011 22:03:00', '13/07/2011 22:04:00', '13/07/2011 22:05:00', '13/07/2011 22:06:00', '13/07/2011 22:07:00', '13/07/2011 22:08:00', '13/07/2011 22:09:00', '13/07/2011 22:10:00', '13/07/2011 22:11:00', '13/07/2011 22:12:00', '13/07/2011 22:13:00', '13/07/2011 22:14:00', '13/07/2011 22:15:00', '13/07/2011 22:16:00', '13/07/2011 22:17:00', '13/07/2011 22:18:00', '13/07/2011 22:19:00', '13/07/2011 22:20:00', '13/07/2011 22:21:00', '13/07/2011 22:22:00', '13/07/2011 22:23:00', '13/07/2011 22:24:00', '13/07/2011 22:25:00', '13/07/2011 22:26:00', '13/07/2011 22:27:00', '13/07/2011 22:28:00', '13/07/2011 22:29:00', '13/07/2011 22:30:00', '13/07/2011 22:31:00', '13/07/2011 22:32:00', '13/07/2011 22:33:00', '13/07/2011 22:34:00', '13/07/2011 22:35:00', '13/07/2011 22:36:00', '13/07/2011 22:37:00', '13/07/2011 22:38:00', '13/07/2011 22:39:00', '13/07/2011 22:40:00', '13/07/2011 22:41:00', '13/07/2011 22:42:00', '13/07/2011 22:43:00', '13/07/2011 22:44:00', '13/07/2011 22:45:00', '13/07/2011 22:46:00', '13/07/2011 22:47:00', '13/07/2011 22:48:00', '13/07/2011 22:49:00', '13/07/2011 22:50:00', '13/07/2011 22:51:00', '13/07/2011 22:52:00', '13/07/2011 22:53:00', '13/07/2011 22:54:00', '13/07/2011 22:55:00', '13/07/2011 22:56:00', '13/07/2011 22:57:00', '13/07/2011 22:58:00', '13/07/2011 22:59:00', '13/07/2011 23:00:00', '13/07/2011 23:01:00', '13/07/2011 23:02:00', '13/07/2011 23:03:00', '13/07/2011 23:04:00', '13/07/2011 23:05:00', '13/07/2011 23:06:00', '13/07/2011 23:07:00', '13/07/2011 23:08:00', '13/07/2011 23:09:00', '13/07/2011 23:10:00', '13/07/2011 23:11:00', '13/07/2011 23:12:00', '13/07/2011 23:13:00', '13/07/2011 23:14:00', '13/07/2011 23:15:00', '13/07/2011 23:16:00', '13/07/2011 23:17:00', '13/07/2011 23:18:00', '13/07/2011 23:19:00', '13/07/2011 23:20:00', '13/07/2011 23:21:00', '13/07/2011 23:22:00', '13/07/2011 23:23:00', '13/07/2011 23:24:00', '13/07/2011 23:25:00', '13/07/2011 
23:26:00', '13/07/2011 23:27:00', '13/07/2011 23:28:00', '13/07/2011 23:29:00', '13/07/2011 23:30:00', '13/07/2011 23:31:00', '13/07/2011 23:32:00', '13/07/2011 23:33:00', '13/07/2011 23:34:00', '13/07/2011 23:35:00', '13/07/2011 23:36:00', '13/07/2011 23:37:00', '13/07/2011 23:38:00', '13/07/2011 23:39:00', '13/07/2011 23:40:00', '13/07/2011 23:41:00', '13/07/2011 23:42:00', '13/07/2011 23:43:00', '13/07/2011 23:44:00', '13/07/2011 23:45:00', '13/07/2011 23:46:00', '13/07/2011 23:47:00', '13/07/2011 23:48:00', '13/07/2011 23:49:00', '13/07/2011 23:50:00', '13/07/2011 23:51:00', '13/07/2011 23:52:00', '13/07/2011 23:53:00', '13/07/2011 23:54:00', '13/07/2011 23:55:00', '13/07/2011 23:56:00', '13/07/2011 23:57:00', '13/07/2011 23:58:00', '13/07/2011 23:59:00', '14/07/2011 0:00:00', '14/07/2011 0:01:00', '14/07/2011 0:02:00', '14/07/2011 0:03:00', '14/07/2011 0:04:00', '14/07/2011 0:05:00', '14/07/2011 0:06:00', '14/07/2011 0:07:00', '14/07/2011 0:08:00', '14/07/2011 0:09:00', '14/07/2011 0:10:00', '14/07/2011 0:11:00', '14/07/2011 0:12:00', '14/07/2011 0:13:00', '14/07/2011 0:14:00', '14/07/2011 0:15:00', '14/07/2011 0:16:00', '14/07/2011 0:17:00', '14/07/2011 0:18:00', '14/07/2011 0:19:00', '14/07/2011 0:20:00', '14/07/2011 0:21:00', '14/07/2011 0:22:00', '14/07/2011 0:23:00', '14/07/2011 0:24:00', '14/07/2011 0:25:00', '14/07/2011 0:26:00', '14/07/2011 0:27:00', '14/07/2011 0:28:00', '14/07/2011 0:29:00', '14/07/2011 0:30:00', '14/07/2011 0:31:00', '14/07/2011 0:32:00', '14/07/2011 0:33:00', '14/07/2011 0:34:00', '14/07/2011 0:35:00', '14/07/2011 0:36:00', '14/07/2011 0:37:00', '14/07/2011 0:38:00', '14/07/2011 0:39:00', '14/07/2011 0:40:00', '14/07/2011 0:41:00', '14/07/2011 0:42:00', '14/07/2011 0:43:00', '14/07/2011 0:44:00', '14/07/2011 0:45:00', '14/07/2011 0:46:00', '14/07/2011 0:47:00', '14/07/2011 0:48:00', '14/07/2011 0:49:00', '14/07/2011 0:50:00', '14/07/2011 0:51:00', '14/07/2011 0:52:00', '14/07/2011 0:53:00', '14/07/2011 0:54:00', '14/07/2011 
0:55:00', '14/07/2011 0:56:00', '14/07/2011 0:57:00', '14/07/2011 0:58:00', '14/07/2011 0:59:00', '14/07/2011 1:00:00', '14/07/2011 1:01:00', '14/07/2011 1:02:00', '14/07/2011 1:03:00', '14/07/2011 1:04:00', '14/07/2011 1:05:00', '14/07/2011 1:06:00', '14/07/2011 1:07:00', '14/07/2011 1:08:00', '14/07/2011 1:09:00', '14/07/2011 1:10:00', '14/07/2011 1:11:00', '14/07/2011 1:12:00', '14/07/2011 1:13:00', '14/07/2011 1:14:00', '14/07/2011 1:15:00', '14/07/2011 1:16:00', '14/07/2011 1:17:00', '14/07/2011 1:18:00', '14/07/2011 1:19:00', '14/07/2011 1:20:00', '14/07/2011 1:21:00', '14/07/2011 1:22:00', '14/07/2011 1:23:00', '14/07/2011 1:24:00', '14/07/2011 1:25:00', '14/07/2011 1:26:00', '14/07/2011 1:27:00', '14/07/2011 1:28:00', '14/07/2011 1:29:00', '14/07/2011 1:30:00', '14/07/2011 1:31:00', '14/07/2011 1:32:00', '14/07/2011 1:33:00', '14/07/2011 1:34:00', '14/07/2011 1:35:00', '14/07/2011 1:36:00', '14/07/2011 1:37:00', '14/07/2011 1:38:00', '14/07/2011 1:39:00', '14/07/2011 1:40:00', '14/07/2011 1:41:00', '14/07/2011 1:42:00', '14/07/2011 1:43:00', '14/07/2011 1:44:00', '14/07/2011 1:45:00', '14/07/2011 1:46:00', '14/07/2011 1:47:00', '14/07/2011 1:48:00', '14/07/2011 1:49:00', '14/07/2011 1:50:00', '14/07/2011 1:51:00', '14/07/2011 1:52:00', '14/07/2011 1:53:00', '14/07/2011 1:54:00', '14/07/2011 1:55:00', '14/07/2011 1:56:00', '14/07/2011 1:57:00', '14/07/2011 1:58:00', '14/07/2011 1:59:00', '14/07/2011 2:00:00', '14/07/2011 2:01:00', '14/07/2011 2:02:00', '14/07/2011 2:03:00', '14/07/2011 2:04:00', '14/07/2011 2:05:00', '14/07/2011 2:06:00', '14/07/2011 2:07:00', '14/07/2011 2:08:00', '14/07/2011 2:09:00', '14/07/2011 2:10:00', '14/07/2011 2:11:00', '14/07/2011 2:12:00', '14/07/2011 2:13:00', '14/07/2011 2:14:00', '14/07/2011 2:15:00', '14/07/2011 2:16:00', '14/07/2011 2:17:00', '14/07/2011 2:18:00', '14/07/2011 2:19:00', '14/07/2011 2:20:00', '14/07/2011 2:21:00', '14/07/2011 2:22:00', '14/07/2011 2:23:00', '14/07/2011 2:24:00', '14/07/2011 2:25:00', 
'14/07/2011 2:26:00', '14/07/2011 2:27:00', '14/07/2011 2:28:00', '14/07/2011 2:29:00', '14/07/2011 2:30:00', '14/07/2011 2:31:00', '14/07/2011 2:32:00', '14/07/2011 2:33:00', '14/07/2011 2:34:00', '14/07/2011 2:35:00', '14/07/2011 2:36:00', '14/07/2011 2:37:00', '14/07/2011 2:38:00', '14/07/2011 2:39:00', '14/07/2011 2:40:00', '14/07/2011 2:41:00', '14/07/2011 2:42:00', '14/07/2011 2:43:00', '14/07/2011 2:44:00', '14/07/2011 2:45:00', '14/07/2011 2:46:00', '14/07/2011 2:47:00', '14/07/2011 2:48:00', '14/07/2011 2:49:00', '14/07/2011 2:50:00', '14/07/2011 2:51:00', '14/07/2011 2:52:00', '14/07/2011 2:53:00', '14/07/2011 2:54:00', '14/07/2011 2:55:00', '14/07/2011 2:56:00', '14/07/2011 2:57:00', '14/07/2011 2:58:00', '14/07/2011 2:59:00', '14/07/2011 3:00:00', '14/07/2011 3:01:00', '14/07/2011 3:02:00', '14/07/2011 3:03:00', '14/07/2011 3:04:00', '14/07/2011 3:05:00', '14/07/2011 3:06:00', '14/07/2011 3:07:00', '14/07/2011 3:08:00', '14/07/2011 3:09:00', '14/07/2011 3:10:00', '14/07/2011 3:11:00', '14/07/2011 3:12:00', '14/07/2011 3:13:00', '14/07/2011 3:14:00', '14/07/2011 3:15:00', '14/07/2011 3:16:00', '14/07/2011 3:17:00', '14/07/2011 3:18:00', '14/07/2011 3:19:00', '14/07/2011 3:20:00', '14/07/2011 3:21:00', '14/07/2011 3:22:00', '14/07/2011 3:23:00', '14/07/2011 3:24:00', '14/07/2011 3:25:00', '14/07/2011 3:26:00', '14/07/2011 3:27:00', '14/07/2011 3:28:00', '14/07/2011 3:29:00', '14/07/2011 3:30:00', '14/07/2011 3:31:00', '14/07/2011 3:32:00', '14/07/2011 3:33:00', '14/07/2011 3:34:00', '14/07/2011 3:35:00', '14/07/2011 3:36:00', '14/07/2011 3:37:00', '14/07/2011 3:38:00', '14/07/2011 3:39:00', '14/07/2011 3:40:00', '14/07/2011 3:41:00', '14/07/2011 3:42:00', '14/07/2011 3:43:00', '14/07/2011 3:44:00', '14/07/2011 3:45:00', '14/07/2011 3:46:00', '14/07/2011 3:47:00', '14/07/2011 3:48:00', '14/07/2011 3:49:00', '14/07/2011 3:50:00', '14/07/2011 3:51:00', '14/07/2011 3:52:00', '14/07/2011 3:53:00', '14/07/2011 3:54:00', '14/07/2011 3:55:00', '14/07/2011 
3:56:00', '14/07/2011 3:57:00', '14/07/2011 3:58:00', '14/07/2011 3:59:00', '14/07/2011 4:00:00', '14/07/2011 4:01:00', '14/07/2011 4:02:00', '14/07/2011 4:03:00', '14/07/2011 4:04:00', '14/07/2011 4:05:00', '14/07/2011 4:06:00', '14/07/2011 4:07:00', '14/07/2011 4:08:00', '14/07/2011 4:09:00', '14/07/2011 4:10:00', '14/07/2011 4:11:00', '14/07/2011 4:12:00', '14/07/2011 4:13:00', '14/07/2011 4:14:00', '14/07/2011 4:15:00', '14/07/2011 4:16:00', '14/07/2011 4:17:00', '14/07/2011 4:18:00', '14/07/2011 4:19:00', '14/07/2011 4:20:00', '14/07/2011 4:21:00', '14/07/2011 4:22:00', '14/07/2011 4:23:00', '14/07/2011 4:24:00', '14/07/2011 4:25:00', '14/07/2011 4:26:00', '14/07/2011 4:27:00', '14/07/2011 4:28:00', '14/07/2011 4:29:00', '14/07/2011 4:30:00', '14/07/2011 4:31:00', '14/07/2011 4:32:00', '14/07/2011 4:33:00', '14/07/2011 4:34:00', '14/07/2011 4:35:00', '14/07/2011 4:36:00', '14/07/2011 4:37:00', '14/07/2011 4:38:00', '14/07/2011 4:39:00', '14/07/2011 4:40:00', '14/07/2011 4:41:00', '14/07/2011 4:42:00', '14/07/2011 4:43:00', '14/07/2011 4:44:00', '14/07/2011 4:45:00', '14/07/2011 4:46:00', '14/07/2011 4:47:00', '14/07/2011 4:48:00', '14/07/2011 4:49:00', '14/07/2011 4:50:00', '14/07/2011 4:51:00', '14/07/2011 4:52:00', '14/07/2011 4:53:00', '14/07/2011 4:54:00', '14/07/2011 4:55:00', '14/07/2011 4:56:00', '14/07/2011 4:57:00', '14/07/2011 4:58:00', '14/07/2011 4:59:00', '14/07/2011 5:00:00', '14/07/2011 5:01:00', '14/07/2011 5:02:00', '14/07/2011 5:03:00', '14/07/2011 5:04:00', '14/07/2011 5:05:00', '14/07/2011 5:06:00', '14/07/2011 5:07:00', '14/07/2011 5:08:00', '14/07/2011 5:09:00', '14/07/2011 5:10:00', '14/07/2011 5:11:00', '14/07/2011 5:12:00', '14/07/2011 5:13:00', '14/07/2011 5:14:00', '14/07/2011 5:15:00', '14/07/2011 5:16:00', '14/07/2011 5:17:00', '14/07/2011 5:18:00', '14/07/2011 5:19:00', '14/07/2011 5:20:00', '14/07/2011 5:21:00', '14/07/2011 5:22:00', '14/07/2011 5:23:00', '14/07/2011 5:24:00', '14/07/2011 5:25:00', '14/07/2011 5:26:00', 
'14/07/2011 5:27:00', '14/07/2011 5:28:00', '14/07/2011 5:29:00', '14/07/2011 5:30:00', '14/07/2011 5:31:00', '14/07/2011 5:32:00', '14/07/2011 5:33:00', '14/07/2011 5:34:00', '14/07/2011 5:35:00', '14/07/2011 5:36:00', '14/07/2011 5:37:00', '14/07/2011 5:38:00', '14/07/2011 5:39:00', '14/07/2011 5:40:00', '14/07/2011 5:41:00', '14/07/2011 5:42:00', '14/07/2011 5:43:00', '14/07/2011 5:44:00', '14/07/2011 5:45:00', '14/07/2011 5:46:00', '14/07/2011 5:47:00', '14/07/2011 5:48:00', '14/07/2011 5:49:00', '14/07/2011 5:50:00', '14/07/2011 5:51:00', '14/07/2011 5:52:00', '14/07/2011 5:53:00', '14/07/2011 5:54:00', '14/07/2011 5:55:00', '14/07/2011 5:56:00', '14/07/2011 5:57:00', '14/07/2011 5:58:00', '14/07/2011 5:59:00', '14/07/2011 6:00:00', '14/07/2011 6:01:00', '14/07/2011 6:02:00', '14/07/2011 6:03:00', '14/07/2011 6:04:00', '14/07/2011 6:05:00', '14/07/2011 6:06:00', '14/07/2011 6:07:00', '14/07/2011 6:08:00', '14/07/2011 6:09:00', '14/07/2011 6:10:00', '14/07/2011 6:11:00', '14/07/2011 6:12:00', '14/07/2011 6:13:00', '14/07/2011 6:14:00', '14/07/2011 6:15:00', '14/07/2011 6:16:00', '14/07/2011 6:17:00', '14/07/2011 6:18:00', '14/07/2011 6:19:00', '14/07/2011 6:20:00', '14/07/2011 6:21:00', '14/07/2011 6:22:00', '14/07/2011 6:23:00', '14/07/2011 6:24:00', '14/07/2011 6:25:00', '14/07/2011 6:26:00', '14/07/2011 6:27:00', '14/07/2011 6:28:00', '14/07/2011 6:29:00', '14/07/2011 6:30:00', '14/07/2011 6:31:00', '14/07/2011 6:32:00', '14/07/2011 6:33:00', '14/07/2011 6:34:00', '14/07/2011 6:35:00', '14/07/2011 6:36:00', '14/07/2011 6:37:00', '14/07/2011 6:38:00', '14/07/2011 6:39:00', '14/07/2011 6:40:00', '14/07/2011 6:41:00', '14/07/2011 6:42:00', '14/07/2011 6:43:00', '14/07/2011 6:44:00', '14/07/2011 6:45:00', '14/07/2011 6:46:00', '14/07/2011 6:47:00', '14/07/2011 6:48:00', '14/07/2011 6:49:00', '14/07/2011 6:50:00', '14/07/2011 6:51:00', '14/07/2011 6:52:00', '14/07/2011 6:53:00', '14/07/2011 6:54:00', '14/07/2011 6:55:00', '14/07/2011 6:56:00', '14/07/2011 
6:57:00', '14/07/2011 6:58:00', '14/07/2011 6:59:00', '14/07/2011 7:00:00', '14/07/2011 7:01:00', '14/07/2011 7:02:00', '14/07/2011 7:03:00', '14/07/2011 7:04:00', '14/07/2011 7:05:00', '14/07/2011 7:06:00', '14/07/2011 7:07:00', '14/07/2011 7:08:00', '14/07/2011 7:09:00', '14/07/2011 7:10:00', '14/07/2011 7:11:00', '14/07/2011 7:12:00', '14/07/2011 7:13:00', '14/07/2011 7:14:00', '14/07/2011 7:15:00', '14/07/2011 7:16:00', '14/07/2011 7:17:00', '14/07/2011 7:18:00', '14/07/2011 7:19:00', '14/07/2011 7:20:00', '14/07/2011 7:21:00', '14/07/2011 7:22:00', '14/07/2011 7:23:00', '14/07/2011 7:24:00', '14/07/2011 7:25:00', '14/07/2011 7:26:00', '14/07/2011 7:27:00', '14/07/2011 7:28:00', '14/07/2011 7:29:00', '14/07/2011 7:30:00', '14/07/2011 7:31:00', '14/07/2011 7:32:00', '14/07/2011 7:33:00', '14/07/2011 7:34:00', '14/07/2011 7:35:00', '14/07/2011 7:36:00', '14/07/2011 7:37:00', '14/07/2011 7:38:00', '14/07/2011 7:39:00', '14/07/2011 7:40:00', '14/07/2011 7:41:00', '14/07/2011 7:42:00', '14/07/2011 7:43:00', '14/07/2011 7:44:00', '14/07/2011 7:45:00', '14/07/2011 7:46:00', '14/07/2011 7:47:00', '14/07/2011 7:48:00', '14/07/2011 7:49:00', '14/07/2011 7:50:00', '14/07/2011 7:51:00', '14/07/2011 7:52:00', '14/07/2011 7:53:00', '14/07/2011 7:54:00', '14/07/2011 7:55:00', '14/07/2011 7:56:00', '14/07/2011 7:57:00', '14/07/2011 7:58:00', '14/07/2011 7:59:00', '14/07/2011 8:00:00', '14/07/2011 8:01:00', '14/07/2011 8:02:00', '14/07/2011 8:03:00', '14/07/2011 8:04:00', '14/07/2011 8:05:00', '14/07/2011 8:06:00', '14/07/2011 8:07:00', '14/07/2011 8:08:00', '14/07/2011 8:09:00', '14/07/2011 8:10:00', '14/07/2011 8:11:00', '14/07/2011 8:12:00', '14/07/2011 8:13:00', '14/07/2011 8:14:00', '14/07/2011 8:15:00', '14/07/2011 8:16:00', '14/07/2011 8:17:00', '14/07/2011 8:18:00', '14/07/2011 8:19:00', '14/07/2011 8:20:00', '14/07/2011 8:21:00', '14/07/2011 8:22:00', '14/07/2011 8:23:00', '14/07/2011 8:24:00', '14/07/2011 8:25:00', '14/07/2011 8:26:00', '14/07/2011 8:27:00', 
'14/07/2011 8:28:00', '14/07/2011 8:29:00', '14/07/2011 8:30:00', '14/07/2011 8:31:00', '14/07/2011 8:32:00', '14/07/2011 8:33:00', '14/07/2011 8:34:00', '14/07/2011 8:35:00', '14/07/2011 8:36:00', '14/07/2011 8:37:00', '14/07/2011 8:38:00', '14/07/2011 8:39:00', '14/07/2011 8:40:00', '14/07/2011 8:41:00', '14/07/2011 8:42:00', '14/07/2011 8:43:00', '14/07/2011 8:44:00', '14/07/2011 8:45:00', '14/07/2011 8:46:00', '14/07/2011 8:47:00', '14/07/2011 8:48:00', '14/07/2011 8:49:00', '14/07/2011 8:50:00', '14/07/2011 8:51:00', '14/07/2011 8:52:00', '14/07/2011 8:53:00', '14/07/2011 8:54:00', '14/07/2011 8:55:00', '14/07/2011 8:56:00', '14/07/2011 8:57:00', '14/07/2011 8:58:00', '14/07/2011 8:59:00', '14/07/2011 9:00:00', '14/07/2011 9:01:00', '14/07/2011 9:02:00', '14/07/2011 9:03:00', '14/07/2011 9:04:00', '14/07/2011 9:05:00', '14/07/2011 9:06:00', '14/07/2011 9:07:00', '14/07/2011 9:08:00', '14/07/2011 9:09:00', '14/07/2011 9:10:00', '14/07/2011 9:11:00', '14/07/2011 9:12:00', '14/07/2011 9:13:00', '14/07/2011 9:14:00', '14/07/2011 9:15:00', '14/07/2011 9:16:00', '14/07/2011 9:17:00', '14/07/2011 9:18:00', '14/07/2011 9:19:00', '14/07/2011 9:20:00', '14/07/2011 9:21:00', '14/07/2011 9:22:00', '14/07/2011 9:23:00', '14/07/2011 9:24:00', '14/07/2011 9:25:00', '14/07/2011 9:26:00', '14/07/2011 9:27:00', '14/07/2011 9:28:00', '14/07/2011 9:29:00', '14/07/2011 9:30:00', '14/07/2011 9:31:00', '14/07/2011 9:32:00', '14/07/2011 9:33:00', '14/07/2011 9:34:00', '14/07/2011 9:35:00', '14/07/2011 9:36:00', '14/07/2011 9:37:00', '14/07/2011 9:38:00', '14/07/2011 9:39:00', '14/07/2011 9:40:00', '14/07/2011 9:41:00', '14/07/2011 9:42:00', '14/07/2011 9:43:00', '14/07/2011 9:44:00', '14/07/2011 9:45:00', '14/07/2011 9:46:00', '14/07/2011 9:47:00', '14/07/2011 9:48:00', '14/07/2011 9:49:00', '14/07/2011 9:50:00', '14/07/2011 9:51:00', '14/07/2011 9:52:00', '14/07/2011 9:53:00', '14/07/2011 9:54:00', '14/07/2011 9:55:00', '14/07/2011 9:56:00', '14/07/2011 9:57:00', '14/07/2011 
9:58:00', '14/07/2011 9:59:00', '14/07/2011 10:00:00', '14/07/2011 10:01:00', '14/07/2011 10:02:00', '14/07/2011 10:03:00', '14/07/2011 10:04:00', '14/07/2011 10:05:00', '14/07/2011 10:06:00', '14/07/2011 10:07:00', '14/07/2011 10:08:00', '14/07/2011 10:09:00', '14/07/2011 10:10:00', '14/07/2011 10:11:00', '14/07/2011 10:12:00', '14/07/2011 10:13:00', '14/07/2011 10:14:00', '14/07/2011 10:15:00', '14/07/2011 10:16:00', '14/07/2011 10:17:00', '14/07/2011 10:18:00', '14/07/2011 10:19:00', '14/07/2011 10:20:00', '14/07/2011 10:21:00', '14/07/2011 10:22:00', '14/07/2011 10:23:00', '14/07/2011 10:24:00', '14/07/2011 10:25:00', '14/07/2011 10:26:00', '14/07/2011 10:27:00', '14/07/2011 10:28:00', '14/07/2011 10:29:00', '14/07/2011 10:30:00', '14/07/2011 10:31:00', '14/07/2011 10:32:00', '14/07/2011 10:33:00', '14/07/2011 10:34:00', '14/07/2011 10:35:00', '14/07/2011 10:36:00', '14/07/2011 10:37:00', '14/07/2011 10:38:00', '14/07/2011 10:39:00', '14/07/2011 10:40:00', '14/07/2011 10:41:00', '14/07/2011 10:42:00', '14/07/2011 10:43:00', '14/07/2011 10:44:00', '14/07/2011 10:45:00', '14/07/2011 10:46:00', '14/07/2011 10:47:00', '14/07/2011 10:48:00', '14/07/2011 10:49:00', '14/07/2011 10:50:00', '14/07/2011 10:51:00', '14/07/2011 10:52:00', '14/07/2011 10:53:00', '14/07/2011 10:54:00', '14/07/2011 10:55:00', '14/07/2011 10:56:00', '14/07/2011 10:57:00', '14/07/2011 10:58:00', '14/07/2011 10:59:00', '14/07/2011 11:00:00', '14/07/2011 11:01:00', '14/07/2011 11:02:00', '14/07/2011 11:03:00', '14/07/2011 11:04:00', '14/07/2011 11:05:00', '14/07/2011 11:06:00', '14/07/2011 11:07:00', '14/07/2011 11:08:00', '14/07/2011 11:09:00', '14/07/2011 11:10:00', '14/07/2011 11:11:00', '14/07/2011 11:12:00', '14/07/2011 11:13:00', '14/07/2011 11:14:00', '14/07/2011 11:15:00', '14/07/2011 11:16:00', '14/07/2011 11:17:00', '14/07/2011 11:18:00', '14/07/2011 11:19:00', '14/07/2011 11:20:00', '14/07/2011 11:21:00', '14/07/2011 11:22:00', '14/07/2011 11:23:00', '14/07/2011 11:24:00', '14/07/2011 
11:25:00', '14/07/2011 11:26:00', '14/07/2011 11:27:00', '14/07/2011 11:28:00', '14/07/2011 11:29:00', '14/07/2011 11:30:00', '14/07/2011 11:31:00', '14/07/2011 11:32:00', '14/07/2011 11:33:00', '14/07/2011 11:34:00', '14/07/2011 11:35:00', '14/07/2011 11:36:00', '14/07/2011 11:37:00', '14/07/2011 11:38:00', '14/07/2011 11:39:00', '14/07/2011 11:40:00', '14/07/2011 11:41:00', '14/07/2011 11:42:00', '14/07/2011 11:43:00', '14/07/2011 11:44:00', '14/07/2011 11:45:00', '14/07/2011 11:46:00', '14/07/2011 11:47:00', '14/07/2011 11:48:00', '14/07/2011 11:49:00', '14/07/2011 11:50:00', '14/07/2011 11:51:00', '14/07/2011 11:52:00', '14/07/2011 11:53:00', '14/07/2011 11:54:00', '14/07/2011 11:55:00', '14/07/2011 11:56:00', '14/07/2011 11:57:00', '14/07/2011 11:58:00', '14/07/2011 11:59:00', '14/07/2011 12:00:00', '14/07/2011 12:01:00', '14/07/2011 12:02:00', '14/07/2011 12:03:00', '14/07/2011 12:04:00', '14/07/2011 12:05:00', '14/07/2011 12:06:00', '14/07/2011 12:07:00', '14/07/2011 12:08:00', '14/07/2011 12:09:00', '14/07/2011 12:10:00', '14/07/2011 12:11:00', '14/07/2011 12:12:00', '14/07/2011 12:13:00', '14/07/2011 12:14:00', '14/07/2011 12:15:00', '14/07/2011 12:16:00', '14/07/2011 12:17:00', '14/07/2011 12:18:00', '14/07/2011 12:19:00', '14/07/2011 12:20:00', '14/07/2011 12:21:00', '14/07/2011 12:22:00', '14/07/2011 12:23:00', '14/07/2011 12:24:00', '14/07/2011 12:25:00', '14/07/2011 12:26:00', '14/07/2011 12:27:00', '14/07/2011 12:28:00', '14/07/2011 12:29:00', '14/07/2011 12:30:00', '14/07/2011 12:31:00', '14/07/2011 12:32:00', '14/07/2011 12:33:00', '14/07/2011 12:34:00', '14/07/2011 12:35:00', '14/07/2011 12:36:00', '14/07/2011 12:37:00', '14/07/2011 12:38:00', '14/07/2011 12:39:00', '14/07/2011 12:40:00', '14/07/2011 12:41:00', '14/07/2011 12:42:00', '14/07/2011 12:43:00', '14/07/2011 12:44:00', '14/07/2011 12:45:00', '14/07/2011 12:46:00', '14/07/2011 12:47:00', '14/07/2011 12:48:00', '14/07/2011 12:49:00', '14/07/2011 12:50:00', '14/07/2011 12:51:00', 
'14/07/2011 12:52:00', '14/07/2011 12:53:00', '14/07/2011 12:54:00', '14/07/2011 12:55:00', '14/07/2011 12:56:00', '14/07/2011 12:57:00', '14/07/2011 12:58:00', '14/07/2011 12:59:00', '14/07/2011 13:00:00', '14/07/2011 13:01:00', '14/07/2011 13:02:00', '14/07/2011 13:03:00', '14/07/2011 13:04:00', '14/07/2011 13:05:00', '14/07/2011 13:06:00', '14/07/2011 13:07:00', '14/07/2011 13:08:00', '14/07/2011 13:09:00', '14/07/2011 13:10:00', '14/07/2011 13:11:00', '14/07/2011 13:12:00', '14/07/2011 13:13:00', '14/07/2011 13:14:00', '14/07/2011 13:15:00', '14/07/2011 13:16:00', '14/07/2011 13:17:00', '14/07/2011 13:18:00', '14/07/2011 13:19:00', '14/07/2011 13:20:00', '14/07/2011 13:21:00', '14/07/2011 13:22:00', '14/07/2011 13:23:00', '14/07/2011 13:24:00', '14/07/2011 13:25:00', '14/07/2011 13:26:00', '14/07/2011 13:27:00', '14/07/2011 13:28:00', '14/07/2011 13:29:00', '14/07/2011 13:30:00', '14/07/2011 13:31:00', '14/07/2011 13:32:00', '14/07/2011 13:33:00', '14/07/2011 13:34:00', '14/07/2011 13:35:00', '14/07/2011 13:36:00', '14/07/2011 13:37:00', '14/07/2011 13:38:00', '14/07/2011 13:39:00', '14/07/2011 13:40:00', '14/07/2011 13:41:00', '14/07/2011 13:42:00', '14/07/2011 13:43:00', '14/07/2011 13:44:00', '14/07/2011 13:45:00', '14/07/2011 13:46:00', '14/07/2011 13:47:00', '14/07/2011 13:48:00', '14/07/2011 13:49:00', '14/07/2011 13:50:00', '14/07/2011 13:51:00', '14/07/2011 13:52:00', '14/07/2011 13:53:00', '14/07/2011 13:54:00', '14/07/2011 13:55:00', '14/07/2011 13:56:00', '14/07/2011 13:57:00', '14/07/2011 13:58:00', '14/07/2011 13:59:00', '14/07/2011 14:00:00', '14/07/2011 14:01:00', '14/07/2011 14:02:00', '14/07/2011 14:03:00', '14/07/2011 14:04:00', '14/07/2011 14:05:00', '14/07/2011 14:06:00', '14/07/2011 14:07:00', '14/07/2011 14:08:00', '14/07/2011 14:09:00', '14/07/2011 14:10:00', '14/07/2011 14:11:00', '14/07/2011 14:12:00', '14/07/2011 14:13:00', '14/07/2011 14:14:00', '14/07/2011 14:15:00', '14/07/2011 14:16:00', '14/07/2011 14:17:00', '14/07/2011 
14:18:00', '14/07/2011 14:19:00', '14/07/2011 14:20:00', '14/07/2011 14:21:00', '14/07/2011 14:22:00', '14/07/2011 14:23:00', '14/07/2011 14:24:00', '14/07/2011 14:25:00', '14/07/2011 14:26:00', '14/07/2011 14:27:00', '14/07/2011 14:28:00', '14/07/2011 14:29:00', '14/07/2011 14:30:00', '14/07/2011 14:31:00', '14/07/2011 14:32:00', '14/07/2011 14:33:00', '14/07/2011 14:34:00', '14/07/2011 14:35:00', '14/07/2011 14:36:00', '14/07/2011 14:37:00', '14/07/2011 14:38:00', '14/07/2011 14:39:00', '14/07/2011 14:40:00', '14/07/2011 14:41:00', '14/07/2011 14:42:00', '14/07/2011 14:43:00', '14/07/2011 14:44:00', '14/07/2011 14:45:00', '14/07/2011 14:46:00', '14/07/2011 14:47:00', '14/07/2011 14:48:00', '14/07/2011 14:49:00', '14/07/2011 14:50:00', '14/07/2011 14:51:00', '14/07/2011 14:52:00', '14/07/2011 14:53:00', '14/07/2011 14:54:00', '14/07/2011 14:55:00', '14/07/2011 14:56:00', '14/07/2011 14:57:00', '14/07/2011 14:58:00', '14/07/2011 14:59:00', '14/07/2011 15:00:00', '14/07/2011 15:01:00', '14/07/2011 15:02:00', '14/07/2011 15:03:00', '14/07/2011 15:04:00', '14/07/2011 15:05:00', '14/07/2011 15:06:00', '14/07/2011 15:07:00', '14/07/2011 15:08:00', '14/07/2011 15:09:00', '14/07/2011 15:10:00', '14/07/2011 15:11:00', '14/07/2011 15:12:00', '14/07/2011 15:13:00', '14/07/2011 15:14:00', '14/07/2011 15:15:00', '14/07/2011 15:16:00', '14/07/2011 15:17:00', '14/07/2011 15:18:00', '14/07/2011 15:19:00', '14/07/2011 15:20:00', '14/07/2011 15:21:00', '14/07/2011 15:22:00', '14/07/2011 15:23:00', '14/07/2011 15:24:00', '14/07/2011 15:25:00', '14/07/2011 15:26:00', '14/07/2011 15:27:00', '14/07/2011 15:28:00', '14/07/2011 15:29:00', '14/07/2011 15:30:00', '14/07/2011 15:31:00', '14/07/2011 15:32:00', '14/07/2011 15:33:00', '14/07/2011 15:34:00', '14/07/2011 15:35:00', '14/07/2011 15:36:00', '14/07/2011 15:37:00', '14/07/2011 15:38:00', '14/07/2011 15:39:00', '14/07/2011 15:40:00', '14/07/2011 15:41:00', '14/07/2011 15:42:00', '14/07/2011 15:43:00', '14/07/2011 15:44:00', 
'14/07/2011 15:45:00', '14/07/2011 15:46:00', '14/07/2011 15:47:00', '14/07/2011 15:48:00', '14/07/2011 15:49:00', '14/07/2011 15:50:00', '14/07/2011 15:51:00', '14/07/2011 15:52:00', '14/07/2011 15:53:00', '14/07/2011 15:54:00', '14/07/2011 15:55:00', '14/07/2011 15:56:00', '14/07/2011 15:57:00', '14/07/2011 15:58:00', '14/07/2011 15:59:00', '14/07/2011 16:00:00', '14/07/2011 16:01:00', '14/07/2011 16:02:00', '14/07/2011 16:03:00', '14/07/2011 16:04:00', '14/07/2011 16:05:00', '14/07/2011 16:06:00', '14/07/2011 16:07:00', '14/07/2011 16:08:00', '14/07/2011 16:09:00', '14/07/2011 16:10:00', '14/07/2011 16:11:00', '14/07/2011 16:12:00', '14/07/2011 16:13:00', '14/07/2011 16:14:00', '14/07/2011 16:15:00', '14/07/2011 16:16:00', '14/07/2011 16:17:00', '14/07/2011 16:18:00', '14/07/2011 16:19:00', '14/07/2011 16:20:00', '14/07/2011 16:21:00', '14/07/2011 16:22:00', '14/07/2011 16:23:00', '14/07/2011 16:24:00', '14/07/2011 16:25:00', '14/07/2011 16:26:00', '14/07/2011 16:27:00', '14/07/2011 16:28:00', '14/07/2011 16:29:00', '14/07/2011 16:30:00', '14/07/2011 16:31:00', '14/07/2011 16:32:00', '14/07/2011 16:33:00', '14/07/2011 16:34:00', '14/07/2011 16:35:00', '14/07/2011 16:36:00', '14/07/2011 16:37:00', '14/07/2011 16:38:00', '14/07/2011 16:39:00', '14/07/2011 16:40:00', '14/07/2011 16:41:00', '14/07/2011 16:42:00', '14/07/2011 16:43:00', '14/07/2011 16:44:00', '14/07/2011 16:45:00', '14/07/2011 16:46:00', '14/07/2011 16:47:00', '14/07/2011 16:48:00', '14/07/2011 16:49:00', '14/07/2011 16:50:00', '14/07/2011 16:51:00', '14/07/2011 16:52:00', '14/07/2011 16:53:00', '14/07/2011 16:54:00', '14/07/2011 16:55:00', '14/07/2011 16:56:00', '14/07/2011 16:57:00', '14/07/2011 16:58:00', '14/07/2011 16:59:00', '14/07/2011 17:00:00', '14/07/2011 17:01:00', '14/07/2011 17:02:00', '14/07/2011 17:03:00', '14/07/2011 17:04:00', '14/07/2011 17:05:00', '14/07/2011 17:06:00', '14/07/2011 17:07:00', '14/07/2011 17:08:00', '14/07/2011 17:09:00', '14/07/2011 17:10:00', '14/07/2011 
17:11:00', '14/07/2011 17:12:00', '14/07/2011 17:13:00', '14/07/2011 17:14:00', '14/07/2011 17:15:00', '14/07/2011 17:16:00', '14/07/2011 17:17:00', '14/07/2011 17:18:00', '14/07/2011 17:19:00', '14/07/2011 17:20:00', '14/07/2011 17:21:00', '14/07/2011 17:22:00', '14/07/2011 17:23:00', '14/07/2011 17:24:00', '14/07/2011 17:25:00', '14/07/2011 17:26:00', '14/07/2011 17:27:00', '14/07/2011 17:28:00', '14/07/2011 17:29:00', '14/07/2011 17:30:00', '14/07/2011 17:31:00', '14/07/2011 17:32:00', '14/07/2011 17:33:00', '14/07/2011 17:34:00', '14/07/2011 17:35:00', '14/07/2011 17:36:00', '14/07/2011 17:37:00', '14/07/2011 17:38:00', '14/07/2011 17:39:00', '14/07/2011 17:40:00', '14/07/2011 17:41:00', '14/07/2011 17:42:00', '14/07/2011 17:43:00', '14/07/2011 17:44:00', '14/07/2011 17:45:00', '14/07/2011 17:46:00', '14/07/2011 17:47:00', '14/07/2011 17:48:00', '14/07/2011 17:49:00', '14/07/2011 17:50:00', '14/07/2011 17:51:00', '14/07/2011 17:52:00', '14/07/2011 17:53:00', '14/07/2011 17:54:00', '14/07/2011 17:55:00', '14/07/2011 17:56:00', '14/07/2011 17:57:00', '14/07/2011 17:58:00', '14/07/2011 17:59:00', '14/07/2011 18:00:00', '14/07/2011 18:01:00', '14/07/2011 18:02:00', '14/07/2011 18:03:00', '14/07/2011 18:04:00', '14/07/2011 18:05:00', '14/07/2011 18:06:00', '14/07/2011 18:07:00', '14/07/2011 18:08:00', '14/07/2011 18:09:00', '14/07/2011 18:10:00', '14/07/2011 18:11:00', '14/07/2011 18:12:00', '14/07/2011 18:13:00', '14/07/2011 18:14:00', '14/07/2011 18:15:00', '14/07/2011 18:16:00', '14/07/2011 18:17:00', '14/07/2011 18:18:00', '14/07/2011 18:19:00', '14/07/2011 18:20:00', '14/07/2011 18:21:00', '14/07/2011 18:22:00', '14/07/2011 18:23:00', '14/07/2011 18:24:00', '14/07/2011 18:25:00', '14/07/2011 18:26:00', '14/07/2011 18:27:00', '14/07/2011 18:28:00', '14/07/2011 18:29:00', '14/07/2011 18:30:00', '14/07/2011 18:31:00', '14/07/2011 18:32:00', '14/07/2011 18:33:00', '14/07/2011 18:34:00', '14/07/2011 18:35:00', '14/07/2011 18:36:00', '14/07/2011 18:37:00', 
'14/07/2011 18:38:00', '14/07/2011 18:39:00', '14/07/2011 18:40:00', '14/07/2011 18:41:00', '14/07/2011 18:42:00', '14/07/2011 18:43:00', '14/07/2011 18:44:00', '14/07/2011 18:45:00', '14/07/2011 18:46:00', '14/07/2011 18:47:00', '14/07/2011 18:48:00', '14/07/2011 18:49:00', '14/07/2011 18:50:00', '14/07/2011 18:51:00', '14/07/2011 18:52:00', '14/07/2011 18:53:00', '14/07/2011 18:54:00', '14/07/2011 18:55:00', '14/07/2011 18:56:00', '14/07/2011 18:57:00', '14/07/2011 18:58:00', '14/07/2011 18:59:00', '14/07/2011 19:00:00', '14/07/2011 19:01:00', '14/07/2011 19:02:00', '14/07/2011 19:03:00', '14/07/2011 19:04:00', '14/07/2011 19:05:00', '14/07/2011 19:06:00', '14/07/2011 19:07:00', '14/07/2011 19:08:00', '14/07/2011 19:09:00', '14/07/2011 19:10:00', '14/07/2011 19:11:00', '14/07/2011 19:12:00', '14/07/2011 19:13:00', '14/07/2011 19:14:00', '14/07/2011 19:15:00', '14/07/2011 19:16:00', '14/07/2011 19:17:00', '14/07/2011 19:18:00', '14/07/2011 19:19:00', '14/07/2011 19:20:00', '14/07/2011 19:21:00', '14/07/2011 19:22:00', '14/07/2011 19:23:00', '14/07/2011 19:24:00', '14/07/2011 19:25:00', '14/07/2011 19:26:00', '14/07/2011 19:27:00', '14/07/2011 19:28:00', '14/07/2011 19:29:00', '14/07/2011 19:30:00', '14/07/2011 19:31:00', '14/07/2011 19:32:00', '14/07/2011 19:33:00', '14/07/2011 19:34:00', '14/07/2011 19:35:00', '14/07/2011 19:36:00', '14/07/2011 19:37:00', '14/07/2011 19:38:00', '14/07/2011 19:39:00', '14/07/2011 19:40:00', '14/07/2011 19:41:00', '14/07/2011 19:42:00', '14/07/2011 19:43:00', '14/07/2011 19:44:00', '14/07/2011 19:45:00', '14/07/2011 19:46:00', '14/07/2011 19:47:00', '14/07/2011 19:48:00', '14/07/2011 19:49:00', '14/07/2011 19:50:00', '14/07/2011 19:51:00', '14/07/2011 19:52:00', '14/07/2011 19:53:00', '14/07/2011 19:54:00', '14/07/2011 19:55:00', '14/07/2011 19:56:00', '14/07/2011 19:57:00', '14/07/2011 19:58:00', '14/07/2011 19:59:00', '14/07/2011 20:00:00', '14/07/2011 20:01:00', '14/07/2011 20:02:00', '14/07/2011 20:03:00', '14/07/2011 
20:04:00', '14/07/2011 20:05:00', '14/07/2011 20:06:00', '14/07/2011 20:07:00', '14/07/2011 20:08:00', '14/07/2011 20:09:00', '14/07/2011 20:10:00', '14/07/2011 20:11:00', '14/07/2011 20:12:00', '14/07/2011 20:13:00', '14/07/2011 20:14:00', '14/07/2011 20:15:00', '14/07/2011 20:16:00', '14/07/2011 20:17:00', '14/07/2011 20:18:00', '14/07/2011 20:19:00', '14/07/2011 20:20:00', '14/07/2011 20:21:00', '14/07/2011 20:22:00', '14/07/2011 20:23:00', '14/07/2011 20:24:00', '14/07/2011 20:25:00', '14/07/2011 20:26:00', '14/07/2011 20:27:00', '14/07/2011 20:28:00', '14/07/2011 20:29:00', '14/07/2011 20:30:00', '14/07/2011 20:31:00', '14/07/2011 20:32:00', '14/07/2011 20:33:00', '14/07/2011 20:34:00', '14/07/2011 20:35:00', '14/07/2011 20:36:00', '14/07/2011 20:37:00', '14/07/2011 20:38:00', '14/07/2011 20:39:00', '14/07/2011 20:40:00', '14/07/2011 20:41:00', '14/07/2011 20:42:00', '14/07/2011 20:43:00', '14/07/2011 20:44:00', '14/07/2011 20:45:00', '14/07/2011 20:46:00', '14/07/2011 20:47:00', '14/07/2011 20:48:00', '14/07/2011 20:49:00', '14/07/2011 20:50:00', '14/07/2011 20:51:00', '14/07/2011 20:52:00', '14/07/2011 20:53:00', '14/07/2011 20:54:00', '14/07/2011 20:55:00', '14/07/2011 20:56:00', '14/07/2011 20:57:00', '14/07/2011 20:58:00', '14/07/2011 20:59:00', '14/07/2011 21:00:00', '14/07/2011 21:01:00', '14/07/2011 21:02:00', '14/07/2011 21:03:00', '14/07/2011 21:04:00', '14/07/2011 21:05:00', '14/07/2011 21:06:00', '14/07/2011 21:07:00', '14/07/2011 21:08:00', '14/07/2011 21:09:00', '14/07/2011 21:10:00', '14/07/2011 21:11:00', '14/07/2011 21:12:00', '14/07/2011 21:13:00', '14/07/2011 21:14:00', '14/07/2011 21:15:00', '14/07/2011 21:16:00', '14/07/2011 21:17:00', '14/07/2011 21:18:00', '14/07/2011 21:19:00', '14/07/2011 21:20:00', '14/07/2011 21:21:00', '14/07/2011 21:22:00', '14/07/2011 21:23:00', '14/07/2011 21:24:00', '14/07/2011 21:25:00', '14/07/2011 21:26:00', '14/07/2011 21:27:00', '14/07/2011 21:28:00', '14/07/2011 21:29:00', '14/07/2011 21:30:00', 
'14/07/2011 21:31:00', '14/07/2011 21:32:00', '14/07/2011 21:33:00', '14/07/2011 21:34:00', '14/07/2011 21:35:00', '14/07/2011 21:36:00', '14/07/2011 21:37:00', '14/07/2011 21:38:00', '14/07/2011 21:39:00', '14/07/2011 21:40:00', '14/07/2011 21:41:00', '14/07/2011 21:42:00', '14/07/2011 21:43:00', '14/07/2011 21:44:00', '14/07/2011 21:45:00', '14/07/2011 21:46:00', '14/07/2011 21:47:00', '14/07/2011 21:48:00', '14/07/2011 21:49:00', '14/07/2011 21:50:00', '14/07/2011 21:51:00', '14/07/2011 21:52:00', '14/07/2011 21:53:00', '14/07/2011 21:54:00', '14/07/2011 21:55:00', '14/07/2011 21:56:00', '14/07/2011 21:57:00', '14/07/2011 21:58:00', '14/07/2011 21:59:00', '14/07/2011 22:00:00', '14/07/2011 22:01:00', '14/07/2011 22:02:00', '14/07/2011 22:03:00', '14/07/2011 22:04:00', '14/07/2011 22:05:00', '14/07/2011 22:06:00', '14/07/2011 22:07:00', '14/07/2011 22:08:00', '14/07/2011 22:09:00', '14/07/2011 22:10:00', '14/07/2011 22:11:00', '14/07/2011 22:12:00', '14/07/2011 22:13:00', '14/07/2011 22:14:00', '14/07/2011 22:15:00', '14/07/2011 22:16:00', '14/07/2011 22:17:00', '14/07/2011 22:18:00', '14/07/2011 22:19:00', '14/07/2011 22:20:00', '14/07/2011 22:21:00', '14/07/2011 22:22:00', '14/07/2011 22:23:00', '14/07/2011 22:24:00', '14/07/2011 22:25:00', '14/07/2011 22:26:00', '14/07/2011 22:27:00', '14/07/2011 22:28:00', '14/07/2011 22:29:00', '14/07/2011 22:30:00', '14/07/2011 22:31:00', '14/07/2011 22:32:00', '14/07/2011 22:33:00', '14/07/2011 22:34:00', '14/07/2011 22:35:00', '14/07/2011 22:36:00', '14/07/2011 22:37:00', '14/07/2011 22:38:00', '14/07/2011 22:39:00', '14/07/2011 22:40:00', '14/07/2011 22:41:00', '14/07/2011 22:42:00', '14/07/2011 22:43:00', '14/07/2011 22:44:00', '14/07/2011 22:45:00', '14/07/2011 22:46:00', '14/07/2011 22:47:00', '14/07/2011 22:48:00', '14/07/2011 22:49:00', '14/07/2011 22:50:00', '14/07/2011 22:51:00', '14/07/2011 22:52:00', '14/07/2011 22:53:00', '14/07/2011 22:54:00', '14/07/2011 22:55:00', '14/07/2011 22:56:00', '14/07/2011 
22:57:00', '14/07/2011 22:58:00', '14/07/2011 22:59:00', '14/07/2011 23:00:00', '14/07/2011 23:01:00', '14/07/2011 23:02:00', '14/07/2011 23:03:00', '14/07/2011 23:04:00', '14/07/2011 23:05:00', '14/07/2011 23:06:00', '14/07/2011 23:07:00', '14/07/2011 23:08:00', '14/07/2011 23:09:00', '14/07/2011 23:10:00', '14/07/2011 23:11:00', '14/07/2011 23:12:00', '14/07/2011 23:13:00', '14/07/2011 23:14:00', '14/07/2011 23:15:00', '14/07/2011 23:16:00', '14/07/2011 23:17:00', '14/07/2011 23:18:00', '14/07/2011 23:19:00', '14/07/2011 23:20:00', '14/07/2011 23:21:00', '14/07/2011 23:22:00', '14/07/2011 23:23:00', '14/07/2011 23:24:00', '14/07/2011 23:25:00', '14/07/2011 23:26:00', '14/07/2011 23:27:00', '14/07/2011 23:28:00', '14/07/2011 23:29:00', '14/07/2011 23:30:00', '14/07/2011 23:31:00', '14/07/2011 23:32:00', '14/07/2011 23:33:00', '14/07/2011 23:34:00', '14/07/2011 23:35:00', '14/07/2011 23:36:00', '14/07/2011 23:37:00', '14/07/2011 23:38:00', '14/07/2011 23:39:00', '14/07/2011 23:40:00', '14/07/2011 23:41:00', '14/07/2011 23:42:00', '14/07/2011 23:43:00', '14/07/2011 23:44:00', '14/07/2011 23:45:00', '14/07/2011 23:46:00', '14/07/2011 23:47:00', '14/07/2011 23:48:00', '14/07/2011 23:49:00', '14/07/2011 23:50:00', '14/07/2011 23:51:00', '14/07/2011 23:52:00', '14/07/2011 23:53:00', '14/07/2011 23:54:00', '14/07/2011 23:55:00', '14/07/2011 23:56:00', '14/07/2011 23:57:00', '14/07/2011 23:58:00', '14/07/2011 23:59:00', '15/07/2011 0:00:00', '15/07/2011 0:01:00', '15/07/2011 0:02:00', '15/07/2011 0:03:00', '15/07/2011 0:04:00', '15/07/2011 0:05:00', '15/07/2011 0:06:00', '15/07/2011 0:07:00', '15/07/2011 0:08:00', '15/07/2011 0:09:00', '15/07/2011 0:10:00', '15/07/2011 0:11:00', '15/07/2011 0:12:00', '15/07/2011 0:13:00', '15/07/2011 0:14:00', '15/07/2011 0:15:00', '15/07/2011 0:16:00', '15/07/2011 0:17:00', '15/07/2011 0:18:00', '15/07/2011 0:19:00', '15/07/2011 0:20:00', '15/07/2011 0:21:00', '15/07/2011 0:22:00', '15/07/2011 0:23:00', '15/07/2011 0:24:00', '15/07/2011 
0:25:00', '15/07/2011 0:26:00', '15/07/2011 0:27:00', '15/07/2011 0:28:00', '15/07/2011 0:29:00', '15/07/2011 0:30:00', '15/07/2011 0:31:00', '15/07/2011 0:32:00', '15/07/2011 0:33:00', '15/07/2011 0:34:00', '15/07/2011 0:35:00', '15/07/2011 0:36:00', '15/07/2011 0:37:00', '15/07/2011 0:38:00', '15/07/2011 0:39:00', '15/07/2011 0:40:00', '15/07/2011 0:41:00', '15/07/2011 0:42:00', '15/07/2011 0:43:00', '15/07/2011 0:44:00', '15/07/2011 0:45:00', '15/07/2011 0:46:00', '15/07/2011 0:47:00', '15/07/2011 0:48:00', '15/07/2011 0:49:00', '15/07/2011 0:50:00', '15/07/2011 0:51:00', '15/07/2011 0:52:00', '15/07/2011 0:53:00', '15/07/2011 0:54:00', '15/07/2011 0:55:00', '15/07/2011 0:56:00', '15/07/2011 0:57:00', '15/07/2011 0:58:00', '15/07/2011 0:59:00', '15/07/2011 1:00:00', '15/07/2011 1:01:00', '15/07/2011 1:02:00', '15/07/2011 1:03:00', '15/07/2011 1:04:00', '15/07/2011 1:05:00', '15/07/2011 1:06:00', '15/07/2011 1:07:00', '15/07/2011 1:08:00', '15/07/2011 1:09:00', '15/07/2011 1:10:00', '15/07/2011 1:11:00', '15/07/2011 1:12:00', '15/07/2011 1:13:00', '15/07/2011 1:14:00', '15/07/2011 1:15:00', '15/07/2011 1:16:00', '15/07/2011 1:17:00', '15/07/2011 1:18:00', '15/07/2011 1:19:00', '15/07/2011 1:20:00', '15/07/2011 1:21:00', '15/07/2011 1:22:00', '15/07/2011 1:23:00', '15/07/2011 1:24:00', '15/07/2011 1:25:00', '15/07/2011 1:26:00', '15/07/2011 1:27:00', '15/07/2011 1:28:00', '15/07/2011 1:29:00', '15/07/2011 1:30:00', '15/07/2011 1:31:00', '15/07/2011 1:32:00', '15/07/2011 1:33:00', '15/07/2011 1:34:00', '15/07/2011 1:35:00', '15/07/2011 1:36:00', '15/07/2011 1:37:00', '15/07/2011 1:38:00', '15/07/2011 1:39:00', '15/07/2011 1:40:00', '15/07/2011 1:41:00', '15/07/2011 1:42:00', '15/07/2011 1:43:00', '15/07/2011 1:44:00', '15/07/2011 1:45:00', '15/07/2011 1:46:00', '15/07/2011 1:47:00', '15/07/2011 1:48:00', '15/07/2011 1:49:00', '15/07/2011 1:50:00', '15/07/2011 1:51:00', '15/07/2011 1:52:00', '15/07/2011 1:53:00', '15/07/2011 1:54:00', '15/07/2011 1:55:00', 
'15/07/2011 1:56:00', '15/07/2011 1:57:00', '15/07/2011 1:58:00', '15/07/2011 1:59:00', '15/07/2011 2:00:00', '15/07/2011 2:01:00', '15/07/2011 2:02:00', '15/07/2011 2:03:00', '15/07/2011 2:04:00', '15/07/2011 2:05:00', '15/07/2011 2:06:00', '15/07/2011 2:07:00', '15/07/2011 2:08:00', '15/07/2011 2:09:00', '15/07/2011 2:10:00', '15/07/2011 2:11:00', '15/07/2011 2:12:00', '15/07/2011 2:13:00', '15/07/2011 2:14:00', '15/07/2011 2:15:00', '15/07/2011 2:16:00', '15/07/2011 2:17:00', '15/07/2011 2:18:00', '15/07/2011 2:19:00', '15/07/2011 2:20:00', '15/07/2011 2:21:00', '15/07/2011 2:22:00', '15/07/2011 2:23:00', '15/07/2011 2:24:00', '15/07/2011 2:25:00', '15/07/2011 2:26:00', '15/07/2011 2:27:00', '15/07/2011 2:28:00', '15/07/2011 2:29:00', '15/07/2011 2:30:00', '15/07/2011 2:31:00', '15/07/2011 2:32:00', '15/07/2011 2:33:00', '15/07/2011 2:34:00', '15/07/2011 2:35:00', '15/07/2011 2:36:00', '15/07/2011 2:37:00', '15/07/2011 2:38:00', '15/07/2011 2:39:00', '15/07/2011 2:40:00', '15/07/2011 2:41:00', '15/07/2011 2:42:00', '15/07/2011 2:43:00', '15/07/2011 2:44:00', '15/07/2011 2:45:00', '15/07/2011 2:46:00', '15/07/2011 2:47:00', '15/07/2011 2:48:00', '15/07/2011 2:49:00', '15/07/2011 2:50:00', '15/07/2011 2:51:00', '15/07/2011 2:52:00', '15/07/2011 2:53:00', '15/07/2011 2:54:00', '15/07/2011 2:55:00', '15/07/2011 2:56:00', '15/07/2011 2:57:00', '15/07/2011 2:58:00', '15/07/2011 2:59:00', '15/07/2011 3:00:00', '15/07/2011 3:01:00', '15/07/2011 3:02:00', '15/07/2011 3:03:00', '15/07/2011 3:04:00', '15/07/2011 3:05:00', '15/07/2011 3:06:00', '15/07/2011 3:07:00', '15/07/2011 3:08:00', '15/07/2011 3:09:00', '15/07/2011 3:10:00', '15/07/2011 3:11:00', '15/07/2011 3:12:00', '15/07/2011 3:13:00', '15/07/2011 3:14:00', '15/07/2011 3:15:00', '15/07/2011 3:16:00', '15/07/2011 3:17:00', '15/07/2011 3:18:00', '15/07/2011 3:19:00', '15/07/2011 3:20:00', '15/07/2011 3:21:00', '15/07/2011 3:22:00', '15/07/2011 3:23:00', '15/07/2011 3:24:00', '15/07/2011 3:25:00', '15/07/2011 
3:26:00', '15/07/2011 3:27:00', '15/07/2011 3:28:00', '15/07/2011 3:29:00', '15/07/2011 3:30:00', '15/07/2011 3:31:00', '15/07/2011 3:32:00', '15/07/2011 3:33:00', '15/07/2011 3:34:00', '15/07/2011 3:35:00', '15/07/2011 3:36:00', '15/07/2011 3:37:00', '15/07/2011 3:38:00', '15/07/2011 3:39:00', '15/07/2011 3:40:00', '15/07/2011 3:41:00', '15/07/2011 3:42:00', '15/07/2011 3:43:00', '15/07/2011 3:44:00', '15/07/2011 3:45:00', '15/07/2011 3:46:00', '15/07/2011 3:47:00', '15/07/2011 3:48:00', '15/07/2011 3:49:00', '15/07/2011 3:50:00', '15/07/2011 3:51:00', '15/07/2011 3:52:00', '15/07/2011 3:53:00', '15/07/2011 3:54:00', '15/07/2011 3:55:00', '15/07/2011 3:56:00', '15/07/2011 3:57:00', '15/07/2011 3:58:00', '15/07/2011 3:59:00', '15/07/2011 4:00:00', '15/07/2011 4:01:00', '15/07/2011 4:02:00', '15/07/2011 4:03:00', '15/07/2011 4:04:00', '15/07/2011 4:05:00', '15/07/2011 4:06:00', '15/07/2011 4:07:00', '15/07/2011 4:08:00', '15/07/2011 4:09:00', '15/07/2011 4:10:00', '15/07/2011 4:11:00', '15/07/2011 4:12:00', '15/07/2011 4:13:00', '15/07/2011 4:14:00', '15/07/2011 4:15:00', '15/07/2011 4:16:00', '15/07/2011 4:17:00', '15/07/2011 4:18:00', '15/07/2011 4:19:00', '15/07/2011 4:20:00', '15/07/2011 4:21:00', '15/07/2011 4:22:00', '15/07/2011 4:23:00', '15/07/2011 4:24:00', '15/07/2011 4:25:00', '15/07/2011 4:26:00', '15/07/2011 4:27:00', '15/07/2011 4:28:00', '15/07/2011 4:29:00', '15/07/2011 4:30:00', '15/07/2011 4:31:00', '15/07/2011 4:32:00', '15/07/2011 4:33:00', '15/07/2011 4:34:00', '15/07/2011 4:35:00', '15/07/2011 4:36:00', '15/07/2011 4:37:00', '15/07/2011 4:38:00', '15/07/2011 4:39:00', '15/07/2011 4:40:00', '15/07/2011 4:41:00', '15/07/2011 4:42:00', '15/07/2011 4:43:00', '15/07/2011 4:44:00', '15/07/2011 4:45:00', '15/07/2011 4:46:00', '15/07/2011 4:47:00', '15/07/2011 4:48:00', '15/07/2011 4:49:00', '15/07/2011 4:50:00', '15/07/2011 4:51:00', '15/07/2011 4:52:00', '15/07/2011 4:53:00', '15/07/2011 4:54:00', '15/07/2011 4:55:00', '15/07/2011 4:56:00', 
'15/07/2011 4:57:00', '15/07/2011 4:58:00', '15/07/2011 4:59:00', '15/07/2011 5:00:00', '15/07/2011 5:01:00', '15/07/2011 5:02:00', '15/07/2011 5:03:00', '15/07/2011 5:04:00', '15/07/2011 5:05:00', '15/07/2011 5:06:00', '15/07/2011 5:07:00', '15/07/2011 5:08:00', '15/07/2011 5:09:00', '15/07/2011 5:10:00', '15/07/2011 5:11:00', '15/07/2011 5:12:00', '15/07/2011 5:13:00', '15/07/2011 5:14:00', '15/07/2011 5:15:00', '15/07/2011 5:16:00', '15/07/2011 5:17:00', '15/07/2011 5:18:00', '15/07/2011 5:19:00', '15/07/2011 5:20:00', '15/07/2011 5:21:00', '15/07/2011 5:22:00', '15/07/2011 5:23:00', '15/07/2011 5:24:00', '15/07/2011 5:25:00', '15/07/2011 5:26:00', '15/07/2011 5:27:00', '15/07/2011 5:28:00', '15/07/2011 5:29:00', '15/07/2011 5:30:00', '15/07/2011 5:31:00', '15/07/2011 5:32:00', '15/07/2011 5:33:00', '15/07/2011 5:34:00', '15/07/2011 5:35:00', '15/07/2011 5:36:00', '15/07/2011 5:37:00', '15/07/2011 5:38:00', '15/07/2011 5:39:00', '15/07/2011 5:40:00', '15/07/2011 5:41:00', '15/07/2011 5:42:00', '15/07/2011 5:43:00', '15/07/2011 5:44:00', '15/07/2011 5:45:00', '15/07/2011 5:46:00', '15/07/2011 5:47:00', '15/07/2011 5:48:00', '15/07/2011 5:49:00', '15/07/2011 5:50:00', '15/07/2011 5:51:00', '15/07/2011 5:52:00', '15/07/2011 5:53:00', '15/07/2011 5:54:00', '15/07/2011 5:55:00', '15/07/2011 5:56:00', '15/07/2011 5:57:00', '15/07/2011 5:58:00', '15/07/2011 5:59:00', '15/07/2011 6:00:00', '15/07/2011 6:01:00', '15/07/2011 6:02:00', '15/07/2011 6:03:00', '15/07/2011 6:04:00', '15/07/2011 6:05:00', '15/07/2011 6:06:00', '15/07/2011 6:07:00', '15/07/2011 6:08:00', '15/07/2011 6:09:00', '15/07/2011 6:10:00', '15/07/2011 6:11:00', '15/07/2011 6:12:00', '15/07/2011 6:13:00', '15/07/2011 6:14:00', '15/07/2011 6:15:00', '15/07/2011 6:16:00', '15/07/2011 6:17:00', '15/07/2011 6:18:00', '15/07/2011 6:19:00', '15/07/2011 6:20:00', '15/07/2011 6:21:00', '15/07/2011 6:22:00', '15/07/2011 6:23:00', '15/07/2011 6:24:00', '15/07/2011 6:25:00', '15/07/2011 6:26:00', '15/07/2011 
6:27:00', '15/07/2011 6:28:00', '15/07/2011 6:29:00', '15/07/2011 6:30:00', '15/07/2011 6:31:00', '15/07/2011 6:32:00', '15/07/2011 6:33:00', '15/07/2011 6:34:00', '15/07/2011 6:35:00', '15/07/2011 6:36:00', '15/07/2011 6:37:00', '15/07/2011 6:38:00', '15/07/2011 6:39:00', '15/07/2011 6:40:00', '15/07/2011 6:41:00', '15/07/2011 6:42:00', '15/07/2011 6:43:00', '15/07/2011 6:44:00', '15/07/2011 6:45:00', '15/07/2011 6:46:00', '15/07/2011 6:47:00', '15/07/2011 6:48:00', '15/07/2011 6:49:00', '15/07/2011 6:50:00', '15/07/2011 6:51:00', '15/07/2011 6:52:00', '15/07/2011 6:53:00', '15/07/2011 6:54:00', '15/07/2011 6:55:00', '15/07/2011 6:56:00', '15/07/2011 6:57:00', '15/07/2011 6:58:00', '15/07/2011 6:59:00', '15/07/2011 7:00:00', '15/07/2011 7:01:00', '15/07/2011 7:02:00', '15/07/2011 7:03:00', '15/07/2011 7:04:00', '15/07/2011 7:05:00', '15/07/2011 7:06:00', '15/07/2011 7:07:00', '15/07/2011 7:08:00', '15/07/2011 7:09:00', '15/07/2011 7:10:00', '15/07/2011 7:11:00', '15/07/2011 7:12:00', '15/07/2011 7:13:00', '15/07/2011 7:14:00', '15/07/2011 7:15:00', '15/07/2011 7:16:00', '15/07/2011 7:17:00', '15/07/2011 7:18:00', '15/07/2011 7:19:00', '15/07/2011 7:20:00', '15/07/2011 7:21:00', '15/07/2011 7:22:00', '15/07/2011 7:23:00', '15/07/2011 7:24:00', '15/07/2011 7:25:00', '15/07/2011 7:26:00', '15/07/2011 7:27:00', '15/07/2011 7:28:00', '15/07/2011 7:29:00', '15/07/2011 7:30:00', '15/07/2011 7:31:00', '15/07/2011 7:32:00', '15/07/2011 7:33:00', '15/07/2011 7:34:00', '15/07/2011 7:35:00', '15/07/2011 7:36:00', '15/07/2011 7:37:00', '15/07/2011 7:38:00', '15/07/2011 7:39:00', '15/07/2011 7:40:00', '15/07/2011 7:41:00', '15/07/2011 7:42:00', '15/07/2011 7:43:00', '15/07/2011 7:44:00', '15/07/2011 7:45:00', '15/07/2011 7:46:00', '15/07/2011 7:47:00', '15/07/2011 7:48:00', '15/07/2011 7:49:00', '15/07/2011 7:50:00', '15/07/2011 7:51:00', '15/07/2011 7:52:00', '15/07/2011 7:53:00', '15/07/2011 7:54:00', '15/07/2011 7:55:00', '15/07/2011 7:56:00', '15/07/2011 7:57:00', 
'15/07/2011 7:58:00', '15/07/2011 7:59:00', '15/07/2011 8:00:00', '15/07/2011 8:01:00', '15/07/2011 8:02:00', '15/07/2011 8:03:00', '15/07/2011 8:04:00', '15/07/2011 8:05:00', '15/07/2011 8:06:00', '15/07/2011 8:07:00', '15/07/2011 8:08:00', '15/07/2011 8:09:00', '15/07/2011 8:10:00', '15/07/2011 8:11:00', '15/07/2011 8:12:00', '15/07/2011 8:13:00', '15/07/2011 8:14:00', '15/07/2011 8:15:00', '15/07/2011 8:16:00', '15/07/2011 8:17:00', '15/07/2011 8:18:00', '15/07/2011 8:19:00', '15/07/2011 8:20:00', '15/07/2011 8:21:00', '15/07/2011 8:22:00', '15/07/2011 8:23:00', '15/07/2011 8:24:00', '15/07/2011 8:25:00', '15/07/2011 8:26:00', '15/07/2011 8:27:00', '15/07/2011 8:28:00', '15/07/2011 8:29:00', '15/07/2011 8:30:00', '15/07/2011 8:31:00', '15/07/2011 8:32:00', '15/07/2011 8:33:00', '15/07/2011 8:34:00', '15/07/2011 8:35:00', '15/07/2011 8:36:00', '15/07/2011 8:37:00', '15/07/2011 8:38:00', '15/07/2011 8:39:00', '15/07/2011 8:40:00', '15/07/2011 8:41:00', '15/07/2011 8:42:00', '15/07/2011 8:43:00', '15/07/2011 8:44:00', '15/07/2011 8:45:00', '15/07/2011 8:46:00', '15/07/2011 8:47:00', '15/07/2011 8:48:00', '15/07/2011 8:49:00', '15/07/2011 8:50:00', '15/07/2011 8:51:00', '15/07/2011 8:52:00', '15/07/2011 8:53:00', '15/07/2011 8:54:00', '15/07/2011 8:55:00', '15/07/2011 8:56:00', '15/07/2011 8:57:00', '15/07/2011 8:58:00', '15/07/2011 8:59:00', '15/07/2011 9:00:00', '15/07/2011 9:01:00', '15/07/2011 9:02:00', '15/07/2011 9:03:00', '15/07/2011 9:04:00', '15/07/2011 9:05:00', '15/07/2011 9:06:00', '15/07/2011 9:07:00', '15/07/2011 9:08:00', '15/07/2011 9:09:00', '15/07/2011 9:10:00', '15/07/2011 9:11:00', '15/07/2011 9:12:00', '15/07/2011 9:13:00', '15/07/2011 9:14:00', '15/07/2011 9:15:00', '15/07/2011 9:16:00', '15/07/2011 9:17:00', '15/07/2011 9:18:00', '15/07/2011 9:19:00', '15/07/2011 9:20:00', '15/07/2011 9:21:00', '15/07/2011 9:22:00', '15/07/2011 9:23:00', '15/07/2011 9:24:00', '15/07/2011 9:25:00', '15/07/2011 9:26:00', '15/07/2011 9:27:00', '15/07/2011 
9:28:00', '15/07/2011 9:29:00', '15/07/2011 9:30:00', '15/07/2011 9:31:00', '15/07/2011 9:32:00', '15/07/2011 9:33:00', '15/07/2011 9:34:00', '15/07/2011 9:35:00', '15/07/2011 9:36:00', '15/07/2011 9:37:00', '15/07/2011 9:38:00', '15/07/2011 9:39:00', '15/07/2011 9:40:00', '15/07/2011 9:41:00', '15/07/2011 9:42:00', '15/07/2011 9:43:00', '15/07/2011 9:44:00', '15/07/2011 9:45:00', '15/07/2011 9:46:00', '15/07/2011 9:47:00', '15/07/2011 9:48:00', '15/07/2011 9:49:00', '15/07/2011 9:50:00', '15/07/2011 9:51:00', '15/07/2011 9:52:00', '15/07/2011 9:53:00', '15/07/2011 9:54:00', '15/07/2011 9:55:00', '15/07/2011 9:56:00', '15/07/2011 9:57:00', '15/07/2011 9:58:00', '15/07/2011 9:59:00', '15/07/2011 10:00:00', '15/07/2011 10:01:00', '15/07/2011 10:02:00', '15/07/2011 10:03:00', '15/07/2011 10:04:00', '15/07/2011 10:05:00', '15/07/2011 10:06:00', '15/07/2011 10:07:00', '15/07/2011 10:08:00', '15/07/2011 10:09:00', '15/07/2011 10:10:00', '15/07/2011 10:11:00', '15/07/2011 10:12:00', '15/07/2011 10:13:00', '15/07/2011 10:14:00', '15/07/2011 10:15:00', '15/07/2011 10:16:00', '15/07/2011 10:17:00', '15/07/2011 10:18:00', '15/07/2011 10:19:00', '15/07/2011 10:20:00', '15/07/2011 10:21:00', '15/07/2011 10:22:00', '15/07/2011 10:23:00', '15/07/2011 10:24:00', '15/07/2011 10:25:00', '15/07/2011 10:26:00', '15/07/2011 10:27:00', '15/07/2011 10:28:00', '15/07/2011 10:29:00', '15/07/2011 10:30:00', '15/07/2011 10:31:00', '15/07/2011 10:32:00', '15/07/2011 10:33:00', '15/07/2011 10:34:00', '15/07/2011 10:35:00', '15/07/2011 10:36:00', '15/07/2011 10:37:00', '15/07/2011 10:38:00', '15/07/2011 10:39:00', '15/07/2011 10:40:00', '15/07/2011 10:41:00', '15/07/2011 10:42:00', '15/07/2011 10:43:00', '15/07/2011 10:44:00', '15/07/2011 10:45:00', '15/07/2011 10:46:00', '15/07/2011 10:47:00', '15/07/2011 10:48:00', '15/07/2011 10:49:00', '15/07/2011 10:50:00', '15/07/2011 10:51:00', '15/07/2011 10:52:00', '15/07/2011 10:53:00', '15/07/2011 10:54:00', '15/07/2011 10:55:00', '15/07/2011 
10:56:00', '15/07/2011 10:57:00', '15/07/2011 10:58:00', '15/07/2011 10:59:00', '15/07/2011 11:00:00', '15/07/2011 11:01:00', '15/07/2011 11:02:00', '15/07/2011 11:03:00', '15/07/2011 11:04:00', '15/07/2011 11:05:00', '15/07/2011 11:06:00', '15/07/2011 11:07:00', '15/07/2011 11:08:00', '15/07/2011 11:09:00', '15/07/2011 11:10:00', '15/07/2011 11:11:00', '15/07/2011 11:12:00', '15/07/2011 11:13:00', '15/07/2011 11:14:00', '15/07/2011 11:15:00', '15/07/2011 11:16:00', '15/07/2011 11:17:00', '15/07/2011 11:18:00', '15/07/2011 11:19:00', '15/07/2011 11:20:00', '15/07/2011 11:21:00', '15/07/2011 11:22:00', '15/07/2011 11:23:00', '15/07/2011 11:24:00', '15/07/2011 11:25:00', '15/07/2011 11:26:00', '15/07/2011 11:27:00', '15/07/2011 11:28:00', '15/07/2011 11:29:00', '15/07/2011 11:30:00', '15/07/2011 11:31:00', '15/07/2011 11:32:00', '15/07/2011 11:33:00', '15/07/2011 11:34:00', '15/07/2011 11:35:00', '15/07/2011 11:36:00', '15/07/2011 11:37:00', '15/07/2011 11:38:00', '15/07/2011 11:39:00', '15/07/2011 11:40:00', '15/07/2011 11:41:00', '15/07/2011 11:42:00', '15/07/2011 11:43:00', '15/07/2011 11:44:00', '15/07/2011 11:45:00', '15/07/2011 11:46:00', '15/07/2011 11:47:00', '15/07/2011 11:48:00', '15/07/2011 11:49:00', '15/07/2011 11:50:00', '15/07/2011 11:51:00', '15/07/2011 11:52:00', '15/07/2011 11:53:00', '15/07/2011 11:54:00', '15/07/2011 11:55:00', '15/07/2011 11:56:00', '15/07/2011 11:57:00', '15/07/2011 11:58:00', '15/07/2011 11:59:00', '15/07/2011 12:00:00', '15/07/2011 12:01:00', '15/07/2011 12:02:00', '15/07/2011 12:03:00', '15/07/2011 12:04:00', '15/07/2011 12:05:00', '15/07/2011 12:06:00', '15/07/2011 12:07:00', '15/07/2011 12:08:00', '15/07/2011 12:09:00', '15/07/2011 12:10:00', '15/07/2011 12:11:00', '15/07/2011 12:12:00', '15/07/2011 12:13:00', '15/07/2011 12:14:00', '15/07/2011 12:15:00', '15/07/2011 12:16:00', '15/07/2011 12:17:00', '15/07/2011 12:18:00', '15/07/2011 12:19:00', '15/07/2011 12:20:00', '15/07/2011 12:21:00', '15/07/2011 12:22:00', 
'15/07/2011 12:23:00', '15/07/2011 12:24:00', '15/07/2011 12:25:00', '15/07/2011 12:26:00', '15/07/2011 12:27:00', '15/07/2011 12:28:00', '15/07/2011 12:29:00', '15/07/2011 12:30:00', '15/07/2011 12:31:00', '15/07/2011 12:32:00', '15/07/2011 12:33:00', '15/07/2011 12:34:00', '15/07/2011 12:35:00', '15/07/2011 12:36:00', '15/07/2011 12:37:00', '15/07/2011 12:38:00', '15/07/2011 12:39:00', '15/07/2011 12:40:00', '15/07/2011 12:41:00', '15/07/2011 12:42:00', '15/07/2011 12:43:00', '15/07/2011 12:44:00', '15/07/2011 12:45:00', '15/07/2011 12:46:00', '15/07/2011 12:47:00', '15/07/2011 12:48:00', '15/07/2011 12:49:00', '15/07/2011 12:50:00', '15/07/2011 12:51:00', '15/07/2011 12:52:00', '15/07/2011 12:53:00', '15/07/2011 12:54:00', '15/07/2011 12:55:00', '15/07/2011 12:56:00', '15/07/2011 12:57:00', '15/07/2011 12:58:00', '15/07/2011 12:59:00', '15/07/2011 13:00:00', '15/07/2011 13:01:00', '15/07/2011 13:02:00', '15/07/2011 13:03:00', '15/07/2011 13:04:00', '15/07/2011 13:05:00', '15/07/2011 13:06:00', '15/07/2011 13:07:00', '15/07/2011 13:08:00', '15/07/2011 13:09:00', '15/07/2011 13:10:00', '15/07/2011 13:11:00', '15/07/2011 13:12:00', '15/07/2011 13:13:00', '15/07/2011 13:14:00', '15/07/2011 13:15:00', '15/07/2011 13:16:00', '15/07/2011 13:17:00', '15/07/2011 13:18:00', '15/07/2011 13:19:00', '15/07/2011 13:20:00', '15/07/2011 13:21:00', '15/07/2011 13:22:00', '15/07/2011 13:23:00', '15/07/2011 13:24:00', '15/07/2011 13:25:00', '15/07/2011 13:26:00', '15/07/2011 13:27:00', '15/07/2011 13:28:00', '15/07/2011 13:29:00', '15/07/2011 13:30:00', '15/07/2011 13:31:00', '15/07/2011 13:32:00', '15/07/2011 13:33:00', '15/07/2011 13:34:00', '15/07/2011 13:35:00', '15/07/2011 13:36:00', '15/07/2011 13:37:00', '15/07/2011 13:38:00', '15/07/2011 13:39:00', '15/07/2011 13:40:00', '15/07/2011 13:41:00', '15/07/2011 13:42:00', '15/07/2011 13:43:00', '15/07/2011 13:44:00', '15/07/2011 13:45:00', '15/07/2011 13:46:00', '15/07/2011 13:47:00', '15/07/2011 13:48:00', '15/07/2011 
13:49:00', '15/07/2011 13:50:00', '15/07/2011 13:51:00', '15/07/2011 13:52:00', '15/07/2011 13:53:00', '15/07/2011 13:54:00', '15/07/2011 13:55:00', '15/07/2011 13:56:00', '15/07/2011 13:57:00', '15/07/2011 13:58:00', '15/07/2011 13:59:00', '15/07/2011 14:00:00', '15/07/2011 14:01:00', '15/07/2011 14:02:00', '15/07/2011 14:03:00', '15/07/2011 14:04:00', '15/07/2011 14:05:00', '15/07/2011 14:06:00', '15/07/2011 14:07:00', '15/07/2011 14:08:00', '15/07/2011 14:09:00', '15/07/2011 14:10:00', '15/07/2011 14:11:00', '15/07/2011 14:12:00', '15/07/2011 14:13:00', '15/07/2011 14:14:00', '15/07/2011 14:15:00', '15/07/2011 14:16:00', '15/07/2011 14:17:00', '15/07/2011 14:18:00', '15/07/2011 14:19:00', '15/07/2011 14:20:00', '15/07/2011 14:21:00', '15/07/2011 14:22:00', '15/07/2011 14:23:00', '15/07/2011 14:24:00', '15/07/2011 14:25:00', '15/07/2011 14:26:00', '15/07/2011 14:27:00', '15/07/2011 14:28:00', '15/07/2011 14:29:00', '15/07/2011 14:30:00', '15/07/2011 14:31:00', '15/07/2011 14:32:00', '15/07/2011 14:33:00', '15/07/2011 14:34:00', '15/07/2011 14:35:00', '15/07/2011 14:36:00', '15/07/2011 14:37:00', '15/07/2011 14:38:00', '15/07/2011 14:39:00', '15/07/2011 14:40:00', '15/07/2011 14:41:00', '15/07/2011 14:42:00', '15/07/2011 14:43:00', '15/07/2011 14:44:00', '15/07/2011 14:45:00', '15/07/2011 14:46:00', '15/07/2011 14:47:00', '15/07/2011 14:48:00', '15/07/2011 14:49:00', '15/07/2011 14:50:00', '15/07/2011 14:51:00', '15/07/2011 14:52:00', '15/07/2011 14:53:00', '15/07/2011 14:54:00', '15/07/2011 14:55:00', '15/07/2011 14:56:00', '15/07/2011 14:57:00', '15/07/2011 14:58:00', '15/07/2011 14:59:00', '15/07/2011 15:00:00', '15/07/2011 15:01:00', '15/07/2011 15:02:00', '15/07/2011 15:03:00', '15/07/2011 15:04:00', '15/07/2011 15:05:00', '15/07/2011 15:06:00', '15/07/2011 15:07:00', '15/07/2011 15:08:00', '15/07/2011 15:09:00', '15/07/2011 15:10:00', '15/07/2011 15:11:00', '15/07/2011 15:12:00', '15/07/2011 15:13:00', '15/07/2011 15:14:00', '15/07/2011 15:15:00', 
'15/07/2011 15:16:00', '15/07/2011 15:17:00', '15/07/2011 15:18:00', '15/07/2011 15:19:00', '15/07/2011 15:20:00', '15/07/2011 15:21:00', '15/07/2011 15:22:00', '15/07/2011 15:23:00', '15/07/2011 15:24:00', '15/07/2011 15:25:00', '15/07/2011 15:26:00', '15/07/2011 15:27:00', '15/07/2011 15:28:00', '15/07/2011 15:29:00', '15/07/2011 15:30:00', '15/07/2011 15:31:00', '15/07/2011 15:32:00', '15/07/2011 15:33:00', '15/07/2011 15:34:00', '15/07/2011 15:35:00', '15/07/2011 15:36:00', '15/07/2011 15:37:00', '15/07/2011 15:38:00', '15/07/2011 15:39:00', '15/07/2011 15:40:00', '15/07/2011 15:41:00', '15/07/2011 15:42:00', '15/07/2011 15:43:00', '15/07/2011 15:44:00', '15/07/2011 15:45:00', '15/07/2011 15:46:00', '15/07/2011 15:47:00', '15/07/2011 15:48:00', '15/07/2011 15:49:00', '15/07/2011 15:50:00', '15/07/2011 15:51:00', '15/07/2011 15:52:00', '15/07/2011 15:53:00', '15/07/2011 15:54:00', '15/07/2011 15:55:00', '15/07/2011 15:56:00', '15/07/2011 15:57:00', '15/07/2011 15:58:00', '15/07/2011 15:59:00', '15/07/2011 16:00:00', '15/07/2011 16:01:00', '15/07/2011 16:02:00', '15/07/2011 16:03:00', '15/07/2011 16:04:00', '15/07/2011 16:05:00', '15/07/2011 16:06:00', '15/07/2011 16:07:00', '15/07/2011 16:08:00', '15/07/2011 16:09:00', '15/07/2011 16:10:00', '15/07/2011 16:11:00', '15/07/2011 16:12:00', '15/07/2011 16:13:00', '15/07/2011 16:14:00', '15/07/2011 16:15:00', '15/07/2011 16:16:00', '15/07/2011 16:17:00', '15/07/2011 16:18:00', '15/07/2011 16:19:00', '15/07/2011 16:20:00', '15/07/2011 16:21:00', '15/07/2011 16:22:00', '15/07/2011 16:23:00', '15/07/2011 16:24:00', '15/07/2011 16:25:00', '15/07/2011 16:26:00', '15/07/2011 16:27:00', '15/07/2011 16:28:00', '15/07/2011 16:29:00', '15/07/2011 16:30:00', '15/07/2011 16:31:00', '15/07/2011 16:32:00', '15/07/2011 16:33:00', '15/07/2011 16:34:00', '15/07/2011 16:35:00', '15/07/2011 16:36:00', '15/07/2011 16:37:00', '15/07/2011 16:38:00', '15/07/2011 16:39:00', '15/07/2011 16:40:00', '15/07/2011 16:41:00', '15/07/2011 
16:42:00', '15/07/2011 16:43:00', '15/07/2011 16:44:00', '15/07/2011 16:45:00', '15/07/2011 16:46:00', '15/07/2011 16:47:00', '15/07/2011 16:48:00', '15/07/2011 16:49:00', '15/07/2011 16:50:00', '15/07/2011 16:51:00', '15/07/2011 16:52:00', '15/07/2011 16:53:00', '15/07/2011 16:54:00', '15/07/2011 16:55:00', '15/07/2011 16:56:00', '15/07/2011 16:57:00', '15/07/2011 16:58:00', '15/07/2011 16:59:00', '15/07/2011 17:00:00', '15/07/2011 17:01:00', '15/07/2011 17:02:00', '15/07/2011 17:03:00', '15/07/2011 17:04:00', '15/07/2011 17:05:00', '15/07/2011 17:06:00', '15/07/2011 17:07:00', '15/07/2011 17:08:00', '15/07/2011 17:09:00', '15/07/2011 17:10:00', '15/07/2011 17:11:00', '15/07/2011 17:12:00', '15/07/2011 17:13:00', '15/07/2011 17:14:00', '15/07/2011 17:15:00', '15/07/2011 17:16:00', '15/07/2011 17:17:00', '15/07/2011 17:18:00', '15/07/2011 17:19:00', '15/07/2011 17:20:00', '15/07/2011 17:21:00', '15/07/2011 17:22:00', '15/07/2011 17:23:00', '15/07/2011 17:24:00', '15/07/2011 17:25:00', '15/07/2011 17:26:00', '15/07/2011 17:27:00', '15/07/2011 17:28:00', '15/07/2011 17:29:00', '15/07/2011 17:30:00', '15/07/2011 17:31:00', '15/07/2011 17:32:00', '15/07/2011 17:33:00', '15/07/2011 17:34:00', '15/07/2011 17:35:00', '15/07/2011 17:36:00', '15/07/2011 17:37:00', '15/07/2011 17:38:00', '15/07/2011 17:39:00', '15/07/2011 17:40:00', '15/07/2011 17:41:00', '15/07/2011 17:42:00', '15/07/2011 17:43:00', '15/07/2011 17:44:00', '15/07/2011 17:45:00', '15/07/2011 17:46:00', '15/07/2011 17:47:00', '15/07/2011 17:48:00', '15/07/2011 17:49:00', '15/07/2011 17:50:00', '15/07/2011 17:51:00', '15/07/2011 17:52:00', '15/07/2011 17:53:00', '15/07/2011 17:54:00', '15/07/2011 17:55:00', '15/07/2011 17:56:00', '15/07/2011 17:57:00', '15/07/2011 17:58:00', '15/07/2011 17:59:00', '15/07/2011 18:00:00', '15/07/2011 18:01:00', '15/07/2011 18:02:00', '15/07/2011 18:03:00', '15/07/2011 18:04:00', '15/07/2011 18:05:00', '15/07/2011 18:06:00', '15/07/2011 18:07:00', '15/07/2011 18:08:00', 
'15/07/2011 18:09:00', '15/07/2011 18:10:00', '15/07/2011 18:11:00', '15/07/2011 18:12:00', '15/07/2011 18:13:00', '15/07/2011 18:14:00', '15/07/2011 18:15:00', '15/07/2011 18:16:00', '15/07/2011 18:17:00', '15/07/2011 18:18:00', '15/07/2011 18:19:00', '15/07/2011 18:20:00', '15/07/2011 18:21:00', '15/07/2011 18:22:00', '15/07/2011 18:23:00', '15/07/2011 18:24:00', '15/07/2011 18:25:00', '15/07/2011 18:26:00', '15/07/2011 18:27:00', '15/07/2011 18:28:00', '15/07/2011 18:29:00', '15/07/2011 18:30:00', '15/07/2011 18:31:00', '15/07/2011 18:32:00', '15/07/2011 18:33:00', '15/07/2011 18:34:00', '15/07/2011 18:35:00', '15/07/2011 18:36:00', '15/07/2011 18:37:00', '15/07/2011 18:38:00', '15/07/2011 18:39:00', '15/07/2011 18:40:00', '15/07/2011 18:41:00', '15/07/2011 18:42:00', '15/07/2011 18:43:00', '15/07/2011 18:44:00', '15/07/2011 18:45:00', '15/07/2011 18:46:00', '15/07/2011 18:47:00', '15/07/2011 18:48:00', '15/07/2011 18:49:00', '15/07/2011 18:50:00', '15/07/2011 18:51:00', '15/07/2011 18:52:00', '15/07/2011 18:53:00', '15/07/2011 18:54:00', '15/07/2011 18:55:00', '15/07/2011 18:56:00', '15/07/2011 18:57:00', '15/07/2011 18:58:00', '15/07/2011 18:59:00', '15/07/2011 19:00:00', '15/07/2011 19:01:00', '15/07/2011 19:02:00', '15/07/2011 19:03:00', '15/07/2011 19:04:00', '15/07/2011 19:05:00', '15/07/2011 19:06:00', '15/07/2011 19:07:00', '15/07/2011 19:08:00', '15/07/2011 19:09:00', '15/07/2011 19:10:00', '15/07/2011 19:11:00', '15/07/2011 19:12:00', '15/07/2011 19:13:00', '15/07/2011 19:14:00', '15/07/2011 19:15:00', '15/07/2011 19:16:00', '15/07/2011 19:17:00', '15/07/2011 19:18:00', '15/07/2011 19:19:00', '15/07/2011 19:20:00', '15/07/2011 19:21:00', '15/07/2011 19:22:00', '15/07/2011 19:23:00', '15/07/2011 19:24:00', '15/07/2011 19:25:00', '15/07/2011 19:26:00', '15/07/2011 19:27:00', '15/07/2011 19:28:00', '15/07/2011 19:29:00', '15/07/2011 19:30:00', '15/07/2011 19:31:00', '15/07/2011 19:32:00', '15/07/2011 19:33:00', '15/07/2011 19:34:00', '15/07/2011 
19:35:00', '15/07/2011 19:36:00', '15/07/2011 19:37:00', '15/07/2011 19:38:00', '15/07/2011 19:39:00', '15/07/2011 19:40:00', '15/07/2011 19:41:00', '15/07/2011 19:42:00', '15/07/2011 19:43:00', '15/07/2011 19:44:00', '15/07/2011 19:45:00', '15/07/2011 19:46:00', '15/07/2011 19:47:00', '15/07/2011 19:48:00', '15/07/2011 19:49:00', '15/07/2011 19:50:00', '15/07/2011 19:51:00', '15/07/2011 19:52:00', '15/07/2011 19:53:00', '15/07/2011 19:54:00', '15/07/2011 19:55:00', '15/07/2011 19:56:00', '15/07/2011 19:57:00', '15/07/2011 19:58:00', '15/07/2011 19:59:00', '15/07/2011 20:00:00', '15/07/2011 20:01:00', '15/07/2011 20:02:00', '15/07/2011 20:03:00', '15/07/2011 20:04:00', '15/07/2011 20:05:00', '15/07/2011 20:06:00', '15/07/2011 20:07:00', '15/07/2011 20:08:00', '15/07/2011 20:09:00', '15/07/2011 20:10:00', '15/07/2011 20:11:00', '15/07/2011 20:12:00', '15/07/2011 20:13:00', '15/07/2011 20:14:00', '15/07/2011 20:15:00', '15/07/2011 20:16:00', '15/07/2011 20:17:00', '15/07/2011 20:18:00', '15/07/2011 20:19:00', '15/07/2011 20:20:00', '15/07/2011 20:21:00', '15/07/2011 20:22:00', '15/07/2011 20:23:00', '15/07/2011 20:24:00', '15/07/2011 20:25:00', '15/07/2011 20:26:00', '15/07/2011 20:27:00', '15/07/2011 20:28:00', '15/07/2011 20:29:00', '15/07/2011 20:30:00', '15/07/2011 20:31:00', '15/07/2011 20:32:00', '15/07/2011 20:33:00', '15/07/2011 20:34:00', '15/07/2011 20:35:00', '15/07/2011 20:36:00', '15/07/2011 20:37:00', '15/07/2011 20:38:00', '15/07/2011 20:39:00', '15/07/2011 20:40:00', '15/07/2011 20:41:00', '15/07/2011 20:42:00', '15/07/2011 20:43:00', '15/07/2011 20:44:00', '15/07/2011 20:45:00', '15/07/2011 20:46:00', '15/07/2011 20:47:00', '15/07/2011 20:48:00', '15/07/2011 20:49:00', '15/07/2011 20:50:00', '15/07/2011 20:51:00', '15/07/2011 20:52:00', '15/07/2011 20:53:00', '15/07/2011 20:54:00', '15/07/2011 20:55:00', '15/07/2011 20:56:00', '15/07/2011 20:57:00', '15/07/2011 20:58:00', '15/07/2011 20:59:00', '15/07/2011 21:00:00', '15/07/2011 21:01:00', 
'15/07/2011 21:02:00', '15/07/2011 21:03:00', '15/07/2011 21:04:00', '15/07/2011 21:05:00', '15/07/2011 21:06:00', '15/07/2011 21:07:00', '15/07/2011 21:08:00', '15/07/2011 21:09:00', '15/07/2011 21:10:00', '15/07/2011 21:11:00', '15/07/2011 21:12:00', '15/07/2011 21:13:00', '15/07/2011 21:14:00', '15/07/2011 21:15:00', '15/07/2011 21:16:00', '15/07/2011 21:17:00', '15/07/2011 21:18:00', '15/07/2011 21:19:00', '15/07/2011 21:20:00', '15/07/2011 21:21:00', '15/07/2011 21:22:00', '15/07/2011 21:23:00', '15/07/2011 21:24:00', '15/07/2011 21:25:00', '15/07/2011 21:26:00', '15/07/2011 21:27:00', '15/07/2011 21:28:00', '15/07/2011 21:29:00', '15/07/2011 21:30:00', '15/07/2011 21:31:00', '15/07/2011 21:32:00', '15/07/2011 21:33:00', '15/07/2011 21:34:00', '15/07/2011 21:35:00', '15/07/2011 21:36:00', '15/07/2011 21:37:00', '15/07/2011 21:38:00', '15/07/2011 21:39:00', '15/07/2011 21:40:00', '15/07/2011 21:41:00', '15/07/2011 21:42:00', '15/07/2011 21:43:00', '15/07/2011 21:44:00', '15/07/2011 21:45:00', '15/07/2011 21:46:00', '15/07/2011 21:47:00', '15/07/2011 21:48:00', '15/07/2011 21:49:00', '15/07/2011 21:50:00', '15/07/2011 21:51:00', '15/07/2011 21:52:00', '15/07/2011 21:53:00', '15/07/2011 21:54:00', '15/07/2011 21:55:00', '15/07/2011 21:56:00', '15/07/2011 21:57:00', '15/07/2011 21:58:00', '15/07/2011 21:59:00', '15/07/2011 22:00:00', '15/07/2011 22:01:00', '15/07/2011 22:02:00', '15/07/2011 22:03:00', '15/07/2011 22:04:00', '15/07/2011 22:05:00', '15/07/2011 22:06:00', '15/07/2011 22:07:00', '15/07/2011 22:08:00', '15/07/2011 22:09:00', '15/07/2011 22:10:00', '15/07/2011 22:11:00', '15/07/2011 22:12:00', '15/07/2011 22:13:00', '15/07/2011 22:14:00', '15/07/2011 22:15:00', '15/07/2011 22:16:00', '15/07/2011 22:17:00', '15/07/2011 22:18:00', '15/07/2011 22:19:00', '15/07/2011 22:20:00', '15/07/2011 22:21:00', '15/07/2011 22:22:00', '15/07/2011 22:23:00', '15/07/2011 22:24:00', '15/07/2011 22:25:00', '15/07/2011 22:26:00', '15/07/2011 22:27:00', '15/07/2011 
22:28:00', '15/07/2011 22:29:00', '15/07/2011 22:30:00', '15/07/2011 22:31:00', '15/07/2011 22:32:00', '15/07/2011 22:33:00', '15/07/2011 22:34:00', '15/07/2011 22:35:00', '15/07/2011 22:36:00', '15/07/2011 22:37:00', '15/07/2011 22:38:00', '15/07/2011 22:39:00', '15/07/2011 22:40:00', '15/07/2011 22:41:00', '15/07/2011 22:42:00', '15/07/2011 22:43:00', '15/07/2011 22:44:00', '15/07/2011 22:45:00', '15/07/2011 22:46:00', '15/07/2011 22:47:00', '15/07/2011 22:48:00', '15/07/2011 22:49:00', '15/07/2011 22:50:00', '15/07/2011 22:51:00', '15/07/2011 22:52:00', '15/07/2011 22:53:00', '15/07/2011 22:54:00', '15/07/2011 22:55:00', '15/07/2011 22:56:00', '15/07/2011 22:57:00', '15/07/2011 22:58:00', '15/07/2011 22:59:00', '15/07/2011 23:00:00', '15/07/2011 23:01:00', '15/07/2011 23:02:00', '15/07/2011 23:03:00', '15/07/2011 23:04:00', '15/07/2011 23:05:00', '15/07/2011 23:06:00', '15/07/2011 23:07:00', '15/07/2011 23:08:00', '15/07/2011 23:09:00', '15/07/2011 23:10:00', '15/07/2011 23:11:00', '15/07/2011 23:12:00', '15/07/2011 23:13:00', '15/07/2011 23:14:00', '15/07/2011 23:15:00', '15/07/2011 23:16:00', '15/07/2011 23:17:00', '15/07/2011 23:18:00', '15/07/2011 23:19:00', '15/07/2011 23:20:00', '15/07/2011 23:21:00', '15/07/2011 23:22:00', '15/07/2011 23:23:00', '15/07/2011 23:24:00', '15/07/2011 23:25:00', '15/07/2011 23:26:00', '15/07/2011 23:27:00', '15/07/2011 23:28:00', '15/07/2011 23:29:00', '15/07/2011 23:30:00', '15/07/2011 23:31:00', '15/07/2011 23:32:00', '15/07/2011 23:33:00', '15/07/2011 23:34:00', '15/07/2011 23:35:00', '15/07/2011 23:36:00', '15/07/2011 23:37:00', '15/07/2011 23:38:00', '15/07/2011 23:39:00', '15/07/2011 23:40:00', '15/07/2011 23:41:00', '15/07/2011 23:42:00', '15/07/2011 23:43:00', '15/07/2011 23:44:00', '15/07/2011 23:45:00', '15/07/2011 23:46:00', '15/07/2011 23:47:00', '15/07/2011 23:48:00', '15/07/2011 23:49:00', '15/07/2011 23:50:00', '15/07/2011 23:51:00', '15/07/2011 23:52:00', '15/07/2011 23:53:00', '15/07/2011 23:54:00', 
'15/07/2011 23:55:00', '15/07/2011 23:56:00', '15/07/2011 23:57:00', '15/07/2011 23:58:00', '15/07/2011 23:59:00', '16/07/2011 0:00:00', '16/07/2011 0:01:00', '16/07/2011 0:02:00', '16/07/2011 0:03:00', '16/07/2011 0:04:00', '16/07/2011 0:05:00', '16/07/2011 0:06:00', '16/07/2011 0:07:00', '16/07/2011 0:08:00', '16/07/2011 0:09:00', '16/07/2011 0:10:00', '16/07/2011 0:11:00', '16/07/2011 0:12:00', '16/07/2011 0:13:00', '16/07/2011 0:14:00', '16/07/2011 0:15:00', '16/07/2011 0:16:00', '16/07/2011 0:17:00', '16/07/2011 0:18:00', '16/07/2011 0:19:00', '16/07/2011 0:20:00', '16/07/2011 0:21:00', '16/07/2011 0:22:00', '16/07/2011 0:23:00', '16/07/2011 0:24:00', '16/07/2011 0:25:00', '16/07/2011 0:26:00', '16/07/2011 0:27:00', '16/07/2011 0:28:00', '16/07/2011 0:29:00', '16/07/2011 0:30:00', '16/07/2011 0:31:00', '16/07/2011 0:32:00', '16/07/2011 0:33:00', '16/07/2011 0:34:00', '16/07/2011 0:35:00', '16/07/2011 0:36:00', '16/07/2011 0:37:00', '16/07/2011 0:38:00', '16/07/2011 0:39:00', '16/07/2011 0:40:00', '16/07/2011 0:41:00', '16/07/2011 0:42:00', '16/07/2011 0:43:00', '16/07/2011 0:44:00', '16/07/2011 0:45:00', '16/07/2011 0:46:00', '16/07/2011 0:47:00', '16/07/2011 0:48:00', '16/07/2011 0:49:00', '16/07/2011 0:50:00', '16/07/2011 0:51:00', '16/07/2011 0:52:00', '16/07/2011 0:53:00', '16/07/2011 0:54:00', '16/07/2011 0:55:00', '16/07/2011 0:56:00', '16/07/2011 0:57:00', '16/07/2011 0:58:00', '16/07/2011 0:59:00', '16/07/2011 1:00:00', '16/07/2011 1:01:00', '16/07/2011 1:02:00', '16/07/2011 1:03:00', '16/07/2011 1:04:00', '16/07/2011 1:05:00', '16/07/2011 1:06:00', '16/07/2011 1:07:00', '16/07/2011 1:08:00', '16/07/2011 1:09:00', '16/07/2011 1:10:00', '16/07/2011 1:11:00', '16/07/2011 1:12:00', '16/07/2011 1:13:00', '16/07/2011 1:14:00', '16/07/2011 1:15:00', '16/07/2011 1:16:00', '16/07/2011 1:17:00', '16/07/2011 1:18:00', '16/07/2011 1:19:00', '16/07/2011 1:20:00', '16/07/2011 1:21:00', '16/07/2011 1:22:00', '16/07/2011 1:23:00', '16/07/2011 1:24:00', '16/07/2011 
1:25:00', '16/07/2011 1:26:00', '16/07/2011 1:27:00', '16/07/2011 1:28:00', '16/07/2011 1:29:00', '16/07/2011 1:30:00', '16/07/2011 1:31:00', '16/07/2011 1:32:00', '16/07/2011 1:33:00', '16/07/2011 1:34:00', '16/07/2011 1:35:00', '16/07/2011 1:36:00', '16/07/2011 1:37:00', '16/07/2011 1:38:00', '16/07/2011 1:39:00', '16/07/2011 1:40:00', '16/07/2011 1:41:00', '16/07/2011 1:42:00', '16/07/2011 1:43:00', '16/07/2011 1:44:00', '16/07/2011 1:45:00', '16/07/2011 1:46:00', '16/07/2011 1:47:00', '16/07/2011 1:48:00', '16/07/2011 1:49:00', '16/07/2011 1:50:00', '16/07/2011 1:51:00', '16/07/2011 1:52:00', '16/07/2011 1:53:00', '16/07/2011 1:54:00', '16/07/2011 1:55:00', '16/07/2011 1:56:00', '16/07/2011 1:57:00', '16/07/2011 1:58:00', '16/07/2011 1:59:00', '16/07/2011 2:00:00', '16/07/2011 2:01:00', '16/07/2011 2:02:00', '16/07/2011 2:03:00', '16/07/2011 2:04:00', '16/07/2011 2:05:00', '16/07/2011 2:06:00', '16/07/2011 2:07:00', '16/07/2011 2:08:00', '16/07/2011 2:09:00', '16/07/2011 2:10:00', '16/07/2011 2:11:00', '16/07/2011 2:12:00', '16/07/2011 2:13:00', '16/07/2011 2:14:00', '16/07/2011 2:15:00', '16/07/2011 2:16:00', '16/07/2011 2:17:00', '16/07/2011 2:18:00', '16/07/2011 2:19:00', '16/07/2011 2:20:00', '16/07/2011 2:21:00', '16/07/2011 2:22:00', '16/07/2011 2:23:00', '16/07/2011 2:24:00', '16/07/2011 2:25:00', '16/07/2011 2:26:00', '16/07/2011 2:27:00', '16/07/2011 2:28:00', '16/07/2011 2:29:00', '16/07/2011 2:30:00', '16/07/2011 2:31:00', '16/07/2011 2:32:00', '16/07/2011 2:33:00', '16/07/2011 2:34:00', '16/07/2011 2:35:00', '16/07/2011 2:36:00', '16/07/2011 2:37:00', '16/07/2011 2:38:00', '16/07/2011 2:39:00', '16/07/2011 2:40:00', '16/07/2011 2:41:00', '16/07/2011 2:42:00', '16/07/2011 2:43:00', '16/07/2011 2:44:00', '16/07/2011 2:45:00', '16/07/2011 2:46:00', '16/07/2011 2:47:00', '16/07/2011 2:48:00', '16/07/2011 2:49:00', '16/07/2011 2:50:00', '16/07/2011 2:51:00', '16/07/2011 2:52:00', '16/07/2011 2:53:00', '16/07/2011 2:54:00', '16/07/2011 2:55:00', 
'16/07/2011 2:56:00', '16/07/2011 2:57:00', '16/07/2011 2:58:00', '16/07/2011 2:59:00', '16/07/2011 3:00:00', '16/07/2011 3:01:00', '16/07/2011 3:02:00', '16/07/2011 3:03:00', '16/07/2011 3:04:00', '16/07/2011 3:05:00', '16/07/2011 3:06:00', '16/07/2011 3:07:00', '16/07/2011 3:08:00', '16/07/2011 3:09:00', '16/07/2011 3:10:00', '16/07/2011 3:11:00', '16/07/2011 3:12:00', '16/07/2011 3:13:00', '16/07/2011 3:14:00', '16/07/2011 3:15:00', '16/07/2011 3:16:00', '16/07/2011 3:17:00', '16/07/2011 3:18:00', '16/07/2011 3:19:00', '16/07/2011 3:20:00', '16/07/2011 3:21:00', '16/07/2011 3:22:00', '16/07/2011 3:23:00', '16/07/2011 3:24:00', '16/07/2011 3:25:00', '16/07/2011 3:26:00', '16/07/2011 3:27:00', '16/07/2011 3:28:00', '16/07/2011 3:29:00', '16/07/2011 3:30:00', '16/07/2011 3:31:00', '16/07/2011 3:32:00', '16/07/2011 3:33:00', '16/07/2011 3:34:00', '16/07/2011 3:35:00', '16/07/2011 3:36:00', '16/07/2011 3:37:00', '16/07/2011 3:38:00', '16/07/2011 3:39:00', '16/07/2011 3:40:00', '16/07/2011 3:41:00', '16/07/2011 3:42:00', '16/07/2011 3:43:00', '16/07/2011 3:44:00', '16/07/2011 3:45:00', '16/07/2011 3:46:00', '16/07/2011 3:47:00', '16/07/2011 3:48:00', '16/07/2011 3:49:00', '16/07/2011 3:50:00', '16/07/2011 3:51:00', '16/07/2011 3:52:00', '16/07/2011 3:53:00', '16/07/2011 3:54:00', '16/07/2011 3:55:00', '16/07/2011 3:56:00', '16/07/2011 3:57:00', '16/07/2011 3:58:00', '16/07/2011 3:59:00', '16/07/2011 4:00:00', '16/07/2011 4:01:00', '16/07/2011 4:02:00', '16/07/2011 4:03:00', '16/07/2011 4:04:00', '16/07/2011 4:05:00', '16/07/2011 4:06:00', '16/07/2011 4:07:00', '16/07/2011 4:08:00', '16/07/2011 4:09:00', '16/07/2011 4:10:00', '16/07/2011 4:11:00', '16/07/2011 4:12:00', '16/07/2011 4:13:00', '16/07/2011 4:14:00', '16/07/2011 4:15:00', '16/07/2011 4:16:00', '16/07/2011 4:17:00', '16/07/2011 4:18:00', '16/07/2011 4:19:00', '16/07/2011 4:20:00', '16/07/2011 4:21:00', '16/07/2011 4:22:00', '16/07/2011 4:23:00', '16/07/2011 4:24:00', '16/07/2011 4:25:00', '16/07/2011 
4:26:00', '16/07/2011 4:27:00', '16/07/2011 4:28:00', '16/07/2011 4:29:00', '16/07/2011 4:30:00', '16/07/2011 4:31:00', '16/07/2011 4:32:00', '16/07/2011 4:33:00', '16/07/2011 4:34:00', '16/07/2011 4:35:00', '16/07/2011 4:36:00', '16/07/2011 4:37:00', '16/07/2011 4:38:00', '16/07/2011 4:39:00', '16/07/2011 4:40:00', '16/07/2011 4:41:00', '16/07/2011 4:42:00', '16/07/2011 4:43:00', '16/07/2011 4:44:00', '16/07/2011 4:45:00', '16/07/2011 4:46:00', '16/07/2011 4:47:00', '16/07/2011 4:48:00', '16/07/2011 4:49:00', '16/07/2011 4:50:00', '16/07/2011 4:51:00', '16/07/2011 4:52:00', '16/07/2011 4:53:00', '16/07/2011 4:54:00', '16/07/2011 4:55:00', '16/07/2011 4:56:00', '16/07/2011 4:57:00', '16/07/2011 4:58:00', '16/07/2011 4:59:00', '16/07/2011 5:00:00', '16/07/2011 5:01:00', '16/07/2011 5:02:00', '16/07/2011 5:03:00', '16/07/2011 5:04:00', '16/07/2011 5:05:00', '16/07/2011 5:06:00', '16/07/2011 5:07:00', '16/07/2011 5:08:00', '16/07/2011 5:09:00', '16/07/2011 5:10:00', '16/07/2011 5:11:00', '16/07/2011 5:12:00', '16/07/2011 5:13:00', '16/07/2011 5:14:00', '16/07/2011 5:15:00', '16/07/2011 5:16:00', '16/07/2011 5:17:00', '16/07/2011 5:18:00', '16/07/2011 5:19:00', '16/07/2011 5:20:00', '16/07/2011 5:21:00', '16/07/2011 5:22:00', '16/07/2011 5:23:00', '16/07/2011 5:24:00', '16/07/2011 5:25:00', '16/07/2011 5:26:00', '16/07/2011 5:27:00', '16/07/2011 5:28:00', '16/07/2011 5:29:00', '16/07/2011 5:30:00', '16/07/2011 5:31:00', '16/07/2011 5:32:00', '16/07/2011 5:33:00', '16/07/2011 5:34:00', '16/07/2011 5:35:00', '16/07/2011 5:36:00', '16/07/2011 5:37:00', '16/07/2011 5:38:00', '16/07/2011 5:39:00', '16/07/2011 5:40:00', '16/07/2011 5:41:00', '16/07/2011 5:42:00', '16/07/2011 5:43:00', '16/07/2011 5:44:00', '16/07/2011 5:45:00', '16/07/2011 5:46:00', '16/07/2011 5:47:00', '16/07/2011 5:48:00', '16/07/2011 5:49:00', '16/07/2011 5:50:00', '16/07/2011 5:51:00', '16/07/2011 5:52:00', '16/07/2011 5:53:00', '16/07/2011 5:54:00', '16/07/2011 5:55:00', '16/07/2011 5:56:00', 
'16/07/2011 5:57:00', '16/07/2011 5:58:00', '16/07/2011 5:59:00', '16/07/2011 6:00:00', '16/07/2011 6:01:00', '16/07/2011 6:02:00', '16/07/2011 6:03:00', '16/07/2011 6:04:00', '16/07/2011 6:05:00', '16/07/2011 6:06:00', '16/07/2011 6:07:00', '16/07/2011 6:08:00', '16/07/2011 6:09:00', '16/07/2011 6:10:00', '16/07/2011 6:11:00', '16/07/2011 6:12:00', '16/07/2011 6:13:00', '16/07/2011 6:14:00', '16/07/2011 6:15:00', '16/07/2011 6:16:00', '16/07/2011 6:17:00', '16/07/2011 6:18:00', '16/07/2011 6:19:00', '16/07/2011 6:20:00', '16/07/2011 6:21:00', '16/07/2011 6:22:00', '16/07/2011 6:23:00', '16/07/2011 6:24:00', '16/07/2011 6:25:00', '16/07/2011 6:26:00', '16/07/2011 6:27:00', '16/07/2011 6:28:00', '16/07/2011 6:29:00', '16/07/2011 6:30:00', '16/07/2011 6:31:00', '16/07/2011 6:32:00', '16/07/2011 6:33:00', '16/07/2011 6:34:00', '16/07/2011 6:35:00', '16/07/2011 6:36:00', '16/07/2011 6:37:00', '16/07/2011 6:38:00', '16/07/2011 6:39:00', '16/07/2011 6:40:00', '16/07/2011 6:41:00', '16/07/2011 6:42:00', '16/07/2011 6:43:00', '16/07/2011 6:44:00', '16/07/2011 6:45:00', '16/07/2011 6:46:00', '16/07/2011 6:47:00', '16/07/2011 6:48:00', '16/07/2011 6:49:00', '16/07/2011 6:50:00', '16/07/2011 6:51:00', '16/07/2011 6:52:00', '16/07/2011 6:53:00', '16/07/2011 6:54:00', '16/07/2011 6:55:00', '16/07/2011 6:56:00', '16/07/2011 6:57:00', '16/07/2011 6:58:00', '16/07/2011 6:59:00', '16/07/2011 7:00:00', '16/07/2011 7:01:00', '16/07/2011 7:02:00', '16/07/2011 7:03:00', '16/07/2011 7:04:00', '16/07/2011 7:05:00', '16/07/2011 7:06:00', '16/07/2011 7:07:00', '16/07/2011 7:08:00', '16/07/2011 7:09:00', '16/07/2011 7:10:00', '16/07/2011 7:11:00', '16/07/2011 7:12:00', '16/07/2011 7:13:00', '16/07/2011 7:14:00', '16/07/2011 7:15:00', '16/07/2011 7:16:00', '16/07/2011 7:17:00', '16/07/2011 7:18:00', '16/07/2011 7:19:00', '16/07/2011 7:20:00', '16/07/2011 7:21:00', '16/07/2011 7:22:00', '16/07/2011 7:23:00', '16/07/2011 7:24:00', '16/07/2011 7:25:00', '16/07/2011 7:26:00', '16/07/2011 
7:27:00', '16/07/2011 7:28:00', '16/07/2011 7:29:00', '16/07/2011 7:30:00', '16/07/2011 7:31:00', '16/07/2011 7:32:00', '16/07/2011 7:33:00', '16/07/2011 7:34:00', '16/07/2011 7:35:00', '16/07/2011 7:36:00', '16/07/2011 7:37:00', '16/07/2011 7:38:00', '16/07/2011 7:39:00', '16/07/2011 7:40:00', '16/07/2011 7:41:00', '16/07/2011 7:42:00', '16/07/2011 7:43:00', '16/07/2011 7:44:00', '16/07/2011 7:45:00', '16/07/2011 7:46:00', '16/07/2011 7:47:00', '16/07/2011 7:48:00', '16/07/2011 7:49:00', '16/07/2011 7:50:00', '16/07/2011 7:51:00', '16/07/2011 7:52:00', '16/07/2011 7:53:00', '16/07/2011 7:54:00', '16/07/2011 7:55:00', '16/07/2011 7:56:00', '16/07/2011 7:57:00', '16/07/2011 7:58:00', '16/07/2011 7:59:00', '16/07/2011 8:00:00', '16/07/2011 8:01:00', '16/07/2011 8:02:00', '16/07/2011 8:03:00', '16/07/2011 8:04:00', '16/07/2011 8:05:00', '16/07/2011 8:06:00', '16/07/2011 8:07:00', '16/07/2011 8:08:00', '16/07/2011 8:09:00', '16/07/2011 8:10:00', '16/07/2011 8:11:00', '16/07/2011 8:12:00', '16/07/2011 8:13:00', '16/07/2011 8:14:00', '16/07/2011 8:15:00', '16/07/2011 8:16:00', '16/07/2011 8:17:00', '16/07/2011 8:18:00', '16/07/2011 8:19:00', '16/07/2011 8:20:00', '16/07/2011 8:21:00', '16/07/2011 8:22:00', '16/07/2011 8:23:00', '16/07/2011 8:24:00', '16/07/2011 8:25:00', '16/07/2011 8:26:00', '16/07/2011 8:27:00', '16/07/2011 8:28:00', '16/07/2011 8:29:00', '16/07/2011 8:30:00', '16/07/2011 8:31:00', '16/07/2011 8:32:00', '16/07/2011 8:33:00', '16/07/2011 8:34:00', '16/07/2011 8:35:00', '16/07/2011 8:36:00', '16/07/2011 8:37:00', '16/07/2011 8:38:00', '16/07/2011 8:39:00', '16/07/2011 8:40:00', '16/07/2011 8:41:00', '16/07/2011 8:42:00', '16/07/2011 8:43:00', '16/07/2011 8:44:00', '16/07/2011 8:45:00', '16/07/2011 8:46:00', '16/07/2011 8:47:00', '16/07/2011 8:48:00', '16/07/2011 8:49:00', '16/07/2011 8:50:00', '16/07/2011 8:51:00', '16/07/2011 8:52:00', '16/07/2011 8:53:00', '16/07/2011 8:54:00', '16/07/2011 8:55:00', '16/07/2011 8:56:00', '16/07/2011 8:57:00', 
'16/07/2011 8:58:00', '16/07/2011 8:59:00', '16/07/2011 9:00:00', '16/07/2011 9:01:00', '16/07/2011 9:02:00', '16/07/2011 9:03:00', '16/07/2011 9:04:00', '16/07/2011 9:05:00', '16/07/2011 9:06:00', '16/07/2011 9:07:00', '16/07/2011 9:08:00', '16/07/2011 9:09:00', '16/07/2011 9:10:00', '16/07/2011 9:11:00', '16/07/2011 9:12:00', '16/07/2011 9:13:00', '16/07/2011 9:14:00', '16/07/2011 9:15:00', '16/07/2011 9:16:00', '16/07/2011 9:17:00', '16/07/2011 9:18:00', '16/07/2011 9:19:00', '16/07/2011 9:20:00', '16/07/2011 9:21:00', '16/07/2011 9:22:00', '16/07/2011 9:23:00', '16/07/2011 9:24:00', '16/07/2011 9:25:00', '16/07/2011 9:26:00', '16/07/2011 9:27:00', '16/07/2011 9:28:00', '16/07/2011 9:29:00', '16/07/2011 9:30:00', '16/07/2011 9:31:00', '16/07/2011 9:32:00', '16/07/2011 9:33:00', '16/07/2011 9:34:00', '16/07/2011 9:35:00', '16/07/2011 9:36:00', '16/07/2011 9:37:00', '16/07/2011 9:38:00', '16/07/2011 9:39:00', '16/07/2011 9:40:00', '16/07/2011 9:41:00', '16/07/2011 9:42:00', '16/07/2011 9:43:00', '16/07/2011 9:44:00', '16/07/2011 9:45:00', '16/07/2011 9:46:00', '16/07/2011 9:47:00', '16/07/2011 9:48:00', '16/07/2011 9:49:00', '16/07/2011 9:50:00', '16/07/2011 9:51:00', '16/07/2011 9:52:00', '16/07/2011 9:53:00', '16/07/2011 9:54:00', '16/07/2011 9:55:00', '16/07/2011 9:56:00', '16/07/2011 9:57:00', '16/07/2011 9:58:00', '16/07/2011 9:59:00', '16/07/2011 10:00:00', '16/07/2011 10:01:00', '16/07/2011 10:02:00', '16/07/2011 10:03:00', '16/07/2011 10:04:00', '16/07/2011 10:05:00', '16/07/2011 10:06:00', '16/07/2011 10:07:00', '16/07/2011 10:08:00', '16/07/2011 10:09:00', '16/07/2011 10:10:00', '16/07/2011 10:11:00', '16/07/2011 10:12:00', '16/07/2011 10:13:00', '16/07/2011 10:14:00', '16/07/2011 10:15:00', '16/07/2011 10:16:00', '16/07/2011 10:17:00', '16/07/2011 10:18:00', '16/07/2011 10:19:00', '16/07/2011 10:20:00', '16/07/2011 10:21:00', '16/07/2011 10:22:00', '16/07/2011 10:23:00', '16/07/2011 10:24:00', '16/07/2011 10:25:00', '16/07/2011 10:26:00', '16/07/2011 
10:27:00', '16/07/2011 10:28:00', '16/07/2011 10:29:00', '16/07/2011 10:30:00', '16/07/2011 10:31:00', '16/07/2011 10:32:00', '16/07/2011 10:33:00', '16/07/2011 10:34:00', '16/07/2011 10:35:00', '16/07/2011 10:36:00', '16/07/2011 10:37:00', '16/07/2011 10:38:00', '16/07/2011 10:39:00', '16/07/2011 10:40:00', '16/07/2011 10:41:00', '16/07/2011 10:42:00', '16/07/2011 10:43:00', '16/07/2011 10:44:00', '16/07/2011 10:45:00', '16/07/2011 10:46:00', '16/07/2011 10:47:00', '16/07/2011 10:48:00', '16/07/2011 10:49:00', '16/07/2011 10:50:00', '16/07/2011 10:51:00', '16/07/2011 10:52:00', '16/07/2011 10:53:00', '16/07/2011 10:54:00', '16/07/2011 10:55:00', '16/07/2011 10:56:00', '16/07/2011 10:57:00', '16/07/2011 10:58:00', '16/07/2011 10:59:00', '16/07/2011 11:00:00', '16/07/2011 11:01:00', '16/07/2011 11:02:00', '16/07/2011 11:03:00', '16/07/2011 11:04:00', '16/07/2011 11:05:00', '16/07/2011 11:06:00', '16/07/2011 11:07:00', '16/07/2011 11:08:00', '16/07/2011 11:09:00', '16/07/2011 11:10:00', '16/07/2011 11:11:00', '16/07/2011 11:12:00', '16/07/2011 11:13:00', '16/07/2011 11:14:00', '16/07/2011 11:15:00', '16/07/2011 11:16:00', '16/07/2011 11:17:00', '16/07/2011 11:18:00', '16/07/2011 11:19:00', '16/07/2011 11:20:00', '16/07/2011 11:21:00', '16/07/2011 11:22:00', '16/07/2011 11:23:00', '16/07/2011 11:24:00', '16/07/2011 11:25:00', '16/07/2011 11:26:00', '16/07/2011 11:27:00', '16/07/2011 11:28:00', '16/07/2011 11:29:00', '16/07/2011 11:30:00', '16/07/2011 11:31:00', '16/07/2011 11:32:00', '16/07/2011 11:33:00', '16/07/2011 11:34:00', '16/07/2011 11:35:00', '16/07/2011 11:36:00', '16/07/2011 11:37:00', '16/07/2011 11:38:00', '16/07/2011 11:39:00', '16/07/2011 11:40:00', '16/07/2011 11:41:00', '16/07/2011 11:42:00', '16/07/2011 11:43:00', '16/07/2011 11:44:00', '16/07/2011 11:45:00', '16/07/2011 11:46:00', '16/07/2011 11:47:00', '16/07/2011 11:48:00', '16/07/2011 11:49:00', '16/07/2011 11:50:00', '16/07/2011 11:51:00', '16/07/2011 11:52:00', '16/07/2011 11:53:00', 
'16/07/2011 11:54:00', '16/07/2011 11:55:00', '16/07/2011 11:56:00', '16/07/2011 11:57:00', '16/07/2011 11:58:00', '16/07/2011 11:59:00', '16/07/2011 12:00:00', '16/07/2011 12:01:00', '16/07/2011 12:02:00', '16/07/2011 12:03:00', '16/07/2011 12:04:00', '16/07/2011 12:05:00', '16/07/2011 12:06:00', '16/07/2011 12:07:00', '16/07/2011 12:08:00', '16/07/2011 12:09:00', '16/07/2011 12:10:00', '16/07/2011 12:11:00', '16/07/2011 12:12:00', '16/07/2011 12:13:00', '16/07/2011 12:14:00', '16/07/2011 12:15:00', '16/07/2011 12:16:00', '16/07/2011 12:17:00', '16/07/2011 12:18:00', '16/07/2011 12:19:00', '16/07/2011 12:20:00', '16/07/2011 12:21:00', '16/07/2011 12:22:00', '16/07/2011 12:23:00', '16/07/2011 12:24:00', '16/07/2011 12:25:00', '16/07/2011 12:26:00', '16/07/2011 12:27:00', '16/07/2011 12:28:00', '16/07/2011 12:29:00', '16/07/2011 12:30:00', '16/07/2011 12:31:00', '16/07/2011 12:32:00', '16/07/2011 12:33:00', '16/07/2011 12:34:00', '16/07/2011 12:35:00', '16/07/2011 12:36:00', '16/07/2011 12:37:00', '16/07/2011 12:38:00', '16/07/2011 12:39:00', '16/07/2011 12:40:00', '16/07/2011 12:41:00', '16/07/2011 12:42:00', '16/07/2011 12:43:00', '16/07/2011 12:44:00', '16/07/2011 12:45:00', '16/07/2011 12:46:00', '16/07/2011 12:47:00', '16/07/2011 12:48:00', '16/07/2011 12:49:00', '16/07/2011 12:50:00', '16/07/2011 12:51:00', '16/07/2011 12:52:00', '16/07/2011 12:53:00', '16/07/2011 12:54:00', '16/07/2011 12:55:00', '16/07/2011 12:56:00', '16/07/2011 12:57:00', '16/07/2011 12:58:00', '16/07/2011 12:59:00', '16/07/2011 13:00:00', '16/07/2011 13:01:00', '16/07/2011 13:02:00', '16/07/2011 13:03:00', '16/07/2011 13:04:00', '16/07/2011 13:05:00', '16/07/2011 13:06:00', '16/07/2011 13:07:00', '16/07/2011 13:08:00', '16/07/2011 13:09:00', '16/07/2011 13:10:00', '16/07/2011 13:11:00', '16/07/2011 13:12:00', '16/07/2011 13:13:00', '16/07/2011 13:14:00', '16/07/2011 13:15:00', '16/07/2011 13:16:00', '16/07/2011 13:17:00', '16/07/2011 13:18:00', '16/07/2011 13:19:00', '16/07/2011 
13:20:00', '16/07/2011 13:21:00', '16/07/2011 13:22:00', '16/07/2011 13:23:00', '16/07/2011 13:24:00', '16/07/2011 13:25:00', '16/07/2011 13:26:00', '16/07/2011 13:27:00', '16/07/2011 13:28:00', '16/07/2011 13:29:00', '16/07/2011 13:30:00', '16/07/2011 13:31:00', '16/07/2011 13:32:00', '16/07/2011 13:33:00', '16/07/2011 13:34:00', '16/07/2011 13:35:00', '16/07/2011 13:36:00', '16/07/2011 13:37:00', '16/07/2011 13:38:00', '16/07/2011 13:39:00', '16/07/2011 13:40:00', '16/07/2011 13:41:00', '16/07/2011 13:42:00', '16/07/2011 13:43:00', '16/07/2011 13:44:00', '16/07/2011 13:45:00', '16/07/2011 13:46:00', '16/07/2011 13:47:00', '16/07/2011 13:48:00', '16/07/2011 13:49:00', '16/07/2011 13:50:00', '16/07/2011 13:51:00', '16/07/2011 13:52:00', '16/07/2011 13:53:00', '16/07/2011 13:54:00', '16/07/2011 13:55:00', '16/07/2011 13:56:00', '16/07/2011 13:57:00', '16/07/2011 13:58:00', '16/07/2011 13:59:00', '16/07/2011 14:00:00', '16/07/2011 14:01:00', '16/07/2011 14:02:00', '16/07/2011 14:03:00', '16/07/2011 14:04:00', '16/07/2011 14:05:00', '16/07/2011 14:06:00', '16/07/2011 14:07:00', '16/07/2011 14:08:00', '16/07/2011 14:09:00', '16/07/2011 14:10:00', '16/07/2011 14:11:00', '16/07/2011 14:12:00', '16/07/2011 14:13:00', '16/07/2011 14:14:00', '16/07/2011 14:15:00', '16/07/2011 14:16:00', '16/07/2011 14:17:00', '16/07/2011 14:18:00', '16/07/2011 14:19:00', '16/07/2011 14:20:00', '16/07/2011 14:21:00', '16/07/2011 14:22:00', '16/07/2011 14:23:00', '16/07/2011 14:24:00', '16/07/2011 14:25:00', '16/07/2011 14:26:00', '16/07/2011 14:27:00', '16/07/2011 14:28:00', '16/07/2011 14:29:00', '16/07/2011 14:30:00', '16/07/2011 14:31:00', '16/07/2011 14:32:00', '16/07/2011 14:33:00', '16/07/2011 14:34:00', '16/07/2011 14:35:00', '16/07/2011 14:36:00', '16/07/2011 14:37:00', '16/07/2011 14:38:00', '16/07/2011 14:39:00', '16/07/2011 14:40:00', '16/07/2011 14:41:00', '16/07/2011 14:42:00', '16/07/2011 14:43:00', '16/07/2011 14:44:00', '16/07/2011 14:45:00', '16/07/2011 14:46:00', 
'16/07/2011 14:47:00', '16/07/2011 14:48:00', '16/07/2011 14:49:00', '16/07/2011 14:50:00', '16/07/2011 14:51:00', '16/07/2011 14:52:00', '16/07/2011 14:53:00', '16/07/2011 14:54:00', '16/07/2011 14:55:00', '16/07/2011 14:56:00', '16/07/2011 14:57:00', '16/07/2011 14:58:00', '16/07/2011 14:59:00', '16/07/2011 15:00:00', '16/07/2011 15:01:00', '16/07/2011 15:02:00', '16/07/2011 15:03:00', '16/07/2011 15:04:00', '16/07/2011 15:05:00', '16/07/2011 15:06:00', '16/07/2011 15:07:00', '16/07/2011 15:08:00', '16/07/2011 15:09:00', '16/07/2011 15:10:00', '16/07/2011 15:11:00', '16/07/2011 15:12:00', '16/07/2011 15:13:00', '16/07/2011 15:14:00', '16/07/2011 15:15:00', '16/07/2011 15:16:00', '16/07/2011 15:17:00', '16/07/2011 15:18:00', '16/07/2011 15:19:00', '16/07/2011 15:20:00', '16/07/2011 15:21:00', '16/07/2011 15:22:00', '16/07/2011 15:23:00', '16/07/2011 15:24:00', '16/07/2011 15:25:00', '16/07/2011 15:26:00', '16/07/2011 15:27:00', '16/07/2011 15:28:00', '16/07/2011 15:29:00', '16/07/2011 15:30:00', '16/07/2011 15:31:00', '16/07/2011 15:32:00', '16/07/2011 15:33:00', '16/07/2011 15:34:00', '16/07/2011 15:35:00', '16/07/2011 15:36:00', '16/07/2011 15:37:00', '16/07/2011 15:38:00', '16/07/2011 15:39:00', '16/07/2011 15:40:00', '16/07/2011 15:41:00', '16/07/2011 15:42:00', '16/07/2011 15:43:00', '16/07/2011 15:44:00', '16/07/2011 15:45:00', '16/07/2011 15:46:00', '16/07/2011 15:47:00', '16/07/2011 15:48:00', '16/07/2011 15:49:00', '16/07/2011 15:50:00', '16/07/2011 15:51:00', '16/07/2011 15:52:00', '16/07/2011 15:53:00', '16/07/2011 15:54:00', '16/07/2011 15:55:00', '16/07/2011 15:56:00', '16/07/2011 15:57:00', '16/07/2011 15:58:00', '16/07/2011 15:59:00', '16/07/2011 16:00:00', '16/07/2011 16:01:00', '16/07/2011 16:02:00', '16/07/2011 16:03:00', '16/07/2011 16:04:00', '16/07/2011 16:05:00', '16/07/2011 16:06:00', '16/07/2011 16:07:00', '16/07/2011 16:08:00', '16/07/2011 16:09:00', '16/07/2011 16:10:00', '16/07/2011 16:11:00', '16/07/2011 16:12:00', '16/07/2011 
16:13:00', '16/07/2011 16:14:00', '16/07/2011 16:15:00', '16/07/2011 16:16:00', '16/07/2011 16:17:00', '16/07/2011 16:18:00', '16/07/2011 16:19:00', '16/07/2011 16:20:00', '16/07/2011 16:21:00', '16/07/2011 16:22:00', '16/07/2011 16:23:00', '16/07/2011 16:24:00', '16/07/2011 16:25:00', '16/07/2011 16:26:00', '16/07/2011 16:27:00', '16/07/2011 16:28:00', '16/07/2011 16:29:00', '16/07/2011 16:30:00', '16/07/2011 16:31:00', '16/07/2011 16:32:00', '16/07/2011 16:33:00', '16/07/2011 16:34:00', '16/07/2011 16:35:00', '16/07/2011 16:36:00', '16/07/2011 16:37:00', '16/07/2011 16:38:00', '16/07/2011 16:39:00', '16/07/2011 16:40:00', '16/07/2011 16:41:00', '16/07/2011 16:42:00', '16/07/2011 16:43:00', '16/07/2011 16:44:00', '16/07/2011 16:45:00', '16/07/2011 16:46:00', '16/07/2011 16:47:00', '16/07/2011 16:48:00', '16/07/2011 16:49:00', '16/07/2011 16:50:00', '16/07/2011 16:51:00', '16/07/2011 16:52:00', '16/07/2011 16:53:00', '16/07/2011 16:54:00', '16/07/2011 16:55:00', '16/07/2011 16:56:00', '16/07/2011 16:57:00', '16/07/2011 16:58:00', '16/07/2011 16:59:00', '16/07/2011 17:00:00', '16/07/2011 17:01:00', '16/07/2011 17:02:00', '16/07/2011 17:03:00', '16/07/2011 17:04:00', '16/07/2011 17:05:00', '16/07/2011 17:06:00', '16/07/2011 17:07:00', '16/07/2011 17:08:00', '16/07/2011 17:09:00', '16/07/2011 17:10:00', '16/07/2011 17:11:00', '16/07/2011 17:12:00', '16/07/2011 17:13:00', '16/07/2011 17:14:00', '16/07/2011 17:15:00', '16/07/2011 17:16:00', '16/07/2011 17:17:00', '16/07/2011 17:18:00', '16/07/2011 17:19:00', '16/07/2011 17:20:00', '16/07/2011 17:21:00', '16/07/2011 17:22:00', '16/07/2011 17:23:00', '16/07/2011 17:24:00', '16/07/2011 17:25:00', '16/07/2011 17:26:00', '16/07/2011 17:27:00', '16/07/2011 17:28:00', '16/07/2011 17:29:00', '16/07/2011 17:30:00', '16/07/2011 17:31:00', '16/07/2011 17:32:00', '16/07/2011 17:33:00', '16/07/2011 17:34:00', '16/07/2011 17:35:00', '16/07/2011 17:36:00', '16/07/2011 17:37:00', '16/07/2011 17:38:00', '16/07/2011 17:39:00', 
'16/07/2011 17:40:00', '16/07/2011 17:41:00', '16/07/2011 17:42:00', '16/07/2011 17:43:00', '16/07/2011 17:44:00', '16/07/2011 17:45:00', '16/07/2011 17:46:00', '16/07/2011 17:47:00', '16/07/2011 17:48:00', '16/07/2011 17:49:00', '16/07/2011 17:50:00', '16/07/2011 17:51:00', '16/07/2011 17:52:00', '16/07/2011 17:53:00', '16/07/2011 17:54:00', '16/07/2011 17:55:00', '16/07/2011 17:56:00', '16/07/2011 17:57:00', '16/07/2011 17:58:00', '16/07/2011 17:59:00', '16/07/2011 18:00:00', '16/07/2011 18:01:00', '16/07/2011 18:02:00', '16/07/2011 18:03:00', '16/07/2011 18:04:00', '16/07/2011 18:05:00', '16/07/2011 18:06:00', '16/07/2011 18:07:00', '16/07/2011 18:08:00', '16/07/2011 18:09:00', '16/07/2011 18:10:00', '16/07/2011 18:11:00', '16/07/2011 18:12:00', '16/07/2011 18:13:00', '16/07/2011 18:14:00', '16/07/2011 18:15:00', '16/07/2011 18:16:00', '16/07/2011 18:17:00', '16/07/2011 18:18:00', '16/07/2011 18:19:00', '16/07/2011 18:20:00', '16/07/2011 18:21:00', '16/07/2011 18:22:00', '16/07/2011 18:23:00', '16/07/2011 18:24:00', '16/07/2011 18:25:00', '16/07/2011 18:26:00', '16/07/2011 18:27:00', '16/07/2011 18:28:00', '16/07/2011 18:29:00', '16/07/2011 18:30:00', '16/07/2011 18:31:00', '16/07/2011 18:32:00', '16/07/2011 18:33:00', '16/07/2011 18:34:00', '16/07/2011 18:35:00', '16/07/2011 18:36:00', '16/07/2011 18:37:00', '16/07/2011 18:38:00', '16/07/2011 18:39:00', '16/07/2011 18:40:00', '16/07/2011 18:41:00', '16/07/2011 18:42:00', '16/07/2011 18:43:00', '16/07/2011 18:44:00', '16/07/2011 18:45:00', '16/07/2011 18:46:00', '16/07/2011 18:47:00', '16/07/2011 18:48:00', '16/07/2011 18:49:00', '16/07/2011 18:50:00', '16/07/2011 18:51:00', '16/07/2011 18:52:00', '16/07/2011 18:53:00', '16/07/2011 18:54:00', '16/07/2011 18:55:00', '16/07/2011 18:56:00', '16/07/2011 18:57:00', '16/07/2011 18:58:00', '16/07/2011 18:59:00', '16/07/2011 19:00:00', '16/07/2011 19:01:00', '16/07/2011 19:02:00', '16/07/2011 19:03:00', '16/07/2011 19:04:00', '16/07/2011 19:05:00', '16/07/2011 
19:06:00', '16/07/2011 19:07:00', '16/07/2011 19:08:00', '16/07/2011 19:09:00', '16/07/2011 19:10:00', '16/07/2011 19:11:00', '16/07/2011 19:12:00', '16/07/2011 19:13:00', '16/07/2011 19:14:00', '16/07/2011 19:15:00', '16/07/2011 19:16:00', '16/07/2011 19:17:00', '16/07/2011 19:18:00', '16/07/2011 19:19:00', '16/07/2011 19:20:00', '16/07/2011 19:21:00', '16/07/2011 19:22:00', '16/07/2011 19:23:00', '16/07/2011 19:24:00', '16/07/2011 19:25:00', '16/07/2011 19:26:00', '16/07/2011 19:27:00', '16/07/2011 19:28:00', '16/07/2011 19:29:00', '16/07/2011 19:30:00', '16/07/2011 19:31:00', '16/07/2011 19:32:00', '16/07/2011 19:33:00', '16/07/2011 19:34:00', '16/07/2011 19:35:00', '16/07/2011 19:36:00', '16/07/2011 19:37:00', '16/07/2011 19:38:00', '16/07/2011 19:39:00', '16/07/2011 19:40:00', '16/07/2011 19:41:00', '16/07/2011 19:42:00', '16/07/2011 19:43:00', '16/07/2011 19:44:00', '16/07/2011 19:45:00', '16/07/2011 19:46:00', '16/07/2011 19:47:00', '16/07/2011 19:48:00', '16/07/2011 19:49:00', '16/07/2011 19:50:00', '16/07/2011 19:51:00', '16/07/2011 19:52:00', '16/07/2011 19:53:00', '16/07/2011 19:54:00', '16/07/2011 19:55:00', '16/07/2011 19:56:00', '16/07/2011 19:57:00', '16/07/2011 19:58:00', '16/07/2011 19:59:00', '16/07/2011 20:00:00', '16/07/2011 20:01:00', '16/07/2011 20:02:00', '16/07/2011 20:03:00', '16/07/2011 20:04:00', '16/07/2011 20:05:00', '16/07/2011 20:06:00', '16/07/2011 20:07:00', '16/07/2011 20:08:00', '16/07/2011 20:09:00', '16/07/2011 20:10:00', '16/07/2011 20:11:00', '16/07/2011 20:12:00', '16/07/2011 20:13:00', '16/07/2011 20:14:00', '16/07/2011 20:15:00', '16/07/2011 20:16:00', '16/07/2011 20:17:00', '16/07/2011 20:18:00', '16/07/2011 20:19:00', '16/07/2011 20:20:00', '16/07/2011 20:21:00', '16/07/2011 20:22:00', '16/07/2011 20:23:00', '16/07/2011 20:24:00', '16/07/2011 20:25:00', '16/07/2011 20:26:00', '16/07/2011 20:27:00', '16/07/2011 20:28:00', '16/07/2011 20:29:00', '16/07/2011 20:30:00', '16/07/2011 20:31:00', '16/07/2011 20:32:00', 
'16/07/2011 20:33:00', '16/07/2011 20:34:00', '16/07/2011 20:35:00', '16/07/2011 20:36:00', '16/07/2011 20:37:00', '16/07/2011 20:38:00', '16/07/2011 20:39:00', '16/07/2011 20:40:00', '16/07/2011 20:41:00', '16/07/2011 20:42:00', '16/07/2011 20:43:00', '16/07/2011 20:44:00', '16/07/2011 20:45:00', '16/07/2011 20:46:00', '16/07/2011 20:47:00', '16/07/2011 20:48:00', '16/07/2011 20:49:00', '16/07/2011 20:50:00', '16/07/2011 20:51:00', '16/07/2011 20:52:00', '16/07/2011 20:53:00', '16/07/2011 20:54:00', '16/07/2011 20:55:00', '16/07/2011 20:56:00', '16/07/2011 20:57:00', '16/07/2011 20:58:00', '16/07/2011 20:59:00', '16/07/2011 21:00:00', '16/07/2011 21:01:00', '16/07/2011 21:02:00', '16/07/2011 21:03:00', '16/07/2011 21:04:00', '16/07/2011 21:05:00', '16/07/2011 21:06:00', '16/07/2011 21:07:00', '16/07/2011 21:08:00', '16/07/2011 21:09:00', '16/07/2011 21:10:00', '16/07/2011 21:11:00', '16/07/2011 21:12:00', '16/07/2011 21:13:00', '16/07/2011 21:14:00', '16/07/2011 21:15:00', '16/07/2011 21:16:00', '16/07/2011 21:17:00', '16/07/2011 21:18:00', '16/07/2011 21:19:00', '16/07/2011 21:20:00', '16/07/2011 21:21:00', '16/07/2011 21:22:00', '16/07/2011 21:23:00', '16/07/2011 21:24:00', '16/07/2011 21:25:00', '16/07/2011 21:26:00', '16/07/2011 21:27:00', '16/07/2011 21:28:00', '16/07/2011 21:29:00', '16/07/2011 21:30:00', '16/07/2011 21:31:00', '16/07/2011 21:32:00', '16/07/2011 21:33:00', '16/07/2011 21:34:00', '16/07/2011 21:35:00', '16/07/2011 21:36:00', '16/07/2011 21:37:00', '16/07/2011 21:38:00', '16/07/2011 21:39:00', '16/07/2011 21:40:00', '16/07/2011 21:41:00', '16/07/2011 21:42:00', '16/07/2011 21:43:00', '16/07/2011 21:44:00', '16/07/2011 21:45:00', '16/07/2011 21:46:00', '16/07/2011 21:47:00', '16/07/2011 21:48:00', '16/07/2011 21:49:00', '16/07/2011 21:50:00', '16/07/2011 21:51:00', '16/07/2011 21:52:00', '16/07/2011 21:53:00', '16/07/2011 21:54:00', '16/07/2011 21:55:00', '16/07/2011 21:56:00', '16/07/2011 21:57:00', '16/07/2011 21:58:00', '16/07/2011 
21:59:00', '16/07/2011 22:00:00', '16/07/2011 22:01:00', '16/07/2011 22:02:00', '16/07/2011 22:03:00', '16/07/2011 22:04:00', '16/07/2011 22:05:00', '16/07/2011 22:06:00', '16/07/2011 22:07:00', '16/07/2011 22:08:00', '16/07/2011 22:09:00', '16/07/2011 22:10:00', '16/07/2011 22:11:00', '16/07/2011 22:12:00', '16/07/2011 22:13:00', '16/07/2011 22:14:00', '16/07/2011 22:15:00', '16/07/2011 22:16:00', '16/07/2011 22:17:00', '16/07/2011 22:18:00', '16/07/2011 22:19:00', '16/07/2011 22:20:00', '16/07/2011 22:21:00', '16/07/2011 22:22:00', '16/07/2011 22:23:00', '16/07/2011 22:24:00', '16/07/2011 22:25:00', '16/07/2011 22:26:00', '16/07/2011 22:27:00', '16/07/2011 22:28:00', '16/07/2011 22:29:00', '16/07/2011 22:30:00', '16/07/2011 22:31:00', '16/07/2011 22:32:00', '16/07/2011 22:33:00', '16/07/2011 22:34:00', '16/07/2011 22:35:00', '16/07/2011 22:36:00', '16/07/2011 22:37:00', '16/07/2011 22:38:00', '16/07/2011 22:39:00', '16/07/2011 22:40:00', '16/07/2011 22:41:00', '16/07/2011 22:42:00', '16/07/2011 22:43:00', '16/07/2011 22:44:00', '16/07/2011 22:45:00', '16/07/2011 22:46:00', '16/07/2011 22:47:00', '16/07/2011 22:48:00', '16/07/2011 22:49:00', '16/07/2011 22:50:00', '16/07/2011 22:51:00', '16/07/2011 22:52:00', '16/07/2011 22:53:00', '16/07/2011 22:54:00', '16/07/2011 22:55:00', '16/07/2011 22:56:00', '16/07/2011 22:57:00', '16/07/2011 22:58:00', '16/07/2011 22:59:00', '16/07/2011 23:00:00', '16/07/2011 23:01:00', '16/07/2011 23:02:00', '16/07/2011 23:03:00', '16/07/2011 23:04:00', '16/07/2011 23:05:00', '16/07/2011 23:06:00', '16/07/2011 23:07:00', '16/07/2011 23:08:00', '16/07/2011 23:09:00', '16/07/2011 23:10:00', '16/07/2011 23:11:00', '16/07/2011 23:12:00', '16/07/2011 23:13:00', '16/07/2011 23:14:00', '16/07/2011 23:15:00', '16/07/2011 23:16:00', '16/07/2011 23:17:00', '16/07/2011 23:18:00', '16/07/2011 23:19:00', '16/07/2011 23:20:00', '16/07/2011 23:21:00', '16/07/2011 23:22:00', '16/07/2011 23:23:00', '16/07/2011 23:24:00', '16/07/2011 23:25:00', 
'16/07/2011 23:26:00', ..., '18/07/2011 6:26:00',
# NOTE: 1,861 contiguous one-minute timestamps ('DD/MM/YYYY H:MM:SS', hour not zero-padded)
# running from 16/07/2011 23:26:00 through 18/07/2011 6:26:00, elided here for brevity.
'18/07/2011 6:27:00', '18/07/2011 6:28:00', '18/07/2011 6:29:00', '18/07/2011 6:30:00', '18/07/2011 6:31:00', '18/07/2011 6:32:00', '18/07/2011 6:33:00', '18/07/2011 6:34:00', '18/07/2011 6:35:00', '18/07/2011 6:36:00', '18/07/2011 6:37:00', '18/07/2011 6:38:00', '18/07/2011 6:39:00', '18/07/2011 6:40:00', '18/07/2011 6:41:00', '18/07/2011 6:42:00', '18/07/2011 6:43:00', '18/07/2011 6:44:00', '18/07/2011 6:45:00', '18/07/2011 6:46:00', '18/07/2011 6:47:00', '18/07/2011 6:48:00', '18/07/2011 6:49:00', '18/07/2011 6:50:00', '18/07/2011 6:51:00', '18/07/2011 6:52:00', '18/07/2011 6:53:00', '18/07/2011 6:54:00', '18/07/2011 6:55:00', '18/07/2011 6:56:00', '18/07/2011 6:57:00', '18/07/2011 6:58:00', '18/07/2011 6:59:00', '18/07/2011 7:00:00', '18/07/2011 7:01:00', '18/07/2011 7:02:00', '18/07/2011 7:03:00', '18/07/2011 7:04:00', '18/07/2011 7:05:00', '18/07/2011 7:06:00', '18/07/2011 7:07:00', '18/07/2011 7:08:00', '18/07/2011 7:09:00', '18/07/2011 7:10:00', '18/07/2011 7:11:00', '18/07/2011 7:12:00', '18/07/2011 7:13:00', '18/07/2011 7:14:00', '18/07/2011 7:15:00', '18/07/2011 7:16:00', '18/07/2011 7:17:00', '18/07/2011 7:18:00', '18/07/2011 7:19:00', '18/07/2011 7:20:00', '18/07/2011 7:21:00', '18/07/2011 7:22:00', '18/07/2011 7:23:00', '18/07/2011 7:24:00', '18/07/2011 7:25:00', '18/07/2011 7:26:00', '18/07/2011 7:27:00', '18/07/2011 7:28:00', '18/07/2011 7:29:00', '18/07/2011 7:30:00', '18/07/2011 7:31:00', '18/07/2011 7:32:00', '18/07/2011 7:33:00', '18/07/2011 7:34:00', '18/07/2011 7:35:00', '18/07/2011 7:36:00', '18/07/2011 7:37:00', '18/07/2011 7:38:00', '18/07/2011 7:39:00', '18/07/2011 7:40:00', '18/07/2011 7:41:00', '18/07/2011 7:42:00', '18/07/2011 7:43:00', '18/07/2011 7:44:00', '18/07/2011 7:45:00', '18/07/2011 7:46:00', '18/07/2011 7:47:00', '18/07/2011 7:48:00', '18/07/2011 7:49:00', '18/07/2011 7:50:00', '18/07/2011 7:51:00', '18/07/2011 7:52:00', '18/07/2011 7:53:00', '18/07/2011 7:54:00', '18/07/2011 7:55:00', '18/07/2011 7:56:00', '18/07/2011 
7:57:00', '18/07/2011 7:58:00', '18/07/2011 7:59:00', '18/07/2011 8:00:00', '18/07/2011 8:01:00', '18/07/2011 8:02:00', '18/07/2011 8:03:00', '18/07/2011 8:04:00', '18/07/2011 8:05:00', '18/07/2011 8:06:00', '18/07/2011 8:07:00', '18/07/2011 8:08:00', '18/07/2011 8:09:00', '18/07/2011 8:10:00', '18/07/2011 8:11:00', '18/07/2011 8:12:00', '18/07/2011 8:13:00', '18/07/2011 8:14:00', '18/07/2011 8:15:00', '18/07/2011 8:16:00', '18/07/2011 8:17:00', '18/07/2011 8:18:00', '18/07/2011 8:19:00', '18/07/2011 8:20:00', '18/07/2011 8:21:00', '18/07/2011 8:22:00', '18/07/2011 8:23:00', '18/07/2011 8:24:00', '18/07/2011 8:25:00', '18/07/2011 8:26:00', '18/07/2011 8:27:00', '18/07/2011 8:28:00', '18/07/2011 8:29:00', '18/07/2011 8:30:00', '18/07/2011 8:31:00', '18/07/2011 8:32:00', '18/07/2011 8:33:00', '18/07/2011 8:34:00', '18/07/2011 8:35:00', '18/07/2011 8:36:00', '18/07/2011 8:37:00', '18/07/2011 8:38:00', '18/07/2011 8:39:00', '18/07/2011 8:40:00', '18/07/2011 8:41:00', '18/07/2011 8:42:00', '18/07/2011 8:43:00', '18/07/2011 8:44:00', '18/07/2011 8:45:00', '18/07/2011 8:46:00', '18/07/2011 8:47:00', '18/07/2011 8:48:00', '18/07/2011 8:49:00', '18/07/2011 8:50:00', '18/07/2011 8:51:00', '18/07/2011 8:52:00', '18/07/2011 8:53:00', '18/07/2011 8:54:00', '18/07/2011 8:55:00', '18/07/2011 8:56:00', '18/07/2011 8:57:00', '18/07/2011 8:58:00', '18/07/2011 8:59:00', '18/07/2011 9:00:00', '18/07/2011 9:01:00', '18/07/2011 9:02:00', '18/07/2011 9:03:00', '18/07/2011 9:04:00', '18/07/2011 9:05:00', '18/07/2011 9:06:00', '18/07/2011 9:07:00', '18/07/2011 9:08:00', '18/07/2011 9:09:00', '18/07/2011 9:10:00', '18/07/2011 9:11:00', '18/07/2011 9:12:00', '18/07/2011 9:13:00', '18/07/2011 9:14:00', '18/07/2011 9:15:00', '18/07/2011 9:16:00', '18/07/2011 9:17:00', '18/07/2011 9:18:00', '18/07/2011 9:19:00', '18/07/2011 9:20:00', '18/07/2011 9:21:00', '18/07/2011 9:22:00', '18/07/2011 9:23:00', '18/07/2011 9:24:00', '18/07/2011 9:25:00', '18/07/2011 9:26:00', '18/07/2011 9:27:00', 
'18/07/2011 9:28:00', '18/07/2011 9:29:00', '18/07/2011 9:30:00', '18/07/2011 9:31:00', '18/07/2011 9:32:00', '18/07/2011 9:33:00', '18/07/2011 9:34:00', '18/07/2011 9:35:00', '18/07/2011 9:36:00', '18/07/2011 9:37:00', '18/07/2011 9:38:00', '18/07/2011 9:39:00', '18/07/2011 9:40:00', '18/07/2011 9:41:00', '18/07/2011 9:42:00', '18/07/2011 9:43:00', '18/07/2011 9:44:00', '18/07/2011 9:45:00', '18/07/2011 9:46:00', '18/07/2011 9:47:00', '18/07/2011 9:48:00', '18/07/2011 9:49:00', '18/07/2011 9:50:00', '18/07/2011 9:51:00', '18/07/2011 9:52:00', '18/07/2011 9:53:00', '18/07/2011 9:54:00', '18/07/2011 9:55:00', '18/07/2011 9:56:00', '18/07/2011 9:57:00', '18/07/2011 9:58:00', '18/07/2011 9:59:00', '18/07/2011 10:00:00', '18/07/2011 10:01:00', '18/07/2011 10:02:00', '18/07/2011 10:03:00', '18/07/2011 10:04:00', '18/07/2011 10:05:00', '18/07/2011 10:06:00', '18/07/2011 10:07:00', '18/07/2011 10:08:00', '18/07/2011 10:09:00', '18/07/2011 10:10:00', '18/07/2011 10:11:00', '18/07/2011 10:12:00', '18/07/2011 10:13:00', '18/07/2011 10:14:00', '18/07/2011 10:15:00', '18/07/2011 10:16:00', '18/07/2011 10:17:00', '18/07/2011 10:18:00', '18/07/2011 10:19:00', '18/07/2011 10:20:00', '18/07/2011 10:21:00', '18/07/2011 10:22:00', '18/07/2011 10:23:00', '18/07/2011 10:24:00', '18/07/2011 10:25:00', '18/07/2011 10:26:00', '18/07/2011 10:27:00', '18/07/2011 10:28:00', '18/07/2011 10:29:00', '18/07/2011 10:30:00', '18/07/2011 10:31:00', '18/07/2011 10:32:00', '18/07/2011 10:33:00', '18/07/2011 10:34:00', '18/07/2011 10:35:00', '18/07/2011 10:36:00', '18/07/2011 10:37:00', '18/07/2011 10:38:00', '18/07/2011 10:39:00', '18/07/2011 10:40:00', '18/07/2011 10:41:00', '18/07/2011 10:42:00', '18/07/2011 10:43:00', '18/07/2011 10:44:00', '18/07/2011 10:45:00', '18/07/2011 10:46:00', '18/07/2011 10:47:00', '18/07/2011 10:48:00', '18/07/2011 10:49:00', '18/07/2011 10:50:00', '18/07/2011 10:51:00', '18/07/2011 10:52:00', '18/07/2011 10:53:00', '18/07/2011 10:54:00', '18/07/2011 10:55:00', 
'18/07/2011 10:56:00', '18/07/2011 10:57:00', '18/07/2011 10:58:00', '18/07/2011 10:59:00', '18/07/2011 11:00:00', '18/07/2011 11:01:00', '18/07/2011 11:02:00', '18/07/2011 11:03:00', '18/07/2011 11:04:00', '18/07/2011 11:05:00', '18/07/2011 11:06:00', '18/07/2011 11:07:00', '18/07/2011 11:08:00', '18/07/2011 11:09:00', '18/07/2011 11:10:00', '18/07/2011 11:11:00', '18/07/2011 11:12:00', '18/07/2011 11:13:00', '18/07/2011 11:14:00', '18/07/2011 11:15:00', '18/07/2011 11:16:00', '18/07/2011 11:17:00', '18/07/2011 11:18:00', '18/07/2011 11:19:00', '18/07/2011 11:20:00', '18/07/2011 11:21:00', '18/07/2011 11:22:00', '18/07/2011 11:23:00', '18/07/2011 11:24:00', '18/07/2011 11:25:00', '18/07/2011 11:26:00', '18/07/2011 11:27:00', '18/07/2011 11:28:00', '18/07/2011 11:29:00', '18/07/2011 11:30:00', '18/07/2011 11:31:00', '18/07/2011 11:32:00', '18/07/2011 11:33:00', '18/07/2011 11:34:00', '18/07/2011 11:35:00', '18/07/2011 11:36:00', '18/07/2011 11:37:00', '18/07/2011 11:38:00', '18/07/2011 11:39:00', '18/07/2011 11:40:00', '18/07/2011 11:41:00', '18/07/2011 11:42:00', '18/07/2011 11:43:00', '18/07/2011 11:44:00', '18/07/2011 11:45:00', '18/07/2011 11:46:00', '18/07/2011 11:47:00', '18/07/2011 11:48:00', '18/07/2011 11:49:00', '18/07/2011 11:50:00', '18/07/2011 11:51:00', '18/07/2011 11:52:00', '18/07/2011 11:53:00', '18/07/2011 11:54:00', '18/07/2011 11:55:00', '18/07/2011 11:56:00', '18/07/2011 11:57:00', '18/07/2011 11:58:00', '18/07/2011 11:59:00', '18/07/2011 12:00:00', '18/07/2011 12:01:00', '18/07/2011 12:02:00', '18/07/2011 12:03:00', '18/07/2011 12:04:00', '18/07/2011 12:05:00', '18/07/2011 12:06:00', '18/07/2011 12:07:00', '18/07/2011 12:08:00', '18/07/2011 12:09:00', '18/07/2011 12:10:00', '18/07/2011 12:11:00', '18/07/2011 12:12:00', '18/07/2011 12:13:00', '18/07/2011 12:14:00', '18/07/2011 12:15:00', '18/07/2011 12:16:00', '18/07/2011 12:17:00', '18/07/2011 12:18:00', '18/07/2011 12:19:00', '18/07/2011 12:20:00', '18/07/2011 12:21:00', '18/07/2011 
12:22:00', '18/07/2011 12:23:00', '18/07/2011 12:24:00', '18/07/2011 12:25:00', '18/07/2011 12:26:00', '18/07/2011 12:27:00', '18/07/2011 12:28:00', '18/07/2011 12:29:00', '18/07/2011 12:30:00', '18/07/2011 12:31:00', '18/07/2011 12:32:00', '18/07/2011 12:33:00', '18/07/2011 12:34:00', '18/07/2011 12:35:00', '18/07/2011 12:36:00', '18/07/2011 12:37:00', '18/07/2011 12:38:00', '18/07/2011 12:39:00', '18/07/2011 12:40:00', '18/07/2011 12:41:00', '18/07/2011 12:42:00', '18/07/2011 12:43:00', '18/07/2011 12:44:00', '18/07/2011 12:45:00', '18/07/2011 12:46:00', '18/07/2011 12:47:00', '18/07/2011 12:48:00', '18/07/2011 12:49:00', '18/07/2011 12:50:00', '18/07/2011 12:51:00', '18/07/2011 12:52:00', '18/07/2011 12:53:00', '18/07/2011 12:54:00', '18/07/2011 12:55:00', '18/07/2011 12:56:00', '18/07/2011 12:57:00', '18/07/2011 12:58:00', '18/07/2011 12:59:00', '18/07/2011 13:00:00', '18/07/2011 13:01:00', '18/07/2011 13:02:00', '18/07/2011 13:03:00', '18/07/2011 13:04:00', '18/07/2011 13:05:00', '18/07/2011 13:06:00', '18/07/2011 13:07:00', '18/07/2011 13:08:00', '18/07/2011 13:09:00', '18/07/2011 13:10:00', '18/07/2011 13:11:00', '18/07/2011 13:12:00', '18/07/2011 13:13:00', '18/07/2011 13:14:00', '18/07/2011 13:15:00', '18/07/2011 13:16:00', '18/07/2011 13:17:00', '18/07/2011 13:18:00', '18/07/2011 13:19:00', '18/07/2011 13:20:00', '18/07/2011 13:21:00', '18/07/2011 13:22:00', '18/07/2011 13:23:00', '18/07/2011 13:24:00', '18/07/2011 13:25:00', '18/07/2011 13:26:00', '18/07/2011 13:27:00', '18/07/2011 13:28:00', '18/07/2011 13:29:00', '18/07/2011 13:30:00', '18/07/2011 13:31:00', '18/07/2011 13:32:00', '18/07/2011 13:33:00', '18/07/2011 13:34:00', '18/07/2011 13:35:00', '18/07/2011 13:36:00', '18/07/2011 13:37:00', '18/07/2011 13:38:00', '18/07/2011 13:39:00', '18/07/2011 13:40:00', '18/07/2011 13:41:00', '18/07/2011 13:42:00', '18/07/2011 13:43:00', '18/07/2011 13:44:00', '18/07/2011 13:45:00', '18/07/2011 13:46:00', '18/07/2011 13:47:00', '18/07/2011 13:48:00', 
'18/07/2011 13:49:00', '18/07/2011 13:50:00', '18/07/2011 13:51:00', '18/07/2011 13:52:00', '18/07/2011 13:53:00', '18/07/2011 13:54:00', '18/07/2011 13:55:00', '18/07/2011 13:56:00', '18/07/2011 13:57:00', '18/07/2011 13:58:00', '18/07/2011 13:59:00', '18/07/2011 14:00:00', '18/07/2011 14:01:00', '18/07/2011 14:02:00', '18/07/2011 14:03:00', '18/07/2011 14:04:00', '18/07/2011 14:05:00', '18/07/2011 14:06:00', '18/07/2011 14:07:00', '18/07/2011 14:08:00', '18/07/2011 14:09:00', '18/07/2011 14:10:00', '18/07/2011 14:11:00', '18/07/2011 14:12:00', '18/07/2011 14:13:00', '18/07/2011 14:14:00', '18/07/2011 14:15:00', '18/07/2011 14:16:00', '18/07/2011 14:17:00', '18/07/2011 14:18:00', '18/07/2011 14:19:00', '18/07/2011 14:20:00', '18/07/2011 14:21:00', '18/07/2011 14:22:00', '18/07/2011 14:23:00', '18/07/2011 14:24:00', '18/07/2011 14:25:00', '18/07/2011 14:26:00', '18/07/2011 14:27:00', '18/07/2011 14:28:00', '18/07/2011 14:29:00', '18/07/2011 14:30:00', '18/07/2011 14:31:00', '18/07/2011 14:32:00', '18/07/2011 14:33:00', '18/07/2011 14:34:00', '18/07/2011 14:35:00', '18/07/2011 14:36:00', '18/07/2011 14:37:00', '18/07/2011 14:38:00', '18/07/2011 14:39:00', '18/07/2011 14:40:00', '18/07/2011 14:41:00', '18/07/2011 14:42:00', '18/07/2011 14:43:00', '18/07/2011 14:44:00', '18/07/2011 14:45:00', '18/07/2011 14:46:00', '18/07/2011 14:47:00', '18/07/2011 14:48:00', '18/07/2011 14:49:00', '18/07/2011 14:50:00', '18/07/2011 14:51:00', '18/07/2011 14:52:00', '18/07/2011 14:53:00', '18/07/2011 14:54:00', '18/07/2011 14:55:00', '18/07/2011 14:56:00', '18/07/2011 14:57:00', '18/07/2011 14:58:00', '18/07/2011 14:59:00', '18/07/2011 15:00:00', '18/07/2011 15:01:00', '18/07/2011 15:02:00', '18/07/2011 15:03:00', '18/07/2011 15:04:00', '18/07/2011 15:05:00', '18/07/2011 15:06:00', '18/07/2011 15:07:00', '18/07/2011 15:08:00', '18/07/2011 15:09:00', '18/07/2011 15:10:00', '18/07/2011 15:11:00', '18/07/2011 15:12:00', '18/07/2011 15:13:00', '18/07/2011 15:14:00', '18/07/2011 
15:15:00', '18/07/2011 15:16:00', '18/07/2011 15:17:00', '18/07/2011 15:18:00', '18/07/2011 15:19:00', '18/07/2011 15:20:00', '18/07/2011 15:21:00', '18/07/2011 15:22:00', '18/07/2011 15:23:00', '18/07/2011 15:24:00', '18/07/2011 15:25:00', '18/07/2011 15:26:00', '18/07/2011 15:27:00', '18/07/2011 15:28:00', '18/07/2011 15:29:00', '18/07/2011 15:30:00', '18/07/2011 15:31:00', '18/07/2011 15:32:00', '18/07/2011 15:33:00', '18/07/2011 15:34:00', '18/07/2011 15:35:00', '18/07/2011 15:36:00', '18/07/2011 15:37:00', '18/07/2011 15:38:00', '18/07/2011 15:39:00', '18/07/2011 15:40:00', '18/07/2011 15:41:00', '18/07/2011 15:42:00', '18/07/2011 15:43:00', '18/07/2011 15:44:00', '18/07/2011 15:45:00', '18/07/2011 15:46:00', '18/07/2011 15:47:00', '18/07/2011 15:48:00', '18/07/2011 15:49:00', '18/07/2011 15:50:00', '18/07/2011 15:51:00', '18/07/2011 15:52:00', '18/07/2011 15:53:00', '18/07/2011 15:54:00', '18/07/2011 15:55:00', '18/07/2011 15:56:00', '18/07/2011 15:57:00', '18/07/2011 15:58:00', '18/07/2011 15:59:00', '18/07/2011 16:00:00', '18/07/2011 16:01:00', '18/07/2011 16:02:00', '18/07/2011 16:03:00', '18/07/2011 16:04:00', '18/07/2011 16:05:00', '18/07/2011 16:06:00', '18/07/2011 16:07:00', '18/07/2011 16:08:00', '18/07/2011 16:09:00', '18/07/2011 16:10:00', '18/07/2011 16:11:00', '18/07/2011 16:12:00', '18/07/2011 16:13:00', '18/07/2011 16:14:00', '18/07/2011 16:15:00', '18/07/2011 16:16:00', '18/07/2011 16:17:00', '18/07/2011 16:18:00', '18/07/2011 16:19:00', '18/07/2011 16:20:00', '18/07/2011 16:21:00', '18/07/2011 16:22:00', '18/07/2011 16:23:00', '18/07/2011 16:24:00', '18/07/2011 16:25:00', '18/07/2011 16:26:00', '18/07/2011 16:27:00', '18/07/2011 16:28:00', '18/07/2011 16:29:00', '18/07/2011 16:30:00', '18/07/2011 16:31:00', '18/07/2011 16:32:00', '18/07/2011 16:33:00', '18/07/2011 16:34:00', '18/07/2011 16:35:00', '18/07/2011 16:36:00', '18/07/2011 16:37:00', '18/07/2011 16:38:00', '18/07/2011 16:39:00', '18/07/2011 16:40:00', '18/07/2011 16:41:00', 
'18/07/2011 16:42:00', '18/07/2011 16:43:00', '18/07/2011 16:44:00', '18/07/2011 16:45:00', '18/07/2011 16:46:00', '18/07/2011 16:47:00', '18/07/2011 16:48:00', '18/07/2011 16:49:00', '18/07/2011 16:50:00', '18/07/2011 16:51:00', '18/07/2011 16:52:00', '18/07/2011 16:53:00', '18/07/2011 16:54:00', '18/07/2011 16:55:00', '18/07/2011 16:56:00', '18/07/2011 16:57:00', '18/07/2011 16:58:00', '18/07/2011 16:59:00', '18/07/2011 17:00:00', '18/07/2011 17:01:00', '18/07/2011 17:02:00', '18/07/2011 17:03:00', '18/07/2011 17:04:00', '18/07/2011 17:05:00', '18/07/2011 17:06:00', '18/07/2011 17:07:00', '18/07/2011 17:08:00', '18/07/2011 17:09:00', '18/07/2011 17:10:00', '18/07/2011 17:11:00', '18/07/2011 17:12:00', '18/07/2011 17:13:00', '18/07/2011 17:14:00', '18/07/2011 17:15:00', '18/07/2011 17:16:00', '18/07/2011 17:17:00', '18/07/2011 17:18:00', '18/07/2011 17:19:00', '18/07/2011 17:20:00', '18/07/2011 17:21:00', '18/07/2011 17:22:00', '18/07/2011 17:23:00', '18/07/2011 17:24:00', '18/07/2011 17:25:00', '18/07/2011 17:26:00', '18/07/2011 17:27:00', '18/07/2011 17:28:00', '18/07/2011 17:29:00', '18/07/2011 17:30:00', '18/07/2011 17:31:00', '18/07/2011 17:32:00', '18/07/2011 17:33:00', '18/07/2011 17:34:00', '18/07/2011 17:35:00', '18/07/2011 17:36:00', '18/07/2011 17:37:00', '18/07/2011 17:38:00', '18/07/2011 17:39:00', '18/07/2011 17:40:00', '18/07/2011 17:41:00', '18/07/2011 17:42:00', '18/07/2011 17:43:00', '18/07/2011 17:44:00', '18/07/2011 17:45:00', '18/07/2011 17:46:00', '18/07/2011 17:47:00', '18/07/2011 17:48:00', '18/07/2011 17:49:00', '18/07/2011 17:50:00', '18/07/2011 17:51:00', '18/07/2011 17:52:00', '18/07/2011 17:53:00', '18/07/2011 17:54:00', '18/07/2011 17:55:00', '18/07/2011 17:56:00', '18/07/2011 17:57:00', '18/07/2011 17:58:00', '18/07/2011 17:59:00', '18/07/2011 18:00:00', '18/07/2011 18:01:00', '18/07/2011 18:02:00', '18/07/2011 18:03:00', '18/07/2011 18:04:00', '18/07/2011 18:05:00', '18/07/2011 18:06:00', '18/07/2011 18:07:00', '18/07/2011 
18:08:00', '18/07/2011 18:09:00', '18/07/2011 18:10:00', '18/07/2011 18:11:00', '18/07/2011 18:12:00', '18/07/2011 18:13:00', '18/07/2011 18:14:00', '18/07/2011 18:15:00', '18/07/2011 18:16:00', '18/07/2011 18:17:00', '18/07/2011 18:18:00', '18/07/2011 18:19:00', '18/07/2011 18:20:00', '18/07/2011 18:21:00', '18/07/2011 18:22:00', '18/07/2011 18:23:00', '18/07/2011 18:24:00', '18/07/2011 18:25:00', '18/07/2011 18:26:00', '18/07/2011 18:27:00', '18/07/2011 18:28:00', '18/07/2011 18:29:00', '18/07/2011 18:30:00', '18/07/2011 18:31:00', '18/07/2011 18:32:00', '18/07/2011 18:33:00', '18/07/2011 18:34:00', '18/07/2011 18:35:00', '18/07/2011 18:36:00', '18/07/2011 18:37:00', '18/07/2011 18:38:00', '18/07/2011 18:39:00', '18/07/2011 18:40:00', '18/07/2011 18:41:00', '18/07/2011 18:42:00', '18/07/2011 18:43:00', '18/07/2011 18:44:00', '18/07/2011 18:45:00', '18/07/2011 18:46:00', '18/07/2011 18:47:00', '18/07/2011 18:48:00', '18/07/2011 18:49:00', '18/07/2011 18:50:00', '18/07/2011 18:51:00', '18/07/2011 18:52:00', '18/07/2011 18:53:00', '18/07/2011 18:54:00', '18/07/2011 18:55:00', '18/07/2011 18:56:00', '18/07/2011 18:57:00', '18/07/2011 18:58:00', '18/07/2011 18:59:00', '18/07/2011 19:00:00', '18/07/2011 19:01:00', '18/07/2011 19:02:00', '18/07/2011 19:03:00', '18/07/2011 19:04:00', '18/07/2011 19:05:00', '18/07/2011 19:06:00', '18/07/2011 19:07:00', '18/07/2011 19:08:00', '18/07/2011 19:09:00', '18/07/2011 19:10:00', '18/07/2011 19:11:00', '18/07/2011 19:12:00', '18/07/2011 19:13:00', '18/07/2011 19:14:00', '18/07/2011 19:15:00', '18/07/2011 19:16:00', '18/07/2011 19:17:00', '18/07/2011 19:18:00', '18/07/2011 19:19:00', '18/07/2011 19:20:00', '18/07/2011 19:21:00', '18/07/2011 19:22:00', '18/07/2011 19:23:00', '18/07/2011 19:24:00', '18/07/2011 19:25:00', '18/07/2011 19:26:00', '18/07/2011 19:27:00', '18/07/2011 19:28:00', '18/07/2011 19:29:00', '18/07/2011 19:30:00', '18/07/2011 19:31:00', '18/07/2011 19:32:00', '18/07/2011 19:33:00', '18/07/2011 19:34:00', 
'18/07/2011 19:35:00', '18/07/2011 19:36:00', '18/07/2011 19:37:00', '18/07/2011 19:38:00', '18/07/2011 19:39:00', '18/07/2011 19:40:00', '18/07/2011 19:41:00', '18/07/2011 19:42:00', '18/07/2011 19:43:00', '18/07/2011 19:44:00', '18/07/2011 19:45:00', '18/07/2011 19:46:00', '18/07/2011 19:47:00', '18/07/2011 19:48:00', '18/07/2011 19:49:00', '18/07/2011 19:50:00', '18/07/2011 19:51:00', '18/07/2011 19:52:00', '18/07/2011 19:53:00', '18/07/2011 19:54:00', '18/07/2011 19:55:00', '18/07/2011 19:56:00', '18/07/2011 19:57:00', '18/07/2011 19:58:00', '18/07/2011 19:59:00', '18/07/2011 20:00:00', '18/07/2011 20:01:00', '18/07/2011 20:02:00', '18/07/2011 20:03:00', '18/07/2011 20:04:00', '18/07/2011 20:05:00', '18/07/2011 20:06:00', '18/07/2011 20:07:00', '18/07/2011 20:08:00', '18/07/2011 20:09:00', '18/07/2011 20:10:00', '18/07/2011 20:11:00', '18/07/2011 20:12:00', '18/07/2011 20:13:00', '18/07/2011 20:14:00', '18/07/2011 20:15:00', '18/07/2011 20:16:00', '18/07/2011 20:17:00', '18/07/2011 20:18:00', '18/07/2011 20:19:00', '18/07/2011 20:20:00', '18/07/2011 20:21:00', '18/07/2011 20:22:00', '18/07/2011 20:23:00', '18/07/2011 20:24:00', '18/07/2011 20:25:00', '18/07/2011 20:26:00', '18/07/2011 20:27:00', '18/07/2011 20:28:00', '18/07/2011 20:29:00', '18/07/2011 20:30:00', '18/07/2011 20:31:00', '18/07/2011 20:32:00', '18/07/2011 20:33:00', '18/07/2011 20:34:00', '18/07/2011 20:35:00', '18/07/2011 20:36:00', '18/07/2011 20:37:00', '18/07/2011 20:38:00', '18/07/2011 20:39:00', '18/07/2011 20:40:00', '18/07/2011 20:41:00', '18/07/2011 20:42:00', '18/07/2011 20:43:00', '18/07/2011 20:44:00', '18/07/2011 20:45:00', '18/07/2011 20:46:00', '18/07/2011 20:47:00', '18/07/2011 20:48:00', '18/07/2011 20:49:00', '18/07/2011 20:50:00', '18/07/2011 20:51:00', '18/07/2011 20:52:00', '18/07/2011 20:53:00', '18/07/2011 20:54:00', '18/07/2011 20:55:00', '18/07/2011 20:56:00', '18/07/2011 20:57:00', '18/07/2011 20:58:00', '18/07/2011 20:59:00', '18/07/2011 21:00:00', '18/07/2011 
21:01:00', '18/07/2011 21:02:00', '18/07/2011 21:03:00', '18/07/2011 21:04:00', '18/07/2011 21:05:00', '18/07/2011 21:06:00', '18/07/2011 21:07:00', '18/07/2011 21:08:00', '18/07/2011 21:09:00', '18/07/2011 21:10:00', '18/07/2011 21:11:00', '18/07/2011 21:12:00', '18/07/2011 21:13:00', '18/07/2011 21:14:00', '18/07/2011 21:15:00', '18/07/2011 21:16:00', '18/07/2011 21:17:00', '18/07/2011 21:18:00', '18/07/2011 21:19:00', '18/07/2011 21:20:00', '18/07/2011 21:21:00', '18/07/2011 21:22:00', '18/07/2011 21:23:00', '18/07/2011 21:24:00', '18/07/2011 21:25:00', '18/07/2011 21:26:00', '18/07/2011 21:27:00', '18/07/2011 21:28:00', '18/07/2011 21:29:00', '18/07/2011 21:30:00', '18/07/2011 21:31:00', '18/07/2011 21:32:00', '18/07/2011 21:33:00', '18/07/2011 21:34:00', '18/07/2011 21:35:00', '18/07/2011 21:36:00', '18/07/2011 21:37:00', '18/07/2011 21:38:00', '18/07/2011 21:39:00', '18/07/2011 21:40:00', '18/07/2011 21:41:00', '18/07/2011 21:42:00', '18/07/2011 21:43:00', '18/07/2011 21:44:00', '18/07/2011 21:45:00', '18/07/2011 21:46:00', '18/07/2011 21:47:00', '18/07/2011 21:48:00', '18/07/2011 21:49:00', '18/07/2011 21:50:00', '18/07/2011 21:51:00', '18/07/2011 21:52:00', '18/07/2011 21:53:00', '18/07/2011 21:54:00', '18/07/2011 21:55:00', '18/07/2011 21:56:00', '18/07/2011 21:57:00', '18/07/2011 21:58:00', '18/07/2011 21:59:00', '18/07/2011 22:00:00', '18/07/2011 22:01:00', '18/07/2011 22:02:00', '18/07/2011 22:03:00', '18/07/2011 22:04:00', '18/07/2011 22:05:00', '18/07/2011 22:06:00', '18/07/2011 22:07:00', '18/07/2011 22:08:00', '18/07/2011 22:09:00', '18/07/2011 22:10:00', '18/07/2011 22:11:00', '18/07/2011 22:12:00', '18/07/2011 22:13:00', '18/07/2011 22:14:00', '18/07/2011 22:15:00', '18/07/2011 22:16:00', '18/07/2011 22:17:00', '18/07/2011 22:18:00', '18/07/2011 22:19:00', '18/07/2011 22:20:00', '18/07/2011 22:21:00', '18/07/2011 22:22:00', '18/07/2011 22:23:00', '18/07/2011 22:24:00', '18/07/2011 22:25:00', '18/07/2011 22:26:00', '18/07/2011 22:27:00', 
'18/07/2011 22:28:00', '18/07/2011 22:29:00', '18/07/2011 22:30:00', '18/07/2011 22:31:00', '18/07/2011 22:32:00', '18/07/2011 22:33:00', '18/07/2011 22:34:00', '18/07/2011 22:35:00', '18/07/2011 22:36:00', '18/07/2011 22:37:00', '18/07/2011 22:38:00', '18/07/2011 22:39:00', '18/07/2011 22:40:00', '18/07/2011 22:41:00', '18/07/2011 22:42:00', '18/07/2011 22:43:00', '18/07/2011 22:44:00', '18/07/2011 22:45:00', '18/07/2011 22:46:00', '18/07/2011 22:47:00', '18/07/2011 22:48:00', '18/07/2011 22:49:00', '18/07/2011 22:50:00', '18/07/2011 22:51:00', '18/07/2011 22:52:00', '18/07/2011 22:53:00', '18/07/2011 22:54:00', '18/07/2011 22:55:00', '18/07/2011 22:56:00', '18/07/2011 22:57:00', '18/07/2011 22:58:00', '18/07/2011 22:59:00', '18/07/2011 23:00:00', '18/07/2011 23:01:00', '18/07/2011 23:02:00', '18/07/2011 23:03:00', '18/07/2011 23:04:00', '18/07/2011 23:05:00', '18/07/2011 23:06:00', '18/07/2011 23:07:00', '18/07/2011 23:08:00', '18/07/2011 23:09:00', '18/07/2011 23:10:00', '18/07/2011 23:11:00', '18/07/2011 23:12:00', '18/07/2011 23:13:00', '18/07/2011 23:14:00', '18/07/2011 23:15:00', '18/07/2011 23:16:00', '18/07/2011 23:17:00', '18/07/2011 23:18:00', '18/07/2011 23:19:00', '18/07/2011 23:20:00', '18/07/2011 23:21:00', '18/07/2011 23:22:00', '18/07/2011 23:23:00', '18/07/2011 23:24:00', '18/07/2011 23:25:00', '18/07/2011 23:26:00', '18/07/2011 23:27:00', '18/07/2011 23:28:00', '18/07/2011 23:29:00', '18/07/2011 23:30:00', '18/07/2011 23:31:00', '18/07/2011 23:32:00', '18/07/2011 23:33:00', '18/07/2011 23:34:00', '18/07/2011 23:35:00', '18/07/2011 23:36:00', '18/07/2011 23:37:00', '18/07/2011 23:38:00', '18/07/2011 23:39:00', '18/07/2011 23:40:00', '18/07/2011 23:41:00', '18/07/2011 23:42:00', '18/07/2011 23:43:00', '18/07/2011 23:44:00', '18/07/2011 23:45:00', '18/07/2011 23:46:00', '18/07/2011 23:47:00', '18/07/2011 23:48:00', '18/07/2011 23:49:00', '18/07/2011 23:50:00', '18/07/2011 23:51:00', '18/07/2011 23:52:00', '18/07/2011 23:53:00', '18/07/2011 
23:54:00', '18/07/2011 23:55:00', '18/07/2011 23:56:00', '18/07/2011 23:57:00', '18/07/2011 23:58:00', '18/07/2011 23:59:00', '19/07/2011 0:00:00', '19/07/2011 0:01:00', '19/07/2011 0:02:00', '19/07/2011 0:03:00', '19/07/2011 0:04:00', '19/07/2011 0:05:00', '19/07/2011 0:06:00', '19/07/2011 0:07:00', '19/07/2011 0:08:00', '19/07/2011 0:09:00', '19/07/2011 0:10:00', '19/07/2011 0:11:00', '19/07/2011 0:12:00', '19/07/2011 0:13:00', '19/07/2011 0:14:00', '19/07/2011 0:15:00', '19/07/2011 0:16:00', '19/07/2011 0:17:00', '19/07/2011 0:18:00', '19/07/2011 0:19:00', '19/07/2011 0:20:00', '19/07/2011 0:21:00', '19/07/2011 0:22:00', '19/07/2011 0:23:00', '19/07/2011 0:24:00', '19/07/2011 0:25:00', '19/07/2011 0:26:00', '19/07/2011 0:27:00', '19/07/2011 0:28:00', '19/07/2011 0:29:00', '19/07/2011 0:30:00', '19/07/2011 0:31:00', '19/07/2011 0:32:00', '19/07/2011 0:33:00', '19/07/2011 0:34:00', '19/07/2011 0:35:00', '19/07/2011 0:36:00', '19/07/2011 0:37:00', '19/07/2011 0:38:00', '19/07/2011 0:39:00', '19/07/2011 0:40:00', '19/07/2011 0:41:00', '19/07/2011 0:42:00', '19/07/2011 0:43:00', '19/07/2011 0:44:00', '19/07/2011 0:45:00', '19/07/2011 0:46:00', '19/07/2011 0:47:00', '19/07/2011 0:48:00', '19/07/2011 0:49:00', '19/07/2011 0:50:00', '19/07/2011 0:51:00', '19/07/2011 0:52:00', '19/07/2011 0:53:00', '19/07/2011 0:54:00', '19/07/2011 0:55:00', '19/07/2011 0:56:00', '19/07/2011 0:57:00', '19/07/2011 0:58:00', '19/07/2011 0:59:00', '19/07/2011 1:00:00', '19/07/2011 1:01:00', '19/07/2011 1:02:00', '19/07/2011 1:03:00', '19/07/2011 1:04:00', '19/07/2011 1:05:00', '19/07/2011 1:06:00', '19/07/2011 1:07:00', '19/07/2011 1:08:00', '19/07/2011 1:09:00', '19/07/2011 1:10:00', '19/07/2011 1:11:00', '19/07/2011 1:12:00', '19/07/2011 1:13:00', '19/07/2011 1:14:00', '19/07/2011 1:15:00', '19/07/2011 1:16:00', '19/07/2011 1:17:00', '19/07/2011 1:18:00', '19/07/2011 1:19:00', '19/07/2011 1:20:00', '19/07/2011 1:21:00', '19/07/2011 1:22:00', '19/07/2011 1:23:00', '19/07/2011 1:24:00', 
'19/07/2011 1:25:00', '19/07/2011 1:26:00', '19/07/2011 1:27:00', '19/07/2011 1:28:00', '19/07/2011 1:29:00', '19/07/2011 1:30:00', '19/07/2011 1:31:00', '19/07/2011 1:32:00', '19/07/2011 1:33:00', '19/07/2011 1:34:00', '19/07/2011 1:35:00', '19/07/2011 1:36:00', '19/07/2011 1:37:00', '19/07/2011 1:38:00', '19/07/2011 1:39:00', '19/07/2011 1:40:00', '19/07/2011 1:41:00', '19/07/2011 1:42:00', '19/07/2011 1:43:00', '19/07/2011 1:44:00', '19/07/2011 1:45:00', '19/07/2011 1:46:00', '19/07/2011 1:47:00', '19/07/2011 1:48:00', '19/07/2011 1:49:00', '19/07/2011 1:50:00', '19/07/2011 1:51:00', '19/07/2011 1:52:00', '19/07/2011 1:53:00', '19/07/2011 1:54:00', '19/07/2011 1:55:00', '19/07/2011 1:56:00', '19/07/2011 1:57:00', '19/07/2011 1:58:00', '19/07/2011 1:59:00', '19/07/2011 2:00:00', '19/07/2011 2:01:00', '19/07/2011 2:02:00', '19/07/2011 2:03:00', '19/07/2011 2:04:00', '19/07/2011 2:05:00', '19/07/2011 2:06:00', '19/07/2011 2:07:00', '19/07/2011 2:08:00', '19/07/2011 2:09:00', '19/07/2011 2:10:00', '19/07/2011 2:11:00', '19/07/2011 2:12:00', '19/07/2011 2:13:00', '19/07/2011 2:14:00', '19/07/2011 2:15:00', '19/07/2011 2:16:00', '19/07/2011 2:17:00', '19/07/2011 2:18:00', '19/07/2011 2:19:00', '19/07/2011 2:20:00', '19/07/2011 2:21:00', '19/07/2011 2:22:00', '19/07/2011 2:23:00', '19/07/2011 2:24:00', '19/07/2011 2:25:00', '19/07/2011 2:26:00', '19/07/2011 2:27:00', '19/07/2011 2:28:00', '19/07/2011 2:29:00', '19/07/2011 2:30:00', '19/07/2011 2:31:00', '19/07/2011 2:32:00', '19/07/2011 2:33:00', '19/07/2011 2:34:00', '19/07/2011 2:35:00', '19/07/2011 2:36:00', '19/07/2011 2:37:00', '19/07/2011 2:38:00', '19/07/2011 2:39:00', '19/07/2011 2:40:00', '19/07/2011 2:41:00', '19/07/2011 2:42:00', '19/07/2011 2:43:00', '19/07/2011 2:44:00', '19/07/2011 2:45:00', '19/07/2011 2:46:00', '19/07/2011 2:47:00', '19/07/2011 2:48:00', '19/07/2011 2:49:00', '19/07/2011 2:50:00', '19/07/2011 2:51:00', '19/07/2011 2:52:00', '19/07/2011 2:53:00', '19/07/2011 2:54:00', '19/07/2011 
2:55:00', '19/07/2011 2:56:00', '19/07/2011 2:57:00', '19/07/2011 2:58:00', '19/07/2011 2:59:00', '19/07/2011 3:00:00', '19/07/2011 3:01:00', '19/07/2011 3:02:00', '19/07/2011 3:03:00', '19/07/2011 3:04:00', '19/07/2011 3:05:00', '19/07/2011 3:06:00', '19/07/2011 3:07:00', '19/07/2011 3:08:00', '19/07/2011 3:09:00', '19/07/2011 3:10:00', '19/07/2011 3:11:00', '19/07/2011 3:12:00', '19/07/2011 3:13:00', '19/07/2011 3:14:00', '19/07/2011 3:15:00', '19/07/2011 3:16:00', '19/07/2011 3:17:00', '19/07/2011 3:18:00', '19/07/2011 3:19:00', '19/07/2011 3:20:00', '19/07/2011 3:21:00', '19/07/2011 3:22:00', '19/07/2011 3:23:00', '19/07/2011 3:24:00', '19/07/2011 3:25:00', '19/07/2011 3:26:00', '19/07/2011 3:27:00', '19/07/2011 3:28:00', '19/07/2011 3:29:00', '19/07/2011 3:30:00', '19/07/2011 3:31:00', '19/07/2011 3:32:00', '19/07/2011 3:33:00', '19/07/2011 3:34:00', '19/07/2011 3:35:00', '19/07/2011 3:36:00', '19/07/2011 3:37:00', '19/07/2011 3:38:00', '19/07/2011 3:39:00', '19/07/2011 3:40:00', '19/07/2011 3:41:00', '19/07/2011 3:42:00', '19/07/2011 3:43:00', '19/07/2011 3:44:00', '19/07/2011 3:45:00', '19/07/2011 3:46:00', '19/07/2011 3:47:00', '19/07/2011 3:48:00', '19/07/2011 3:49:00', '19/07/2011 3:50:00', '19/07/2011 3:51:00', '19/07/2011 3:52:00', '19/07/2011 3:53:00', '19/07/2011 3:54:00', '19/07/2011 3:55:00', '19/07/2011 3:56:00', '19/07/2011 3:57:00', '19/07/2011 3:58:00', '19/07/2011 3:59:00', '19/07/2011 4:00:00', '19/07/2011 4:01:00', '19/07/2011 4:02:00', '19/07/2011 4:03:00', '19/07/2011 4:04:00', '19/07/2011 4:05:00', '19/07/2011 4:06:00', '19/07/2011 4:07:00', '19/07/2011 4:08:00', '19/07/2011 4:09:00', '19/07/2011 4:10:00', '19/07/2011 4:11:00', '19/07/2011 4:12:00', '19/07/2011 4:13:00', '19/07/2011 4:14:00', '19/07/2011 4:15:00', '19/07/2011 4:16:00', '19/07/2011 4:17:00', '19/07/2011 4:18:00', '19/07/2011 4:19:00', '19/07/2011 4:20:00', '19/07/2011 4:21:00', '19/07/2011 4:22:00', '19/07/2011 4:23:00', '19/07/2011 4:24:00', '19/07/2011 4:25:00', 
'19/07/2011 4:26:00', '19/07/2011 4:27:00', '19/07/2011 4:28:00', '19/07/2011 4:29:00', '19/07/2011 4:30:00', '19/07/2011 4:31:00', '19/07/2011 4:32:00', '19/07/2011 4:33:00', '19/07/2011 4:34:00', '19/07/2011 4:35:00', '19/07/2011 4:36:00', '19/07/2011 4:37:00', '19/07/2011 4:38:00', '19/07/2011 4:39:00', '19/07/2011 4:40:00', '19/07/2011 4:41:00', '19/07/2011 4:42:00', '19/07/2011 4:43:00', '19/07/2011 4:44:00', '19/07/2011 4:45:00', '19/07/2011 4:46:00', '19/07/2011 4:47:00', '19/07/2011 4:48:00', '19/07/2011 4:49:00', '19/07/2011 4:50:00', '19/07/2011 4:51:00', '19/07/2011 4:52:00', '19/07/2011 4:53:00', '19/07/2011 4:54:00', '19/07/2011 4:55:00', '19/07/2011 4:56:00', '19/07/2011 4:57:00', '19/07/2011 4:58:00', '19/07/2011 4:59:00', '19/07/2011 5:00:00', '19/07/2011 5:01:00', '19/07/2011 5:02:00', '19/07/2011 5:03:00', '19/07/2011 5:04:00', '19/07/2011 5:05:00', '19/07/2011 5:06:00', '19/07/2011 5:07:00', '19/07/2011 5:08:00', '19/07/2011 5:09:00', '19/07/2011 5:10:00', '19/07/2011 5:11:00', '19/07/2011 5:12:00', '19/07/2011 5:13:00', '19/07/2011 5:14:00', '19/07/2011 5:15:00', '19/07/2011 5:16:00', '19/07/2011 5:17:00', '19/07/2011 5:18:00', '19/07/2011 5:19:00', '19/07/2011 5:20:00', '19/07/2011 5:21:00', '19/07/2011 5:22:00', '19/07/2011 5:23:00', '19/07/2011 5:24:00', '19/07/2011 5:25:00', '19/07/2011 5:26:00', '19/07/2011 5:27:00', '19/07/2011 5:28:00', '19/07/2011 5:29:00', '19/07/2011 5:30:00', '19/07/2011 5:31:00', '19/07/2011 5:32:00', '19/07/2011 5:33:00', '19/07/2011 5:34:00', '19/07/2011 5:35:00', '19/07/2011 5:36:00', '19/07/2011 5:37:00', '19/07/2011 5:38:00', '19/07/2011 5:39:00', '19/07/2011 5:40:00', '19/07/2011 5:41:00', '19/07/2011 5:42:00', '19/07/2011 5:43:00', '19/07/2011 5:44:00', '19/07/2011 5:45:00', '19/07/2011 5:46:00', '19/07/2011 5:47:00', '19/07/2011 5:48:00', '19/07/2011 5:49:00', '19/07/2011 5:50:00', '19/07/2011 5:51:00', '19/07/2011 5:52:00', '19/07/2011 5:53:00', '19/07/2011 5:54:00', '19/07/2011 5:55:00', '19/07/2011 
5:56:00', '19/07/2011 5:57:00', '19/07/2011 5:58:00', '19/07/2011 5:59:00', '19/07/2011 6:00:00', '19/07/2011 6:01:00', '19/07/2011 6:02:00', '19/07/2011 6:03:00', '19/07/2011 6:04:00', '19/07/2011 6:05:00', '19/07/2011 6:06:00', '19/07/2011 6:07:00', '19/07/2011 6:08:00', '19/07/2011 6:09:00', '19/07/2011 6:10:00', '19/07/2011 6:11:00', '19/07/2011 6:12:00', '19/07/2011 6:13:00', '19/07/2011 6:14:00', '19/07/2011 6:15:00', '19/07/2011 6:16:00', '19/07/2011 6:17:00', '19/07/2011 6:18:00', '19/07/2011 6:19:00', '19/07/2011 6:20:00', '19/07/2011 6:21:00', '19/07/2011 6:22:00', '19/07/2011 6:23:00', '19/07/2011 6:24:00', '19/07/2011 6:25:00', '19/07/2011 6:26:00', '19/07/2011 6:27:00', '19/07/2011 6:28:00', '19/07/2011 6:29:00', '19/07/2011 6:30:00', '19/07/2011 6:31:00', '19/07/2011 6:32:00', '19/07/2011 6:33:00', '19/07/2011 6:34:00', '19/07/2011 6:35:00', '19/07/2011 6:36:00', '19/07/2011 6:37:00', '19/07/2011 6:38:00', '19/07/2011 6:39:00', '19/07/2011 6:40:00', '19/07/2011 6:41:00', '19/07/2011 6:42:00', '19/07/2011 6:43:00', '19/07/2011 6:44:00', '19/07/2011 6:45:00', '19/07/2011 6:46:00', '19/07/2011 6:47:00', '19/07/2011 6:48:00', '19/07/2011 6:49:00', '19/07/2011 6:50:00', '19/07/2011 6:51:00', '19/07/2011 6:52:00', '19/07/2011 6:53:00', '19/07/2011 6:54:00', '19/07/2011 6:55:00', '19/07/2011 6:56:00', '19/07/2011 6:57:00', '19/07/2011 6:58:00', '19/07/2011 6:59:00', '19/07/2011 7:00:00', '19/07/2011 7:01:00', '19/07/2011 7:02:00', '19/07/2011 7:03:00', '19/07/2011 7:04:00', '19/07/2011 7:05:00', '19/07/2011 7:06:00', '19/07/2011 7:07:00', '19/07/2011 7:08:00', '19/07/2011 7:09:00', '19/07/2011 7:10:00', '19/07/2011 7:11:00', '19/07/2011 7:12:00', '19/07/2011 7:13:00', '19/07/2011 7:14:00', '19/07/2011 7:15:00', '19/07/2011 7:16:00', '19/07/2011 7:17:00', '19/07/2011 7:18:00', '19/07/2011 7:19:00', '19/07/2011 7:20:00', '19/07/2011 7:21:00', '19/07/2011 7:22:00', '19/07/2011 7:23:00', '19/07/2011 7:24:00', '19/07/2011 7:25:00', '19/07/2011 7:26:00', 
'19/07/2011 7:27:00', '19/07/2011 7:28:00', '19/07/2011 7:29:00', '19/07/2011 7:30:00', '19/07/2011 7:31:00', '19/07/2011 7:32:00', '19/07/2011 7:33:00', '19/07/2011 7:34:00', '19/07/2011 7:35:00', '19/07/2011 7:36:00', '19/07/2011 7:37:00', '19/07/2011 7:38:00', '19/07/2011 7:39:00', '19/07/2011 7:40:00', '19/07/2011 7:41:00', '19/07/2011 7:42:00', '19/07/2011 7:43:00', '19/07/2011 7:44:00', '19/07/2011 7:45:00', '19/07/2011 7:46:00', '19/07/2011 7:47:00', '19/07/2011 7:48:00', '19/07/2011 7:49:00', '19/07/2011 7:50:00', '19/07/2011 7:51:00', '19/07/2011 7:52:00', '19/07/2011 7:53:00', '19/07/2011 7:54:00', '19/07/2011 7:55:00', '19/07/2011 7:56:00', '19/07/2011 7:57:00', '19/07/2011 7:58:00', '19/07/2011 7:59:00', '19/07/2011 8:00:00', '19/07/2011 8:01:00', '19/07/2011 8:02:00', '19/07/2011 8:03:00', '19/07/2011 8:04:00', '19/07/2011 8:05:00', '19/07/2011 8:06:00', '19/07/2011 8:07:00', '19/07/2011 8:08:00', '19/07/2011 8:09:00', '19/07/2011 8:10:00', '19/07/2011 8:11:00', '19/07/2011 8:12:00', '19/07/2011 8:13:00', '19/07/2011 8:14:00', '19/07/2011 8:15:00', '19/07/2011 8:16:00', '19/07/2011 8:17:00', '19/07/2011 8:18:00', '19/07/2011 8:19:00', '19/07/2011 8:20:00', '19/07/2011 8:21:00', '19/07/2011 8:22:00', '19/07/2011 8:23:00', '19/07/2011 8:24:00', '19/07/2011 8:25:00', '19/07/2011 8:26:00', '19/07/2011 8:27:00', '19/07/2011 8:28:00', '19/07/2011 8:29:00', '19/07/2011 8:30:00', '19/07/2011 8:31:00', '19/07/2011 8:32:00', '19/07/2011 8:33:00', '19/07/2011 8:34:00', '19/07/2011 8:35:00', '19/07/2011 8:36:00', '19/07/2011 8:37:00', '19/07/2011 8:38:00', '19/07/2011 8:39:00', '19/07/2011 8:40:00', '19/07/2011 8:41:00', '19/07/2011 8:42:00', '19/07/2011 8:43:00', '19/07/2011 8:44:00', '19/07/2011 8:45:00', '19/07/2011 8:46:00', '19/07/2011 8:47:00', '19/07/2011 8:48:00', '19/07/2011 8:49:00', '19/07/2011 8:50:00', '19/07/2011 8:51:00', '19/07/2011 8:52:00', '19/07/2011 8:53:00', '19/07/2011 8:54:00', '19/07/2011 8:55:00', '19/07/2011 8:56:00', '19/07/2011 
8:57:00', '19/07/2011 8:58:00', '19/07/2011 8:59:00', '19/07/2011 9:00:00', '19/07/2011 9:01:00', '19/07/2011 9:02:00', '19/07/2011 9:03:00', '19/07/2011 9:04:00', '19/07/2011 9:05:00', '19/07/2011 9:06:00', '19/07/2011 9:07:00', '19/07/2011 9:08:00', '19/07/2011 9:09:00', '19/07/2011 9:10:00', '19/07/2011 9:11:00', '19/07/2011 9:12:00', '19/07/2011 9:13:00', '19/07/2011 9:14:00', '19/07/2011 9:15:00', '19/07/2011 9:16:00', '19/07/2011 9:17:00', '19/07/2011 9:18:00', '19/07/2011 9:19:00', '19/07/2011 9:20:00', '19/07/2011 9:21:00', '19/07/2011 9:22:00', '19/07/2011 9:23:00', '19/07/2011 9:24:00', '19/07/2011 9:25:00', '19/07/2011 9:26:00', '19/07/2011 9:27:00', '19/07/2011 9:28:00', '19/07/2011 9:29:00', '19/07/2011 9:30:00', '19/07/2011 9:31:00', '19/07/2011 9:32:00', '19/07/2011 9:33:00', '19/07/2011 9:34:00', '19/07/2011 9:35:00', '19/07/2011 9:36:00', '19/07/2011 9:37:00', '19/07/2011 9:38:00', '19/07/2011 9:39:00', '19/07/2011 9:40:00', '19/07/2011 9:41:00', '19/07/2011 9:42:00', '19/07/2011 9:43:00', '19/07/2011 9:44:00', '19/07/2011 9:45:00', '19/07/2011 9:46:00', '19/07/2011 9:47:00', '19/07/2011 9:48:00', '19/07/2011 9:49:00', '19/07/2011 9:50:00', '19/07/2011 9:51:00', '19/07/2011 9:52:00', '19/07/2011 9:53:00', '19/07/2011 9:54:00', '19/07/2011 9:55:00', '19/07/2011 9:56:00', '19/07/2011 9:57:00', '19/07/2011 9:58:00', '19/07/2011 9:59:00', '19/07/2011 10:00:00', '19/07/2011 10:01:00', '19/07/2011 10:02:00', '19/07/2011 10:03:00', '19/07/2011 10:04:00', '19/07/2011 10:05:00', '19/07/2011 10:06:00', '19/07/2011 10:07:00', '19/07/2011 10:08:00', '19/07/2011 10:09:00', '19/07/2011 10:10:00', '19/07/2011 10:11:00', '19/07/2011 10:12:00', '19/07/2011 10:13:00', '19/07/2011 10:14:00', '19/07/2011 10:15:00', '19/07/2011 10:16:00', '19/07/2011 10:17:00', '19/07/2011 10:18:00', '19/07/2011 10:19:00', '19/07/2011 10:20:00', '19/07/2011 10:21:00', '19/07/2011 10:22:00', '19/07/2011 10:23:00', '19/07/2011 10:24:00', '19/07/2011 10:25:00', '19/07/2011 10:26:00', 
'19/07/2011 10:27:00', '19/07/2011 10:28:00', '19/07/2011 10:29:00', '19/07/2011 10:30:00', '19/07/2011 10:31:00', '19/07/2011 10:32:00', '19/07/2011 10:33:00', '19/07/2011 10:34:00', '19/07/2011 10:35:00', '19/07/2011 10:36:00', '19/07/2011 10:37:00', '19/07/2011 10:38:00', '19/07/2011 10:39:00', '19/07/2011 10:40:00', '19/07/2011 10:41:00', '19/07/2011 10:42:00', '19/07/2011 10:43:00', '19/07/2011 10:44:00', '19/07/2011 10:45:00', '19/07/2011 10:46:00', '19/07/2011 10:47:00', '19/07/2011 10:48:00', '19/07/2011 10:49:00', '19/07/2011 10:50:00', '19/07/2011 10:51:00', '19/07/2011 10:52:00', '19/07/2011 10:53:00', '19/07/2011 10:54:00', '19/07/2011 10:55:00', '19/07/2011 10:56:00', '19/07/2011 10:57:00', '19/07/2011 10:58:00', '19/07/2011 10:59:00', '19/07/2011 11:00:00', '19/07/2011 11:01:00', '19/07/2011 11:02:00', '19/07/2011 11:03:00', '19/07/2011 11:04:00', '19/07/2011 11:05:00', '19/07/2011 11:06:00', '19/07/2011 11:07:00', '19/07/2011 11:08:00', '19/07/2011 11:09:00', '19/07/2011 11:10:00', '19/07/2011 11:11:00', '19/07/2011 11:12:00', '19/07/2011 11:13:00', '19/07/2011 11:14:00', '19/07/2011 11:15:00', '19/07/2011 11:16:00', '19/07/2011 11:17:00', '19/07/2011 11:18:00', '19/07/2011 11:19:00', '19/07/2011 11:20:00', '19/07/2011 11:21:00', '19/07/2011 11:22:00', '19/07/2011 11:23:00', '19/07/2011 11:24:00', '19/07/2011 11:25:00', '19/07/2011 11:26:00', '19/07/2011 11:27:00', '19/07/2011 11:28:00', '19/07/2011 11:29:00', '19/07/2011 11:30:00', '19/07/2011 11:31:00', '19/07/2011 11:32:00', '19/07/2011 11:33:00', '19/07/2011 11:34:00', '19/07/2011 11:35:00', '19/07/2011 11:36:00', '19/07/2011 11:37:00', '19/07/2011 11:38:00', '19/07/2011 11:39:00', '19/07/2011 11:40:00', '19/07/2011 11:41:00', '19/07/2011 11:42:00', '19/07/2011 11:43:00', '19/07/2011 11:44:00', '19/07/2011 11:45:00', '19/07/2011 11:46:00', '19/07/2011 11:47:00', '19/07/2011 11:48:00', '19/07/2011 11:49:00', '19/07/2011 11:50:00', '19/07/2011 11:51:00', '19/07/2011 11:52:00', '19/07/2011 
11:53:00', '19/07/2011 11:54:00', '19/07/2011 11:55:00', '19/07/2011 11:56:00', '19/07/2011 11:57:00', '19/07/2011 11:58:00', '19/07/2011 11:59:00', '19/07/2011 12:00:00', '19/07/2011 12:01:00', '19/07/2011 12:02:00', '19/07/2011 12:03:00', '19/07/2011 12:04:00', '19/07/2011 12:05:00', '19/07/2011 12:06:00', '19/07/2011 12:07:00', '19/07/2011 12:08:00', '19/07/2011 12:09:00', '19/07/2011 12:10:00', '19/07/2011 12:11:00', '19/07/2011 12:12:00', '19/07/2011 12:13:00', '19/07/2011 12:14:00', '19/07/2011 12:15:00', '19/07/2011 12:16:00', '19/07/2011 12:17:00', '19/07/2011 12:18:00', '19/07/2011 12:19:00', '19/07/2011 12:20:00', '19/07/2011 12:21:00', '19/07/2011 12:22:00', '19/07/2011 12:23:00', '19/07/2011 12:24:00', '19/07/2011 12:25:00', '19/07/2011 12:26:00', '19/07/2011 12:27:00', '19/07/2011 12:28:00', '19/07/2011 12:29:00', '19/07/2011 12:30:00', '19/07/2011 12:31:00', '19/07/2011 12:32:00', '19/07/2011 12:33:00', '19/07/2011 12:34:00', '19/07/2011 12:35:00', '19/07/2011 12:36:00', '19/07/2011 12:37:00', '19/07/2011 12:38:00', '19/07/2011 12:39:00', '19/07/2011 12:40:00', '19/07/2011 12:41:00', '19/07/2011 12:42:00', '19/07/2011 12:43:00', '19/07/2011 12:44:00', '19/07/2011 12:45:00', '19/07/2011 12:46:00', '19/07/2011 12:47:00', '19/07/2011 12:48:00', '19/07/2011 12:49:00', '19/07/2011 12:50:00', '19/07/2011 12:51:00', '19/07/2011 12:52:00', '19/07/2011 12:53:00', '19/07/2011 12:54:00', '19/07/2011 12:55:00', '19/07/2011 12:56:00', '19/07/2011 12:57:00', '19/07/2011 12:58:00', '19/07/2011 12:59:00', '19/07/2011 13:00:00', '19/07/2011 13:01:00', '19/07/2011 13:02:00', '19/07/2011 13:03:00', '19/07/2011 13:04:00', '19/07/2011 13:05:00', '19/07/2011 13:06:00', '19/07/2011 13:07:00', '19/07/2011 13:08:00', '19/07/2011 13:09:00', '19/07/2011 13:10:00', '19/07/2011 13:11:00', '19/07/2011 13:12:00', '19/07/2011 13:13:00', '19/07/2011 13:14:00', '19/07/2011 13:15:00', '19/07/2011 13:16:00', '19/07/2011 13:17:00', '19/07/2011 13:18:00', '19/07/2011 13:19:00', 
'19/07/2011 13:20:00', '19/07/2011 13:21:00', '19/07/2011 13:22:00', '19/07/2011 13:23:00', '19/07/2011 13:24:00', '19/07/2011 13:25:00', '19/07/2011 13:26:00', '19/07/2011 13:27:00', '19/07/2011 13:28:00', '19/07/2011 13:29:00', '19/07/2011 13:30:00', '19/07/2011 13:31:00', '19/07/2011 13:32:00', '19/07/2011 13:33:00', '19/07/2011 13:34:00', '19/07/2011 13:35:00', '19/07/2011 13:36:00', '19/07/2011 13:37:00', '19/07/2011 13:38:00', '19/07/2011 13:39:00', '19/07/2011 13:40:00', '19/07/2011 13:41:00', '19/07/2011 13:42:00', '19/07/2011 13:43:00', '19/07/2011 13:44:00', '19/07/2011 13:45:00', '19/07/2011 13:46:00', '19/07/2011 13:47:00', '19/07/2011 13:48:00', '19/07/2011 13:49:00', '19/07/2011 13:50:00', '19/07/2011 13:51:00', '19/07/2011 13:52:00', '19/07/2011 13:53:00', '19/07/2011 13:54:00', '19/07/2011 13:55:00', '19/07/2011 13:56:00', '19/07/2011 13:57:00', '19/07/2011 13:58:00', '19/07/2011 13:59:00', '19/07/2011 14:00:00', '19/07/2011 14:01:00', '19/07/2011 14:02:00', '19/07/2011 14:03:00', '19/07/2011 14:04:00', '19/07/2011 14:05:00', '19/07/2011 14:06:00', '19/07/2011 14:07:00', '19/07/2011 14:08:00', '19/07/2011 14:09:00', '19/07/2011 14:10:00', '19/07/2011 14:11:00', '19/07/2011 14:12:00', '19/07/2011 14:13:00', '19/07/2011 14:14:00', '19/07/2011 14:15:00', '19/07/2011 14:16:00', '19/07/2011 14:17:00', '19/07/2011 14:18:00', '19/07/2011 14:19:00', '19/07/2011 14:20:00', '19/07/2011 14:21:00', '19/07/2011 14:22:00', '19/07/2011 14:23:00', '19/07/2011 14:24:00', '19/07/2011 14:25:00', '19/07/2011 14:26:00', '19/07/2011 14:27:00', '19/07/2011 14:28:00', '19/07/2011 14:29:00', '19/07/2011 14:30:00', '19/07/2011 14:31:00', '19/07/2011 14:32:00', '19/07/2011 14:33:00', '19/07/2011 14:34:00', '19/07/2011 14:35:00', '19/07/2011 14:36:00', '19/07/2011 14:37:00', '19/07/2011 14:38:00', '19/07/2011 14:39:00', '19/07/2011 14:40:00', '19/07/2011 14:41:00', '19/07/2011 14:42:00', '19/07/2011 14:43:00', '19/07/2011 14:44:00', '19/07/2011 14:45:00', '19/07/2011 
14:46:00', '19/07/2011 14:47:00', '19/07/2011 14:48:00', '19/07/2011 14:49:00', '19/07/2011 14:50:00', '19/07/2011 14:51:00', '19/07/2011 14:52:00', '19/07/2011 14:53:00', '19/07/2011 14:54:00', '19/07/2011 14:55:00', '19/07/2011 14:56:00', '19/07/2011 14:57:00', '19/07/2011 14:58:00', '19/07/2011 14:59:00', '19/07/2011 15:00:00', '19/07/2011 15:01:00', '19/07/2011 15:02:00', '19/07/2011 15:03:00', '19/07/2011 15:04:00', '19/07/2011 15:05:00', '19/07/2011 15:06:00', '19/07/2011 15:07:00', '19/07/2011 15:08:00', '19/07/2011 15:09:00', '19/07/2011 15:10:00', '19/07/2011 15:11:00', '19/07/2011 15:12:00', '19/07/2011 15:13:00', '19/07/2011 15:14:00', '19/07/2011 15:15:00', '19/07/2011 15:16:00', '19/07/2011 15:17:00', '19/07/2011 15:18:00', '19/07/2011 15:19:00', '19/07/2011 15:20:00', '19/07/2011 15:21:00', '19/07/2011 15:22:00', '19/07/2011 15:23:00', '19/07/2011 15:24:00', '19/07/2011 15:25:00', '19/07/2011 15:26:00', '19/07/2011 15:27:00', '19/07/2011 15:28:00', '19/07/2011 15:29:00', '19/07/2011 15:30:00', '19/07/2011 15:31:00', '19/07/2011 15:32:00', '19/07/2011 15:33:00', '19/07/2011 15:34:00', '19/07/2011 15:35:00', '19/07/2011 15:36:00', '19/07/2011 15:37:00', '19/07/2011 15:38:00', '19/07/2011 15:39:00', '19/07/2011 15:40:00', '19/07/2011 15:41:00', '19/07/2011 15:42:00', '19/07/2011 15:43:00', '19/07/2011 15:44:00', '19/07/2011 15:45:00', '19/07/2011 15:46:00', '19/07/2011 15:47:00', '19/07/2011 15:48:00', '19/07/2011 15:49:00', '19/07/2011 15:50:00', '19/07/2011 15:51:00', '19/07/2011 15:52:00', '19/07/2011 15:53:00', '19/07/2011 15:54:00', '19/07/2011 15:55:00', '19/07/2011 15:56:00', '19/07/2011 15:57:00', '19/07/2011 15:58:00', '19/07/2011 15:59:00', '19/07/2011 16:00:00', '19/07/2011 16:01:00', '19/07/2011 16:02:00', '19/07/2011 16:03:00', '19/07/2011 16:04:00', '19/07/2011 16:05:00', '19/07/2011 16:06:00', '19/07/2011 16:07:00', '19/07/2011 16:08:00', '19/07/2011 16:09:00', '19/07/2011 16:10:00', '19/07/2011 16:11:00', '19/07/2011 16:12:00', 
'19/07/2011 16:13:00', '19/07/2011 16:14:00', '19/07/2011 16:15:00', '19/07/2011 16:16:00', '19/07/2011 16:17:00', '19/07/2011 16:18:00', '19/07/2011 16:19:00', '19/07/2011 16:20:00', '19/07/2011 16:21:00', '19/07/2011 16:22:00', '19/07/2011 16:23:00', '19/07/2011 16:24:00', '19/07/2011 16:25:00', '19/07/2011 16:26:00', '19/07/2011 16:27:00', '19/07/2011 16:28:00', '19/07/2011 16:29:00', '19/07/2011 16:30:00', '19/07/2011 16:31:00', '19/07/2011 16:32:00', '19/07/2011 16:33:00', '19/07/2011 16:34:00', '19/07/2011 16:35:00', '19/07/2011 16:36:00', '19/07/2011 16:37:00', '19/07/2011 16:38:00', '19/07/2011 16:39:00', '19/07/2011 16:40:00', '19/07/2011 16:41:00', '19/07/2011 16:42:00', '19/07/2011 16:43:00', '19/07/2011 16:44:00', '19/07/2011 16:45:00', '19/07/2011 16:46:00', '19/07/2011 16:47:00', '19/07/2011 16:48:00', '19/07/2011 16:49:00', '19/07/2011 16:50:00', '19/07/2011 16:51:00', '19/07/2011 16:52:00', '19/07/2011 16:53:00', '19/07/2011 16:54:00', '19/07/2011 16:55:00', '19/07/2011 16:56:00', '19/07/2011 16:57:00', '19/07/2011 16:58:00', '19/07/2011 16:59:00', '19/07/2011 17:00:00', '19/07/2011 17:01:00', '19/07/2011 17:02:00', '19/07/2011 17:03:00', '19/07/2011 17:04:00', '19/07/2011 17:05:00', '19/07/2011 17:06:00', '19/07/2011 17:07:00', '19/07/2011 17:08:00', '19/07/2011 17:09:00', '19/07/2011 17:10:00', '19/07/2011 17:11:00', '19/07/2011 17:12:00', '19/07/2011 17:13:00', '19/07/2011 17:14:00', '19/07/2011 17:15:00', '19/07/2011 17:16:00', '19/07/2011 17:17:00', '19/07/2011 17:18:00', '19/07/2011 17:19:00', '19/07/2011 17:20:00', '19/07/2011 17:21:00', '19/07/2011 17:22:00', '19/07/2011 17:23:00', '19/07/2011 17:24:00', '19/07/2011 17:25:00', '19/07/2011 17:26:00', '19/07/2011 17:27:00', '19/07/2011 17:28:00', '19/07/2011 17:29:00', '19/07/2011 17:30:00', '19/07/2011 17:31:00', '19/07/2011 17:32:00', '19/07/2011 17:33:00', '19/07/2011 17:34:00', '19/07/2011 17:35:00', '19/07/2011 17:36:00', '19/07/2011 17:37:00', '19/07/2011 17:38:00', '19/07/2011 
17:39:00', '19/07/2011 17:40:00', '19/07/2011 17:41:00', '19/07/2011 17:42:00', '19/07/2011 17:43:00', '19/07/2011 17:44:00', '19/07/2011 17:45:00', '19/07/2011 17:46:00', '19/07/2011 17:47:00', '19/07/2011 17:48:00', '19/07/2011 17:49:00', '19/07/2011 17:50:00', '19/07/2011 17:51:00', '19/07/2011 17:52:00', '19/07/2011 17:53:00', '19/07/2011 17:54:00', '19/07/2011 17:55:00', '19/07/2011 17:56:00', '19/07/2011 17:57:00', '19/07/2011 17:58:00', '19/07/2011 17:59:00', '19/07/2011 18:00:00', '19/07/2011 18:01:00', '19/07/2011 18:02:00', '19/07/2011 18:03:00', '19/07/2011 18:04:00', '19/07/2011 18:05:00', '19/07/2011 18:06:00', '19/07/2011 18:07:00', '19/07/2011 18:08:00', '19/07/2011 18:09:00', '19/07/2011 18:10:00', '19/07/2011 18:11:00', '19/07/2011 18:12:00', '19/07/2011 18:13:00', '19/07/2011 18:14:00', '19/07/2011 18:15:00', '19/07/2011 18:16:00', '19/07/2011 18:17:00', '19/07/2011 18:18:00', '19/07/2011 18:19:00', '19/07/2011 18:20:00', '19/07/2011 18:21:00', '19/07/2011 18:22:00', '19/07/2011 18:23:00', '19/07/2011 18:24:00', '19/07/2011 18:25:00', '19/07/2011 18:26:00', '19/07/2011 18:27:00', '19/07/2011 18:28:00', '19/07/2011 18:29:00', '19/07/2011 18:30:00', '19/07/2011 18:31:00', '19/07/2011 18:32:00', '19/07/2011 18:33:00', '19/07/2011 18:34:00', '19/07/2011 18:35:00', '19/07/2011 18:36:00', '19/07/2011 18:37:00', '19/07/2011 18:38:00', '19/07/2011 18:39:00', '19/07/2011 18:40:00', '19/07/2011 18:41:00', '19/07/2011 18:42:00', '19/07/2011 18:43:00', '19/07/2011 18:44:00', '19/07/2011 18:45:00', '19/07/2011 18:46:00', '19/07/2011 18:47:00', '19/07/2011 18:48:00', '19/07/2011 18:49:00', '19/07/2011 18:50:00', '19/07/2011 18:51:00', '19/07/2011 18:52:00', '19/07/2011 18:53:00', '19/07/2011 18:54:00', '19/07/2011 18:55:00', '19/07/2011 18:56:00', '19/07/2011 18:57:00', '19/07/2011 18:58:00', '19/07/2011 18:59:00', '19/07/2011 19:00:00', '19/07/2011 19:01:00', '19/07/2011 19:02:00', '19/07/2011 19:03:00', '19/07/2011 19:04:00', '19/07/2011 19:05:00', 
'19/07/2011 19:06:00', '19/07/2011 19:07:00', '19/07/2011 19:08:00', '19/07/2011 19:09:00', '19/07/2011 19:10:00', '19/07/2011 19:11:00', '19/07/2011 19:12:00', '19/07/2011 19:13:00', '19/07/2011 19:14:00', '19/07/2011 19:15:00', '19/07/2011 19:16:00', '19/07/2011 19:17:00', '19/07/2011 19:18:00', '19/07/2011 19:19:00', '19/07/2011 19:20:00', '19/07/2011 19:21:00', '19/07/2011 19:22:00', '19/07/2011 19:23:00', '19/07/2011 19:24:00', '19/07/2011 19:25:00', '19/07/2011 19:26:00', '19/07/2011 19:27:00', '19/07/2011 19:28:00', '19/07/2011 19:29:00', '19/07/2011 19:30:00', '19/07/2011 19:31:00', '19/07/2011 19:32:00', '19/07/2011 19:33:00', '19/07/2011 19:34:00', '19/07/2011 19:35:00', '19/07/2011 19:36:00', '19/07/2011 19:37:00', '19/07/2011 19:38:00', '19/07/2011 19:39:00', '19/07/2011 19:40:00', '19/07/2011 19:41:00', '19/07/2011 19:42:00', '19/07/2011 19:43:00', '19/07/2011 19:44:00', '19/07/2011 19:45:00', '19/07/2011 19:46:00', '19/07/2011 19:47:00', '19/07/2011 19:48:00', '19/07/2011 19:49:00', '19/07/2011 19:50:00', '19/07/2011 19:51:00', '19/07/2011 19:52:00', '19/07/2011 19:53:00', '19/07/2011 19:54:00', '19/07/2011 19:55:00', '19/07/2011 19:56:00', '19/07/2011 19:57:00', '19/07/2011 19:58:00', '19/07/2011 19:59:00', '19/07/2011 20:00:00', '19/07/2011 20:01:00', '19/07/2011 20:02:00', '19/07/2011 20:03:00', '19/07/2011 20:04:00', '19/07/2011 20:05:00', '19/07/2011 20:06:00', '19/07/2011 20:07:00', '19/07/2011 20:08:00', '19/07/2011 20:09:00', '19/07/2011 20:10:00', '19/07/2011 20:11:00', '19/07/2011 20:12:00', '19/07/2011 20:13:00', '19/07/2011 20:14:00', '19/07/2011 20:15:00', '19/07/2011 20:16:00', '19/07/2011 20:17:00', '19/07/2011 20:18:00', '19/07/2011 20:19:00', '19/07/2011 20:20:00', '19/07/2011 20:21:00', '19/07/2011 20:22:00', '19/07/2011 20:23:00', '19/07/2011 20:24:00', '19/07/2011 20:25:00', '19/07/2011 20:26:00', '19/07/2011 20:27:00', '19/07/2011 20:28:00', '19/07/2011 20:29:00', '19/07/2011 20:30:00', '19/07/2011 20:31:00', '19/07/2011 
20:32:00', '19/07/2011 20:33:00', '19/07/2011 20:34:00', '19/07/2011 20:35:00', '19/07/2011 20:36:00', '19/07/2011 20:37:00', '19/07/2011 20:38:00', '19/07/2011 20:39:00', '19/07/2011 20:40:00', '19/07/2011 20:41:00', '19/07/2011 20:42:00', '19/07/2011 20:43:00', '19/07/2011 20:44:00', '19/07/2011 20:45:00', '19/07/2011 20:46:00', '19/07/2011 20:47:00', '19/07/2011 20:48:00', '19/07/2011 20:49:00', '19/07/2011 20:50:00', '19/07/2011 20:51:00', '19/07/2011 20:52:00', '19/07/2011 20:53:00', '19/07/2011 20:54:00', '19/07/2011 20:55:00', '19/07/2011 20:56:00', '19/07/2011 20:57:00', '19/07/2011 20:58:00', '19/07/2011 20:59:00', '19/07/2011 21:00:00', '19/07/2011 21:01:00', '19/07/2011 21:02:00', '19/07/2011 21:03:00', '19/07/2011 21:04:00', '19/07/2011 21:05:00', '19/07/2011 21:06:00', '19/07/2011 21:07:00', '19/07/2011 21:08:00', '19/07/2011 21:09:00', '19/07/2011 21:10:00', '19/07/2011 21:11:00', '19/07/2011 21:12:00', '19/07/2011 21:13:00', '19/07/2011 21:14:00', '19/07/2011 21:15:00', '19/07/2011 21:16:00', '19/07/2011 21:17:00', '19/07/2011 21:18:00', '19/07/2011 21:19:00', '19/07/2011 21:20:00', '19/07/2011 21:21:00', '19/07/2011 21:22:00', '19/07/2011 21:23:00', '19/07/2011 21:24:00', '19/07/2011 21:25:00', '19/07/2011 21:26:00', '19/07/2011 21:27:00', '19/07/2011 21:28:00', '19/07/2011 21:29:00', '19/07/2011 21:30:00', '19/07/2011 21:31:00', '19/07/2011 21:32:00', '19/07/2011 21:33:00', '19/07/2011 21:34:00', '19/07/2011 21:35:00', '19/07/2011 21:36:00', '19/07/2011 21:37:00', '19/07/2011 21:38:00', '19/07/2011 21:39:00', '19/07/2011 21:40:00', '19/07/2011 21:41:00', '19/07/2011 21:42:00', '19/07/2011 21:43:00', '19/07/2011 21:44:00', '19/07/2011 21:45:00', '19/07/2011 21:46:00', '19/07/2011 21:47:00', '19/07/2011 21:48:00', '19/07/2011 21:49:00', '19/07/2011 21:50:00', '19/07/2011 21:51:00', '19/07/2011 21:52:00', '19/07/2011 21:53:00', '19/07/2011 21:54:00', '19/07/2011 21:55:00', '19/07/2011 21:56:00', '19/07/2011 21:57:00', '19/07/2011 21:58:00', 
'19/07/2011 21:59:00', '19/07/2011 22:00:00', '19/07/2011 22:01:00', '19/07/2011 22:02:00', '19/07/2011 22:03:00', '19/07/2011 22:04:00', '19/07/2011 22:05:00', '19/07/2011 22:06:00', '19/07/2011 22:07:00', '19/07/2011 22:08:00', '19/07/2011 22:09:00', '19/07/2011 22:10:00', '19/07/2011 22:11:00', '19/07/2011 22:12:00', '19/07/2011 22:13:00', '19/07/2011 22:14:00', '19/07/2011 22:15:00', '19/07/2011 22:16:00', '19/07/2011 22:17:00', '19/07/2011 22:18:00', '19/07/2011 22:19:00', '19/07/2011 22:20:00', '19/07/2011 22:21:00', '19/07/2011 22:22:00', '19/07/2011 22:23:00', '19/07/2011 22:24:00', '19/07/2011 22:25:00', '19/07/2011 22:26:00', '19/07/2011 22:27:00', '19/07/2011 22:28:00', '19/07/2011 22:29:00', '19/07/2011 22:30:00', '19/07/2011 22:31:00', '19/07/2011 22:32:00', '19/07/2011 22:33:00', '19/07/2011 22:34:00', '19/07/2011 22:35:00', '19/07/2011 22:36:00', '19/07/2011 22:37:00', '19/07/2011 22:38:00', '19/07/2011 22:39:00', '19/07/2011 22:40:00', '19/07/2011 22:41:00', '19/07/2011 22:42:00', '19/07/2011 22:43:00', '19/07/2011 22:44:00', '19/07/2011 22:45:00', '19/07/2011 22:46:00', '19/07/2011 22:47:00', '19/07/2011 22:48:00', '19/07/2011 22:49:00', '19/07/2011 22:50:00', '19/07/2011 22:51:00', '19/07/2011 22:52:00', '19/07/2011 22:53:00', '19/07/2011 22:54:00', '19/07/2011 22:55:00', '19/07/2011 22:56:00', '19/07/2011 22:57:00', '19/07/2011 22:58:00', '19/07/2011 22:59:00', '19/07/2011 23:00:00', '19/07/2011 23:01:00', '19/07/2011 23:02:00', '19/07/2011 23:03:00', '19/07/2011 23:04:00', '19/07/2011 23:05:00', '19/07/2011 23:06:00', '19/07/2011 23:07:00', '19/07/2011 23:08:00', '19/07/2011 23:09:00', '19/07/2011 23:10:00', '19/07/2011 23:11:00', '19/07/2011 23:12:00', '19/07/2011 23:13:00', '19/07/2011 23:14:00', '19/07/2011 23:15:00', '19/07/2011 23:16:00', '19/07/2011 23:17:00', '19/07/2011 23:18:00', '19/07/2011 23:19:00', '19/07/2011 23:20:00', '19/07/2011 23:21:00', '19/07/2011 23:22:00', '19/07/2011 23:23:00', '19/07/2011 23:24:00', '19/07/2011 
23:25:00', '19/07/2011 23:26:00', '19/07/2011 23:27:00', '19/07/2011 23:28:00', '19/07/2011 23:29:00', '19/07/2011 23:30:00', '19/07/2011 23:31:00', '19/07/2011 23:32:00', '19/07/2011 23:33:00', '19/07/2011 23:34:00', '19/07/2011 23:35:00', '19/07/2011 23:36:00', '19/07/2011 23:37:00', '19/07/2011 23:38:00', '19/07/2011 23:39:00', '19/07/2011 23:40:00', '19/07/2011 23:41:00', '19/07/2011 23:42:00', '19/07/2011 23:43:00', '19/07/2011 23:44:00', '19/07/2011 23:45:00', '19/07/2011 23:46:00', '19/07/2011 23:47:00', '19/07/2011 23:48:00', '19/07/2011 23:49:00', '19/07/2011 23:50:00', '19/07/2011 23:51:00', '19/07/2011 23:52:00', '19/07/2011 23:53:00', '19/07/2011 23:54:00', '19/07/2011 23:55:00', '19/07/2011 23:56:00', '19/07/2011 23:57:00', '19/07/2011 23:58:00', '19/07/2011 23:59:00', '20/07/2011 0:00:00', '20/07/2011 0:01:00', '20/07/2011 0:02:00', '20/07/2011 0:03:00', '20/07/2011 0:04:00', '20/07/2011 0:05:00', '20/07/2011 0:06:00', '20/07/2011 0:07:00', '20/07/2011 0:08:00', '20/07/2011 0:09:00', '20/07/2011 0:10:00', '20/07/2011 0:11:00', '20/07/2011 0:12:00', '20/07/2011 0:13:00', '20/07/2011 0:14:00', '20/07/2011 0:15:00', '20/07/2011 0:16:00', '20/07/2011 0:17:00', '20/07/2011 0:18:00', '20/07/2011 0:19:00', '20/07/2011 0:20:00', '20/07/2011 0:21:00', '20/07/2011 0:22:00', '20/07/2011 0:23:00', '20/07/2011 0:24:00', '20/07/2011 0:25:00', '20/07/2011 0:26:00', '20/07/2011 0:27:00', '20/07/2011 0:28:00', '20/07/2011 0:29:00', '20/07/2011 0:30:00', '20/07/2011 0:31:00', '20/07/2011 0:32:00', '20/07/2011 0:33:00', '20/07/2011 0:34:00', '20/07/2011 0:35:00', '20/07/2011 0:36:00', '20/07/2011 0:37:00', '20/07/2011 0:38:00', '20/07/2011 0:39:00', '20/07/2011 0:40:00', '20/07/2011 0:41:00', '20/07/2011 0:42:00', '20/07/2011 0:43:00', '20/07/2011 0:44:00', '20/07/2011 0:45:00', '20/07/2011 0:46:00', '20/07/2011 0:47:00', '20/07/2011 0:48:00', '20/07/2011 0:49:00', '20/07/2011 0:50:00', '20/07/2011 0:51:00', '20/07/2011 0:52:00', '20/07/2011 0:53:00', '20/07/2011 
0:54:00', '20/07/2011 0:55:00', '20/07/2011 0:56:00', '20/07/2011 0:57:00', '20/07/2011 0:58:00', '20/07/2011 0:59:00', '20/07/2011 1:00:00', '20/07/2011 1:01:00', '20/07/2011 1:02:00', '20/07/2011 1:03:00', '20/07/2011 1:04:00', '20/07/2011 1:05:00', '20/07/2011 1:06:00', '20/07/2011 1:07:00', '20/07/2011 1:08:00', '20/07/2011 1:09:00', '20/07/2011 1:10:00', '20/07/2011 1:11:00', '20/07/2011 1:12:00', '20/07/2011 1:13:00', '20/07/2011 1:14:00', '20/07/2011 1:15:00', '20/07/2011 1:16:00', '20/07/2011 1:17:00', '20/07/2011 1:18:00', '20/07/2011 1:19:00', '20/07/2011 1:20:00', '20/07/2011 1:21:00', '20/07/2011 1:22:00', '20/07/2011 1:23:00', '20/07/2011 1:24:00', '20/07/2011 1:25:00', '20/07/2011 1:26:00', '20/07/2011 1:27:00', '20/07/2011 1:28:00', '20/07/2011 1:29:00', '20/07/2011 1:30:00', '20/07/2011 1:31:00', '20/07/2011 1:32:00', '20/07/2011 1:33:00', '20/07/2011 1:34:00', '20/07/2011 1:35:00', '20/07/2011 1:36:00', '20/07/2011 1:37:00', '20/07/2011 1:38:00', '20/07/2011 1:39:00', '20/07/2011 1:40:00', '20/07/2011 1:41:00', '20/07/2011 1:42:00', '20/07/2011 1:43:00', '20/07/2011 1:44:00', '20/07/2011 1:45:00', '20/07/2011 1:46:00', '20/07/2011 1:47:00', '20/07/2011 1:48:00', '20/07/2011 1:49:00', '20/07/2011 1:50:00', '20/07/2011 1:51:00', '20/07/2011 1:52:00', '20/07/2011 1:53:00', '20/07/2011 1:54:00', '20/07/2011 1:55:00', '20/07/2011 1:56:00', '20/07/2011 1:57:00', '20/07/2011 1:58:00', '20/07/2011 1:59:00', '20/07/2011 2:00:00', '20/07/2011 2:01:00', '20/07/2011 2:02:00', '20/07/2011 2:03:00', '20/07/2011 2:04:00', '20/07/2011 2:05:00', '20/07/2011 2:06:00', '20/07/2011 2:07:00', '20/07/2011 2:08:00', '20/07/2011 2:09:00', '20/07/2011 2:10:00', '20/07/2011 2:11:00', '20/07/2011 2:12:00', '20/07/2011 2:13:00', '20/07/2011 2:14:00', '20/07/2011 2:15:00', '20/07/2011 2:16:00', '20/07/2011 2:17:00', '20/07/2011 2:18:00', '20/07/2011 2:19:00', '20/07/2011 2:20:00', '20/07/2011 2:21:00', '20/07/2011 2:22:00', '20/07/2011 2:23:00', '20/07/2011 2:24:00', 
[minute-interval timestamp series elided: entries run from '20/07/2011 2:25:00' through '21/07/2011 9:26:00' and continue beyond this excerpt, one value per minute in 'DD/MM/YYYY H:MM:SS' format]
9:27:00', '21/07/2011 9:28:00', '21/07/2011 9:29:00', '21/07/2011 9:30:00', '21/07/2011 9:31:00', '21/07/2011 9:32:00', '21/07/2011 9:33:00', '21/07/2011 9:34:00', '21/07/2011 9:35:00', '21/07/2011 9:36:00', '21/07/2011 9:37:00', '21/07/2011 9:38:00', '21/07/2011 9:39:00', '21/07/2011 9:40:00', '21/07/2011 9:41:00', '21/07/2011 9:42:00', '21/07/2011 9:43:00', '21/07/2011 9:44:00', '21/07/2011 9:45:00', '21/07/2011 9:46:00', '21/07/2011 9:47:00', '21/07/2011 9:48:00', '21/07/2011 9:49:00', '21/07/2011 9:50:00', '21/07/2011 9:51:00', '21/07/2011 9:52:00', '21/07/2011 9:53:00', '21/07/2011 9:54:00', '21/07/2011 9:55:00', '21/07/2011 9:56:00', '21/07/2011 9:57:00', '21/07/2011 9:58:00', '21/07/2011 9:59:00', '21/07/2011 10:00:00', '21/07/2011 10:01:00', '21/07/2011 10:02:00', '21/07/2011 10:03:00', '21/07/2011 10:04:00', '21/07/2011 10:05:00', '21/07/2011 10:06:00', '21/07/2011 10:07:00', '21/07/2011 10:08:00', '21/07/2011 10:09:00', '21/07/2011 10:10:00', '21/07/2011 10:11:00', '21/07/2011 10:12:00', '21/07/2011 10:13:00', '21/07/2011 10:14:00', '21/07/2011 10:15:00', '21/07/2011 10:16:00', '21/07/2011 10:17:00', '21/07/2011 10:18:00', '21/07/2011 10:19:00', '21/07/2011 10:20:00', '21/07/2011 10:21:00', '21/07/2011 10:22:00', '21/07/2011 10:23:00', '21/07/2011 10:24:00', '21/07/2011 10:25:00', '21/07/2011 10:26:00', '21/07/2011 10:27:00', '21/07/2011 10:28:00', '21/07/2011 10:29:00', '21/07/2011 10:30:00', '21/07/2011 10:31:00', '21/07/2011 10:32:00', '21/07/2011 10:33:00', '21/07/2011 10:34:00', '21/07/2011 10:35:00', '21/07/2011 10:36:00', '21/07/2011 10:37:00', '21/07/2011 10:38:00', '21/07/2011 10:39:00', '21/07/2011 10:40:00', '21/07/2011 10:41:00', '21/07/2011 10:42:00', '21/07/2011 10:43:00', '21/07/2011 10:44:00', '21/07/2011 10:45:00', '21/07/2011 10:46:00', '21/07/2011 10:47:00', '21/07/2011 10:48:00', '21/07/2011 10:49:00', '21/07/2011 10:50:00', '21/07/2011 10:51:00', '21/07/2011 10:52:00', '21/07/2011 10:53:00', '21/07/2011 10:54:00', '21/07/2011 
10:55:00', '21/07/2011 10:56:00', '21/07/2011 10:57:00', '21/07/2011 10:58:00', '21/07/2011 10:59:00', '21/07/2011 11:00:00', '21/07/2011 11:01:00', '21/07/2011 11:02:00', '21/07/2011 11:03:00', '21/07/2011 11:04:00', '21/07/2011 11:05:00', '21/07/2011 11:06:00', '21/07/2011 11:07:00', '21/07/2011 11:08:00', '21/07/2011 11:09:00', '21/07/2011 11:10:00', '21/07/2011 11:11:00', '21/07/2011 11:12:00', '21/07/2011 11:13:00', '21/07/2011 11:14:00', '21/07/2011 11:15:00', '21/07/2011 11:16:00', '21/07/2011 11:17:00', '21/07/2011 11:18:00', '21/07/2011 11:19:00', '21/07/2011 11:20:00', '21/07/2011 11:21:00', '21/07/2011 11:22:00', '21/07/2011 11:23:00', '21/07/2011 11:24:00', '21/07/2011 11:25:00', '21/07/2011 11:26:00', '21/07/2011 11:27:00', '21/07/2011 11:28:00', '21/07/2011 11:29:00', '21/07/2011 11:30:00', '21/07/2011 11:31:00', '21/07/2011 11:32:00', '21/07/2011 11:33:00', '21/07/2011 11:34:00', '21/07/2011 11:35:00', '21/07/2011 11:36:00', '21/07/2011 11:37:00', '21/07/2011 11:38:00', '21/07/2011 11:39:00', '21/07/2011 11:40:00', '21/07/2011 11:41:00', '21/07/2011 11:42:00', '21/07/2011 11:43:00', '21/07/2011 11:44:00', '21/07/2011 11:45:00', '21/07/2011 11:46:00', '21/07/2011 11:47:00', '21/07/2011 11:48:00', '21/07/2011 11:49:00', '21/07/2011 11:50:00', '21/07/2011 11:51:00', '21/07/2011 11:52:00', '21/07/2011 11:53:00', '21/07/2011 11:54:00', '21/07/2011 11:55:00', '21/07/2011 11:56:00', '21/07/2011 11:57:00', '21/07/2011 11:58:00', '21/07/2011 11:59:00', '21/07/2011 12:00:00', '21/07/2011 12:01:00', '21/07/2011 12:02:00', '21/07/2011 12:03:00', '21/07/2011 12:04:00', '21/07/2011 12:05:00', '21/07/2011 12:06:00', '21/07/2011 12:07:00', '21/07/2011 12:08:00', '21/07/2011 12:09:00', '21/07/2011 12:10:00', '21/07/2011 12:11:00', '21/07/2011 12:12:00', '21/07/2011 12:13:00', '21/07/2011 12:14:00', '21/07/2011 12:15:00', '21/07/2011 12:16:00', '21/07/2011 12:17:00', '21/07/2011 12:18:00', '21/07/2011 12:19:00', '21/07/2011 12:20:00', '21/07/2011 12:21:00', 
'21/07/2011 12:22:00', '21/07/2011 12:23:00', '21/07/2011 12:24:00', '21/07/2011 12:25:00', '21/07/2011 12:26:00', '21/07/2011 12:27:00', '21/07/2011 12:28:00', '21/07/2011 12:29:00', '21/07/2011 12:30:00', '21/07/2011 12:31:00', '21/07/2011 12:32:00', '21/07/2011 12:33:00', '21/07/2011 12:34:00', '21/07/2011 12:35:00', '21/07/2011 12:36:00', '21/07/2011 12:37:00', '21/07/2011 12:38:00', '21/07/2011 12:39:00', '21/07/2011 12:40:00', '21/07/2011 12:41:00', '21/07/2011 12:42:00', '21/07/2011 12:43:00', '21/07/2011 12:44:00', '21/07/2011 12:45:00', '21/07/2011 12:46:00', '21/07/2011 12:47:00', '21/07/2011 12:48:00', '21/07/2011 12:49:00', '21/07/2011 12:50:00', '21/07/2011 12:51:00', '21/07/2011 12:52:00', '21/07/2011 12:53:00', '21/07/2011 12:54:00', '21/07/2011 12:55:00', '21/07/2011 12:56:00', '21/07/2011 12:57:00', '21/07/2011 12:58:00', '21/07/2011 12:59:00', '21/07/2011 13:00:00', '21/07/2011 13:01:00', '21/07/2011 13:02:00', '21/07/2011 13:03:00', '21/07/2011 13:04:00', '21/07/2011 13:05:00', '21/07/2011 13:06:00', '21/07/2011 13:07:00', '21/07/2011 13:08:00', '21/07/2011 13:09:00', '21/07/2011 13:10:00', '21/07/2011 13:11:00', '21/07/2011 13:12:00', '21/07/2011 13:13:00', '21/07/2011 13:14:00', '21/07/2011 13:15:00', '21/07/2011 13:16:00', '21/07/2011 13:17:00', '21/07/2011 13:18:00', '21/07/2011 13:19:00', '21/07/2011 13:20:00', '21/07/2011 13:21:00', '21/07/2011 13:22:00', '21/07/2011 13:23:00', '21/07/2011 13:24:00', '21/07/2011 13:25:00', '21/07/2011 13:26:00', '21/07/2011 13:27:00', '21/07/2011 13:28:00', '21/07/2011 13:29:00', '21/07/2011 13:30:00', '21/07/2011 13:31:00', '21/07/2011 13:32:00', '21/07/2011 13:33:00', '21/07/2011 13:34:00', '21/07/2011 13:35:00', '21/07/2011 13:36:00', '21/07/2011 13:37:00', '21/07/2011 13:38:00', '21/07/2011 13:39:00', '21/07/2011 13:40:00', '21/07/2011 13:41:00', '21/07/2011 13:42:00', '21/07/2011 13:43:00', '21/07/2011 13:44:00', '21/07/2011 13:45:00', '21/07/2011 13:46:00', '21/07/2011 13:47:00', '21/07/2011 
13:48:00', '21/07/2011 13:49:00', '21/07/2011 13:50:00', '21/07/2011 13:51:00', '21/07/2011 13:52:00', '21/07/2011 13:53:00', '21/07/2011 13:54:00', '21/07/2011 13:55:00', '21/07/2011 13:56:00', '21/07/2011 13:57:00', '21/07/2011 13:58:00', '21/07/2011 13:59:00', '21/07/2011 14:00:00', '21/07/2011 14:01:00', '21/07/2011 14:02:00', '21/07/2011 14:03:00', '21/07/2011 14:04:00', '21/07/2011 14:05:00', '21/07/2011 14:06:00', '21/07/2011 14:07:00', '21/07/2011 14:08:00', '21/07/2011 14:09:00', '21/07/2011 14:10:00', '21/07/2011 14:11:00', '21/07/2011 14:12:00', '21/07/2011 14:13:00', '21/07/2011 14:14:00', '21/07/2011 14:15:00', '21/07/2011 14:16:00', '21/07/2011 14:17:00', '21/07/2011 14:18:00', '21/07/2011 14:19:00', '21/07/2011 14:20:00', '21/07/2011 14:21:00', '21/07/2011 14:22:00', '21/07/2011 14:23:00', '21/07/2011 14:24:00', '21/07/2011 14:25:00', '21/07/2011 14:26:00', '21/07/2011 14:27:00', '21/07/2011 14:28:00', '21/07/2011 14:29:00', '21/07/2011 14:30:00', '21/07/2011 14:31:00', '21/07/2011 14:32:00', '21/07/2011 14:33:00', '21/07/2011 14:34:00', '21/07/2011 14:35:00', '21/07/2011 14:36:00', '21/07/2011 14:37:00', '21/07/2011 14:38:00', '21/07/2011 14:39:00', '21/07/2011 14:40:00', '21/07/2011 14:41:00', '21/07/2011 14:42:00', '21/07/2011 14:43:00', '21/07/2011 14:44:00', '21/07/2011 14:45:00', '21/07/2011 14:46:00', '21/07/2011 14:47:00', '21/07/2011 14:48:00', '21/07/2011 14:49:00', '21/07/2011 14:50:00', '21/07/2011 14:51:00', '21/07/2011 14:52:00', '21/07/2011 14:53:00', '21/07/2011 14:54:00', '21/07/2011 14:55:00', '21/07/2011 14:56:00', '21/07/2011 14:57:00', '21/07/2011 14:58:00', '21/07/2011 14:59:00', '21/07/2011 15:00:00', '21/07/2011 15:01:00', '21/07/2011 15:02:00', '21/07/2011 15:03:00', '21/07/2011 15:04:00', '21/07/2011 15:05:00', '21/07/2011 15:06:00', '21/07/2011 15:07:00', '21/07/2011 15:08:00', '21/07/2011 15:09:00', '21/07/2011 15:10:00', '21/07/2011 15:11:00', '21/07/2011 15:12:00', '21/07/2011 15:13:00', '21/07/2011 15:14:00', 
'21/07/2011 15:15:00', '21/07/2011 15:16:00', '21/07/2011 15:17:00', '21/07/2011 15:18:00', '21/07/2011 15:19:00', '21/07/2011 15:20:00', '21/07/2011 15:21:00', '21/07/2011 15:22:00', '21/07/2011 15:23:00', '21/07/2011 15:24:00', '21/07/2011 15:25:00', '21/07/2011 15:26:00', '21/07/2011 15:27:00', '21/07/2011 15:28:00', '21/07/2011 15:29:00', '21/07/2011 15:30:00', '21/07/2011 15:31:00', '21/07/2011 15:32:00', '21/07/2011 15:33:00', '21/07/2011 15:34:00', '21/07/2011 15:35:00', '21/07/2011 15:36:00', '21/07/2011 15:37:00', '21/07/2011 15:38:00', '21/07/2011 15:39:00', '21/07/2011 15:40:00', '21/07/2011 15:41:00', '21/07/2011 15:42:00', '21/07/2011 15:43:00', '21/07/2011 15:44:00', '21/07/2011 15:45:00', '21/07/2011 15:46:00', '21/07/2011 15:47:00', '21/07/2011 15:48:00', '21/07/2011 15:49:00', '21/07/2011 15:50:00', '21/07/2011 15:51:00', '21/07/2011 15:52:00', '21/07/2011 15:53:00', '21/07/2011 15:54:00', '21/07/2011 15:55:00', '21/07/2011 15:56:00', '21/07/2011 15:57:00', '21/07/2011 15:58:00', '21/07/2011 15:59:00', '21/07/2011 16:00:00', '21/07/2011 16:01:00', '21/07/2011 16:02:00', '21/07/2011 16:03:00', '21/07/2011 16:04:00', '21/07/2011 16:05:00', '21/07/2011 16:06:00', '21/07/2011 16:07:00', '21/07/2011 16:08:00', '21/07/2011 16:09:00', '21/07/2011 16:10:00', '21/07/2011 16:11:00', '21/07/2011 16:12:00', '21/07/2011 16:13:00', '21/07/2011 16:14:00', '21/07/2011 16:15:00', '21/07/2011 16:16:00', '21/07/2011 16:17:00', '21/07/2011 16:18:00', '21/07/2011 16:19:00', '21/07/2011 16:20:00', '21/07/2011 16:21:00', '21/07/2011 16:22:00', '21/07/2011 16:23:00', '21/07/2011 16:24:00', '21/07/2011 16:25:00', '21/07/2011 16:26:00', '21/07/2011 16:27:00', '21/07/2011 16:28:00', '21/07/2011 16:29:00', '21/07/2011 16:30:00', '21/07/2011 16:31:00', '21/07/2011 16:32:00', '21/07/2011 16:33:00', '21/07/2011 16:34:00', '21/07/2011 16:35:00', '21/07/2011 16:36:00', '21/07/2011 16:37:00', '21/07/2011 16:38:00', '21/07/2011 16:39:00', '21/07/2011 16:40:00', '21/07/2011 
16:41:00', '21/07/2011 16:42:00', '21/07/2011 16:43:00', '21/07/2011 16:44:00', '21/07/2011 16:45:00', '21/07/2011 16:46:00', '21/07/2011 16:47:00', '21/07/2011 16:48:00', '21/07/2011 16:49:00', '21/07/2011 16:50:00', '21/07/2011 16:51:00', '21/07/2011 16:52:00', '21/07/2011 16:53:00', '21/07/2011 16:54:00', '21/07/2011 16:55:00', '21/07/2011 16:56:00', '21/07/2011 16:57:00', '21/07/2011 16:58:00', '21/07/2011 16:59:00', '21/07/2011 17:00:00', '21/07/2011 17:01:00', '21/07/2011 17:02:00', '21/07/2011 17:03:00', '21/07/2011 17:04:00', '21/07/2011 17:05:00', '21/07/2011 17:06:00', '21/07/2011 17:07:00', '21/07/2011 17:08:00', '21/07/2011 17:09:00', '21/07/2011 17:10:00', '21/07/2011 17:11:00', '21/07/2011 17:12:00', '21/07/2011 17:13:00', '21/07/2011 17:14:00', '21/07/2011 17:15:00', '21/07/2011 17:16:00', '21/07/2011 17:17:00', '21/07/2011 17:18:00', '21/07/2011 17:19:00', '21/07/2011 17:20:00', '21/07/2011 17:21:00', '21/07/2011 17:22:00', '21/07/2011 17:23:00', '21/07/2011 17:24:00', '21/07/2011 17:25:00', '21/07/2011 17:26:00', '21/07/2011 17:27:00', '21/07/2011 17:28:00', '21/07/2011 17:29:00', '21/07/2011 17:30:00', '21/07/2011 17:31:00', '21/07/2011 17:32:00', '21/07/2011 17:33:00', '21/07/2011 17:34:00', '21/07/2011 17:35:00', '21/07/2011 17:36:00', '21/07/2011 17:37:00', '21/07/2011 17:38:00', '21/07/2011 17:39:00', '21/07/2011 17:40:00', '21/07/2011 17:41:00', '21/07/2011 17:42:00', '21/07/2011 17:43:00', '21/07/2011 17:44:00', '21/07/2011 17:45:00', '21/07/2011 17:46:00', '21/07/2011 17:47:00', '21/07/2011 17:48:00', '21/07/2011 17:49:00', '21/07/2011 17:50:00', '21/07/2011 17:51:00', '21/07/2011 17:52:00', '21/07/2011 17:53:00', '21/07/2011 17:54:00', '21/07/2011 17:55:00', '21/07/2011 17:56:00', '21/07/2011 17:57:00', '21/07/2011 17:58:00', '21/07/2011 17:59:00', '21/07/2011 18:00:00', '21/07/2011 18:01:00', '21/07/2011 18:02:00', '21/07/2011 18:03:00', '21/07/2011 18:04:00', '21/07/2011 18:05:00', '21/07/2011 18:06:00', '21/07/2011 18:07:00', 
'21/07/2011 18:08:00', '21/07/2011 18:09:00', '21/07/2011 18:10:00', '21/07/2011 18:11:00', '21/07/2011 18:12:00', '21/07/2011 18:13:00', '21/07/2011 18:14:00', '21/07/2011 18:15:00', '21/07/2011 18:16:00', '21/07/2011 18:17:00', '21/07/2011 18:18:00', '21/07/2011 18:19:00', '21/07/2011 18:20:00', '21/07/2011 18:21:00', '21/07/2011 18:22:00', '21/07/2011 18:23:00', '21/07/2011 18:24:00', '21/07/2011 18:25:00', '21/07/2011 18:26:00', '21/07/2011 18:27:00', '21/07/2011 18:28:00', '21/07/2011 18:29:00', '21/07/2011 18:30:00', '21/07/2011 18:31:00', '21/07/2011 18:32:00', '21/07/2011 18:33:00', '21/07/2011 18:34:00', '21/07/2011 18:35:00', '21/07/2011 18:36:00', '21/07/2011 18:37:00', '21/07/2011 18:38:00', '21/07/2011 18:39:00', '21/07/2011 18:40:00', '21/07/2011 18:41:00', '21/07/2011 18:42:00', '21/07/2011 18:43:00', '21/07/2011 18:44:00', '21/07/2011 18:45:00', '21/07/2011 18:46:00', '21/07/2011 18:47:00', '21/07/2011 18:48:00', '21/07/2011 18:49:00', '21/07/2011 18:50:00', '21/07/2011 18:51:00', '21/07/2011 18:52:00', '21/07/2011 18:53:00', '21/07/2011 18:54:00', '21/07/2011 18:55:00', '21/07/2011 18:56:00', '21/07/2011 18:57:00', '21/07/2011 18:58:00', '21/07/2011 18:59:00', '21/07/2011 19:00:00', '21/07/2011 19:01:00', '21/07/2011 19:02:00', '21/07/2011 19:03:00', '21/07/2011 19:04:00', '21/07/2011 19:05:00', '21/07/2011 19:06:00', '21/07/2011 19:07:00', '21/07/2011 19:08:00', '21/07/2011 19:09:00', '21/07/2011 19:10:00', '21/07/2011 19:11:00', '21/07/2011 19:12:00', '21/07/2011 19:13:00', '21/07/2011 19:14:00', '21/07/2011 19:15:00', '21/07/2011 19:16:00', '21/07/2011 19:17:00', '21/07/2011 19:18:00', '21/07/2011 19:19:00', '21/07/2011 19:20:00', '21/07/2011 19:21:00', '21/07/2011 19:22:00', '21/07/2011 19:23:00', '21/07/2011 19:24:00', '21/07/2011 19:25:00', '21/07/2011 19:26:00', '21/07/2011 19:27:00', '21/07/2011 19:28:00', '21/07/2011 19:29:00', '21/07/2011 19:30:00', '21/07/2011 19:31:00', '21/07/2011 19:32:00', '21/07/2011 19:33:00', '21/07/2011 
19:34:00', '21/07/2011 19:35:00', '21/07/2011 19:36:00', '21/07/2011 19:37:00', '21/07/2011 19:38:00', '21/07/2011 19:39:00', '21/07/2011 19:40:00', '21/07/2011 19:41:00', '21/07/2011 19:42:00', '21/07/2011 19:43:00', '21/07/2011 19:44:00', '21/07/2011 19:45:00', '21/07/2011 19:46:00', '21/07/2011 19:47:00', '21/07/2011 19:48:00', '21/07/2011 19:49:00', '21/07/2011 19:50:00', '21/07/2011 19:51:00', '21/07/2011 19:52:00', '21/07/2011 19:53:00', '21/07/2011 19:54:00', '21/07/2011 19:55:00', '21/07/2011 19:56:00', '21/07/2011 19:57:00', '21/07/2011 19:58:00', '21/07/2011 19:59:00', '21/07/2011 20:00:00', '21/07/2011 20:01:00', '21/07/2011 20:02:00', '21/07/2011 20:03:00', '21/07/2011 20:04:00', '21/07/2011 20:05:00', '21/07/2011 20:06:00', '21/07/2011 20:07:00', '21/07/2011 20:08:00', '21/07/2011 20:09:00', '21/07/2011 20:10:00', '21/07/2011 20:11:00', '21/07/2011 20:12:00', '21/07/2011 20:13:00', '21/07/2011 20:14:00', '21/07/2011 20:15:00', '21/07/2011 20:16:00', '21/07/2011 20:17:00', '21/07/2011 20:18:00', '21/07/2011 20:19:00', '21/07/2011 20:20:00', '21/07/2011 20:21:00', '21/07/2011 20:22:00', '21/07/2011 20:23:00', '21/07/2011 20:24:00', '21/07/2011 20:25:00', '21/07/2011 20:26:00', '21/07/2011 20:27:00', '21/07/2011 20:28:00', '21/07/2011 20:29:00', '21/07/2011 20:30:00', '21/07/2011 20:31:00', '21/07/2011 20:32:00', '21/07/2011 20:33:00', '21/07/2011 20:34:00', '21/07/2011 20:35:00', '21/07/2011 20:36:00', '21/07/2011 20:37:00', '21/07/2011 20:38:00', '21/07/2011 20:39:00', '21/07/2011 20:40:00', '21/07/2011 20:41:00', '21/07/2011 20:42:00', '21/07/2011 20:43:00', '21/07/2011 20:44:00', '21/07/2011 20:45:00', '21/07/2011 20:46:00', '21/07/2011 20:47:00', '21/07/2011 20:48:00', '21/07/2011 20:49:00', '21/07/2011 20:50:00', '21/07/2011 20:51:00', '21/07/2011 20:52:00', '21/07/2011 20:53:00', '21/07/2011 20:54:00', '21/07/2011 20:55:00', '21/07/2011 20:56:00', '21/07/2011 20:57:00', '21/07/2011 20:58:00', '21/07/2011 20:59:00', '21/07/2011 21:00:00', 
'21/07/2011 21:01:00', '21/07/2011 21:02:00', '21/07/2011 21:03:00', '21/07/2011 21:04:00', '21/07/2011 21:05:00', '21/07/2011 21:06:00', '21/07/2011 21:07:00', '21/07/2011 21:08:00', '21/07/2011 21:09:00', '21/07/2011 21:10:00', '21/07/2011 21:11:00', '21/07/2011 21:12:00', '21/07/2011 21:13:00', '21/07/2011 21:14:00', '21/07/2011 21:15:00', '21/07/2011 21:16:00', '21/07/2011 21:17:00', '21/07/2011 21:18:00', '21/07/2011 21:19:00', '21/07/2011 21:20:00', '21/07/2011 21:21:00', '21/07/2011 21:22:00', '21/07/2011 21:23:00', '21/07/2011 21:24:00', '21/07/2011 21:25:00', '21/07/2011 21:26:00', '21/07/2011 21:27:00', '21/07/2011 21:28:00', '21/07/2011 21:29:00', '21/07/2011 21:30:00', '21/07/2011 21:31:00', '21/07/2011 21:32:00', '21/07/2011 21:33:00', '21/07/2011 21:34:00', '21/07/2011 21:35:00', '21/07/2011 21:36:00', '21/07/2011 21:37:00', '21/07/2011 21:38:00', '21/07/2011 21:39:00', '21/07/2011 21:40:00', '21/07/2011 21:41:00', '21/07/2011 21:42:00', '21/07/2011 21:43:00', '21/07/2011 21:44:00', '21/07/2011 21:45:00', '21/07/2011 21:46:00', '21/07/2011 21:47:00', '21/07/2011 21:48:00', '21/07/2011 21:49:00', '21/07/2011 21:50:00', '21/07/2011 21:51:00', '21/07/2011 21:52:00', '21/07/2011 21:53:00', '21/07/2011 21:54:00', '21/07/2011 21:55:00', '21/07/2011 21:56:00', '21/07/2011 21:57:00', '21/07/2011 21:58:00', '21/07/2011 21:59:00', '21/07/2011 22:00:00', '21/07/2011 22:01:00', '21/07/2011 22:02:00', '21/07/2011 22:03:00', '21/07/2011 22:04:00', '21/07/2011 22:05:00', '21/07/2011 22:06:00', '21/07/2011 22:07:00', '21/07/2011 22:08:00', '21/07/2011 22:09:00', '21/07/2011 22:10:00', '21/07/2011 22:11:00', '21/07/2011 22:12:00', '21/07/2011 22:13:00', '21/07/2011 22:14:00', '21/07/2011 22:15:00', '21/07/2011 22:16:00', '21/07/2011 22:17:00', '21/07/2011 22:18:00', '21/07/2011 22:19:00', '21/07/2011 22:20:00', '21/07/2011 22:21:00', '21/07/2011 22:22:00', '21/07/2011 22:23:00', '21/07/2011 22:24:00', '21/07/2011 22:25:00', '21/07/2011 22:26:00', '21/07/2011 
22:27:00', '21/07/2011 22:28:00', '21/07/2011 22:29:00', '21/07/2011 22:30:00', '21/07/2011 22:31:00', '21/07/2011 22:32:00', '21/07/2011 22:33:00', '21/07/2011 22:34:00', '21/07/2011 22:35:00', '21/07/2011 22:36:00', '21/07/2011 22:37:00', '21/07/2011 22:38:00', '21/07/2011 22:39:00', '21/07/2011 22:40:00', '21/07/2011 22:41:00', '21/07/2011 22:42:00', '21/07/2011 22:43:00', '21/07/2011 22:44:00', '21/07/2011 22:45:00', '21/07/2011 22:46:00', '21/07/2011 22:47:00', '21/07/2011 22:48:00', '21/07/2011 22:49:00', '21/07/2011 22:50:00', '21/07/2011 22:51:00', '21/07/2011 22:52:00', '21/07/2011 22:53:00', '21/07/2011 22:54:00', '21/07/2011 22:55:00', '21/07/2011 22:56:00', '21/07/2011 22:57:00', '21/07/2011 22:58:00', '21/07/2011 22:59:00', '21/07/2011 23:00:00', '21/07/2011 23:01:00', '21/07/2011 23:02:00', '21/07/2011 23:03:00', '21/07/2011 23:04:00', '21/07/2011 23:05:00', '21/07/2011 23:06:00', '21/07/2011 23:07:00', '21/07/2011 23:08:00', '21/07/2011 23:09:00', '21/07/2011 23:10:00', '21/07/2011 23:11:00', '21/07/2011 23:12:00', '21/07/2011 23:13:00', '21/07/2011 23:14:00', '21/07/2011 23:15:00', '21/07/2011 23:16:00', '21/07/2011 23:17:00', '21/07/2011 23:18:00', '21/07/2011 23:19:00', '21/07/2011 23:20:00', '21/07/2011 23:21:00', '21/07/2011 23:22:00', '21/07/2011 23:23:00', '21/07/2011 23:24:00', '21/07/2011 23:25:00', '21/07/2011 23:26:00', '21/07/2011 23:27:00', '21/07/2011 23:28:00', '21/07/2011 23:29:00', '21/07/2011 23:30:00', '21/07/2011 23:31:00', '21/07/2011 23:32:00', '21/07/2011 23:33:00', '21/07/2011 23:34:00', '21/07/2011 23:35:00', '21/07/2011 23:36:00', '21/07/2011 23:37:00', '21/07/2011 23:38:00', '21/07/2011 23:39:00', '21/07/2011 23:40:00', '21/07/2011 23:41:00', '21/07/2011 23:42:00', '21/07/2011 23:43:00', '21/07/2011 23:44:00', '21/07/2011 23:45:00', '21/07/2011 23:46:00', '21/07/2011 23:47:00', '21/07/2011 23:48:00', '21/07/2011 23:49:00', '21/07/2011 23:50:00', '21/07/2011 23:51:00', '21/07/2011 23:52:00', '21/07/2011 23:53:00', 
'21/07/2011 23:54:00', '21/07/2011 23:55:00', '21/07/2011 23:56:00', '21/07/2011 23:57:00', '21/07/2011 23:58:00', '21/07/2011 23:59:00', '22/07/2011 0:00:00', '22/07/2011 0:01:00', '22/07/2011 0:02:00', '22/07/2011 0:03:00', '22/07/2011 0:04:00', '22/07/2011 0:05:00', '22/07/2011 0:06:00', '22/07/2011 0:07:00', '22/07/2011 0:08:00', '22/07/2011 0:09:00', '22/07/2011 0:10:00', '22/07/2011 0:11:00', '22/07/2011 0:12:00', '22/07/2011 0:13:00', '22/07/2011 0:14:00', '22/07/2011 0:15:00', '22/07/2011 0:16:00', '22/07/2011 0:17:00', '22/07/2011 0:18:00', '22/07/2011 0:19:00', '22/07/2011 0:20:00', '22/07/2011 0:21:00', '22/07/2011 0:22:00', '22/07/2011 0:23:00', '22/07/2011 0:24:00', '22/07/2011 0:25:00', '22/07/2011 0:26:00', '22/07/2011 0:27:00', '22/07/2011 0:28:00', '22/07/2011 0:29:00', '22/07/2011 0:30:00', '22/07/2011 0:31:00', '22/07/2011 0:32:00', '22/07/2011 0:33:00', '22/07/2011 0:34:00', '22/07/2011 0:35:00', '22/07/2011 0:36:00', '22/07/2011 0:37:00', '22/07/2011 0:38:00', '22/07/2011 0:39:00', '22/07/2011 0:40:00', '22/07/2011 0:41:00', '22/07/2011 0:42:00', '22/07/2011 0:43:00', '22/07/2011 0:44:00', '22/07/2011 0:45:00', '22/07/2011 0:46:00', '22/07/2011 0:47:00', '22/07/2011 0:48:00', '22/07/2011 0:49:00', '22/07/2011 0:50:00', '22/07/2011 0:51:00', '22/07/2011 0:52:00', '22/07/2011 0:53:00', '22/07/2011 0:54:00', '22/07/2011 0:55:00', '22/07/2011 0:56:00', '22/07/2011 0:57:00', '22/07/2011 0:58:00', '22/07/2011 0:59:00', '22/07/2011 1:00:00', '22/07/2011 1:01:00', '22/07/2011 1:02:00', '22/07/2011 1:03:00', '22/07/2011 1:04:00', '22/07/2011 1:05:00', '22/07/2011 1:06:00', '22/07/2011 1:07:00', '22/07/2011 1:08:00', '22/07/2011 1:09:00', '22/07/2011 1:10:00', '22/07/2011 1:11:00', '22/07/2011 1:12:00', '22/07/2011 1:13:00', '22/07/2011 1:14:00', '22/07/2011 1:15:00', '22/07/2011 1:16:00', '22/07/2011 1:17:00', '22/07/2011 1:18:00', '22/07/2011 1:19:00', '22/07/2011 1:20:00', '22/07/2011 1:21:00', '22/07/2011 1:22:00', '22/07/2011 1:23:00', '22/07/2011 
1:24:00', '22/07/2011 1:25:00', '22/07/2011 1:26:00', '22/07/2011 1:27:00', '22/07/2011 1:28:00', '22/07/2011 1:29:00', '22/07/2011 1:30:00', '22/07/2011 1:31:00', '22/07/2011 1:32:00', '22/07/2011 1:33:00', '22/07/2011 1:34:00', '22/07/2011 1:35:00', '22/07/2011 1:36:00', '22/07/2011 1:37:00', '22/07/2011 1:38:00', '22/07/2011 1:39:00', '22/07/2011 1:40:00', '22/07/2011 1:41:00', '22/07/2011 1:42:00', '22/07/2011 1:43:00', '22/07/2011 1:44:00', '22/07/2011 1:45:00', '22/07/2011 1:46:00', '22/07/2011 1:47:00', '22/07/2011 1:48:00', '22/07/2011 1:49:00', '22/07/2011 1:50:00', '22/07/2011 1:51:00', '22/07/2011 1:52:00', '22/07/2011 1:53:00', '22/07/2011 1:54:00', '22/07/2011 1:55:00', '22/07/2011 1:56:00', '22/07/2011 1:57:00', '22/07/2011 1:58:00', '22/07/2011 1:59:00', '22/07/2011 2:00:00', '22/07/2011 2:01:00', '22/07/2011 2:02:00', '22/07/2011 2:03:00', '22/07/2011 2:04:00', '22/07/2011 2:05:00', '22/07/2011 2:06:00', '22/07/2011 2:07:00', '22/07/2011 2:08:00', '22/07/2011 2:09:00', '22/07/2011 2:10:00', '22/07/2011 2:11:00', '22/07/2011 2:12:00', '22/07/2011 2:13:00', '22/07/2011 2:14:00', '22/07/2011 2:15:00', '22/07/2011 2:16:00', '22/07/2011 2:17:00', '22/07/2011 2:18:00', '22/07/2011 2:19:00', '22/07/2011 2:20:00', '22/07/2011 2:21:00', '22/07/2011 2:22:00', '22/07/2011 2:23:00', '22/07/2011 2:24:00', '22/07/2011 2:25:00', '22/07/2011 2:26:00', '22/07/2011 2:27:00', '22/07/2011 2:28:00', '22/07/2011 2:29:00', '22/07/2011 2:30:00', '22/07/2011 2:31:00', '22/07/2011 2:32:00', '22/07/2011 2:33:00', '22/07/2011 2:34:00', '22/07/2011 2:35:00', '22/07/2011 2:36:00', '22/07/2011 2:37:00', '22/07/2011 2:38:00', '22/07/2011 2:39:00', '22/07/2011 2:40:00', '22/07/2011 2:41:00', '22/07/2011 2:42:00', '22/07/2011 2:43:00', '22/07/2011 2:44:00', '22/07/2011 2:45:00', '22/07/2011 2:46:00', '22/07/2011 2:47:00', '22/07/2011 2:48:00', '22/07/2011 2:49:00', '22/07/2011 2:50:00', '22/07/2011 2:51:00', '22/07/2011 2:52:00', '22/07/2011 2:53:00', '22/07/2011 2:54:00', 
'22/07/2011 2:55:00', '22/07/2011 2:56:00', '22/07/2011 2:57:00', '22/07/2011 2:58:00', '22/07/2011 2:59:00', '22/07/2011 3:00:00', '22/07/2011 3:01:00', '22/07/2011 3:02:00', '22/07/2011 3:03:00', '22/07/2011 3:04:00', '22/07/2011 3:05:00', '22/07/2011 3:06:00', '22/07/2011 3:07:00', '22/07/2011 3:08:00', '22/07/2011 3:09:00', '22/07/2011 3:10:00', '22/07/2011 3:11:00', '22/07/2011 3:12:00', '22/07/2011 3:13:00', '22/07/2011 3:14:00', '22/07/2011 3:15:00', '22/07/2011 3:16:00', '22/07/2011 3:17:00', '22/07/2011 3:18:00', '22/07/2011 3:19:00', '22/07/2011 3:20:00', '22/07/2011 3:21:00', '22/07/2011 3:22:00', '22/07/2011 3:23:00', '22/07/2011 3:24:00', '22/07/2011 3:25:00', '22/07/2011 3:26:00', '22/07/2011 3:27:00', '22/07/2011 3:28:00', '22/07/2011 3:29:00', '22/07/2011 3:30:00', '22/07/2011 3:31:00', '22/07/2011 3:32:00', '22/07/2011 3:33:00', '22/07/2011 3:34:00', '22/07/2011 3:35:00', '22/07/2011 3:36:00', '22/07/2011 3:37:00', '22/07/2011 3:38:00', '22/07/2011 3:39:00', '22/07/2011 3:40:00', '22/07/2011 3:41:00', '22/07/2011 3:42:00', '22/07/2011 3:43:00', '22/07/2011 3:44:00', '22/07/2011 3:45:00', '22/07/2011 3:46:00', '22/07/2011 3:47:00', '22/07/2011 3:48:00', '22/07/2011 3:49:00', '22/07/2011 3:50:00', '22/07/2011 3:51:00', '22/07/2011 3:52:00', '22/07/2011 3:53:00', '22/07/2011 3:54:00', '22/07/2011 3:55:00', '22/07/2011 3:56:00', '22/07/2011 3:57:00', '22/07/2011 3:58:00', '22/07/2011 3:59:00', '22/07/2011 4:00:00', '22/07/2011 4:01:00', '22/07/2011 4:02:00', '22/07/2011 4:03:00', '22/07/2011 4:04:00', '22/07/2011 4:05:00', '22/07/2011 4:06:00', '22/07/2011 4:07:00', '22/07/2011 4:08:00', '22/07/2011 4:09:00', '22/07/2011 4:10:00', '22/07/2011 4:11:00', '22/07/2011 4:12:00', '22/07/2011 4:13:00', '22/07/2011 4:14:00', '22/07/2011 4:15:00', '22/07/2011 4:16:00', '22/07/2011 4:17:00', '22/07/2011 4:18:00', '22/07/2011 4:19:00', '22/07/2011 4:20:00', '22/07/2011 4:21:00', '22/07/2011 4:22:00', '22/07/2011 4:23:00', '22/07/2011 4:24:00', '22/07/2011 
4:25:00', '22/07/2011 4:26:00', '22/07/2011 4:27:00', '22/07/2011 4:28:00', '22/07/2011 4:29:00', '22/07/2011 4:30:00', '22/07/2011 4:31:00', '22/07/2011 4:32:00', '22/07/2011 4:33:00', '22/07/2011 4:34:00', '22/07/2011 4:35:00', '22/07/2011 4:36:00', '22/07/2011 4:37:00', '22/07/2011 4:38:00', '22/07/2011 4:39:00', '22/07/2011 4:40:00', '22/07/2011 4:41:00', '22/07/2011 4:42:00', '22/07/2011 4:43:00', '22/07/2011 4:44:00', '22/07/2011 4:45:00', '22/07/2011 4:46:00', '22/07/2011 4:47:00', '22/07/2011 4:48:00', '22/07/2011 4:49:00', '22/07/2011 4:50:00', '22/07/2011 4:51:00', '22/07/2011 4:52:00', '22/07/2011 4:53:00', '22/07/2011 4:54:00', '22/07/2011 4:55:00', '22/07/2011 4:56:00', '22/07/2011 4:57:00', '22/07/2011 4:58:00', '22/07/2011 4:59:00', '22/07/2011 5:00:00', '22/07/2011 5:01:00', '22/07/2011 5:02:00', '22/07/2011 5:03:00', '22/07/2011 5:04:00', '22/07/2011 5:05:00', '22/07/2011 5:06:00', '22/07/2011 5:07:00', '22/07/2011 5:08:00', '22/07/2011 5:09:00', '22/07/2011 5:10:00', '22/07/2011 5:11:00', '22/07/2011 5:12:00', '22/07/2011 5:13:00', '22/07/2011 5:14:00', '22/07/2011 5:15:00', '22/07/2011 5:16:00', '22/07/2011 5:17:00', '22/07/2011 5:18:00', '22/07/2011 5:19:00', '22/07/2011 5:20:00', '22/07/2011 5:21:00', '22/07/2011 5:22:00', '22/07/2011 5:23:00', '22/07/2011 5:24:00', '22/07/2011 5:25:00', '22/07/2011 5:26:00', '22/07/2011 5:27:00', '22/07/2011 5:28:00', '22/07/2011 5:29:00', '22/07/2011 5:30:00', '22/07/2011 5:31:00', '22/07/2011 5:32:00', '22/07/2011 5:33:00', '22/07/2011 5:34:00', '22/07/2011 5:35:00', '22/07/2011 5:36:00', '22/07/2011 5:37:00', '22/07/2011 5:38:00', '22/07/2011 5:39:00', '22/07/2011 5:40:00', '22/07/2011 5:41:00', '22/07/2011 5:42:00', '22/07/2011 5:43:00', '22/07/2011 5:44:00', '22/07/2011 5:45:00', '22/07/2011 5:46:00', '22/07/2011 5:47:00', '22/07/2011 5:48:00', '22/07/2011 5:49:00', '22/07/2011 5:50:00', '22/07/2011 5:51:00', '22/07/2011 5:52:00', '22/07/2011 5:53:00', '22/07/2011 5:54:00', '22/07/2011 5:55:00', 
'22/07/2011 5:56:00', '22/07/2011 5:57:00', '22/07/2011 5:58:00', '22/07/2011 5:59:00', '22/07/2011 6:00:00', '22/07/2011 6:01:00', '22/07/2011 6:02:00', '22/07/2011 6:03:00', '22/07/2011 6:04:00', '22/07/2011 6:05:00', '22/07/2011 6:06:00', '22/07/2011 6:07:00', '22/07/2011 6:08:00', '22/07/2011 6:09:00', '22/07/2011 6:10:00', '22/07/2011 6:11:00', '22/07/2011 6:12:00', '22/07/2011 6:13:00', '22/07/2011 6:14:00', '22/07/2011 6:15:00', '22/07/2011 6:16:00', '22/07/2011 6:17:00', '22/07/2011 6:18:00', '22/07/2011 6:19:00', '22/07/2011 6:20:00', '22/07/2011 6:21:00', '22/07/2011 6:22:00', '22/07/2011 6:23:00', '22/07/2011 6:24:00', '22/07/2011 6:25:00', '22/07/2011 6:26:00', '22/07/2011 6:27:00', '22/07/2011 6:28:00', '22/07/2011 6:29:00', '22/07/2011 6:30:00', '22/07/2011 6:31:00', '22/07/2011 6:32:00', '22/07/2011 6:33:00', '22/07/2011 6:34:00', '22/07/2011 6:35:00', '22/07/2011 6:36:00', '22/07/2011 6:37:00', '22/07/2011 6:38:00', '22/07/2011 6:39:00', '22/07/2011 6:40:00', '22/07/2011 6:41:00', '22/07/2011 6:42:00', '22/07/2011 6:43:00', '22/07/2011 6:44:00', '22/07/2011 6:45:00', '22/07/2011 6:46:00', '22/07/2011 6:47:00', '22/07/2011 6:48:00', '22/07/2011 6:49:00', '22/07/2011 6:50:00', '22/07/2011 6:51:00', '22/07/2011 6:52:00', '22/07/2011 6:53:00', '22/07/2011 6:54:00', '22/07/2011 6:55:00', '22/07/2011 6:56:00', '22/07/2011 6:57:00', '22/07/2011 6:58:00', '22/07/2011 6:59:00', '22/07/2011 7:00:00', '22/07/2011 7:01:00', '22/07/2011 7:02:00', '22/07/2011 7:03:00', '22/07/2011 7:04:00', '22/07/2011 7:05:00', '22/07/2011 7:06:00', '22/07/2011 7:07:00', '22/07/2011 7:08:00', '22/07/2011 7:09:00', '22/07/2011 7:10:00', '22/07/2011 7:11:00', '22/07/2011 7:12:00', '22/07/2011 7:13:00', '22/07/2011 7:14:00', '22/07/2011 7:15:00', '22/07/2011 7:16:00', '22/07/2011 7:17:00', '22/07/2011 7:18:00', '22/07/2011 7:19:00', '22/07/2011 7:20:00', '22/07/2011 7:21:00', '22/07/2011 7:22:00', '22/07/2011 7:23:00', '22/07/2011 7:24:00', '22/07/2011 7:25:00', '22/07/2011 
7:26:00', '22/07/2011 7:27:00', '22/07/2011 7:28:00', '22/07/2011 7:29:00', '22/07/2011 7:30:00', '22/07/2011 7:31:00', '22/07/2011 7:32:00', '22/07/2011 7:33:00', '22/07/2011 7:34:00', '22/07/2011 7:35:00', '22/07/2011 7:36:00', '22/07/2011 7:37:00', '22/07/2011 7:38:00', '22/07/2011 7:39:00', '22/07/2011 7:40:00', '22/07/2011 7:41:00', '22/07/2011 7:42:00', '22/07/2011 7:43:00', '22/07/2011 7:44:00', '22/07/2011 7:45:00', '22/07/2011 7:46:00', '22/07/2011 7:47:00', '22/07/2011 7:48:00', '22/07/2011 7:49:00', '22/07/2011 7:50:00', '22/07/2011 7:51:00', '22/07/2011 7:52:00', '22/07/2011 7:53:00', '22/07/2011 7:54:00', '22/07/2011 7:55:00', '22/07/2011 7:56:00', '22/07/2011 7:57:00', '22/07/2011 7:58:00', '22/07/2011 7:59:00', '22/07/2011 8:00:00', '22/07/2011 8:01:00', '22/07/2011 8:02:00', '22/07/2011 8:03:00', '22/07/2011 8:04:00', '22/07/2011 8:05:00', '22/07/2011 8:06:00', '22/07/2011 8:07:00', '22/07/2011 8:08:00', '22/07/2011 8:09:00', '22/07/2011 8:10:00', '22/07/2011 8:11:00', '22/07/2011 8:12:00', '22/07/2011 8:13:00', '22/07/2011 8:14:00', '22/07/2011 8:15:00', '22/07/2011 8:16:00', '22/07/2011 8:17:00', '22/07/2011 8:18:00', '22/07/2011 8:19:00', '22/07/2011 8:20:00', '22/07/2011 8:21:00', '22/07/2011 8:22:00', '22/07/2011 8:23:00', '22/07/2011 8:24:00', '22/07/2011 8:25:00', '22/07/2011 8:26:00', '22/07/2011 8:27:00', '22/07/2011 8:28:00', '22/07/2011 8:29:00', '22/07/2011 8:30:00', '22/07/2011 8:31:00', '22/07/2011 8:32:00', '22/07/2011 8:33:00', '22/07/2011 8:34:00', '22/07/2011 8:35:00', '22/07/2011 8:36:00', '22/07/2011 8:37:00', '22/07/2011 8:38:00', '22/07/2011 8:39:00', '22/07/2011 8:40:00', '22/07/2011 8:41:00', '22/07/2011 8:42:00', '22/07/2011 8:43:00', '22/07/2011 8:44:00', '22/07/2011 8:45:00', '22/07/2011 8:46:00', '22/07/2011 8:47:00', '22/07/2011 8:48:00', '22/07/2011 8:49:00', '22/07/2011 8:50:00', '22/07/2011 8:51:00', '22/07/2011 8:52:00', '22/07/2011 8:53:00', '22/07/2011 8:54:00', '22/07/2011 8:55:00', '22/07/2011 8:56:00', 
'22/07/2011 8:57:00', '22/07/2011 8:58:00', '22/07/2011 8:59:00', '22/07/2011 9:00:00', '22/07/2011 9:01:00', '22/07/2011 9:02:00', '22/07/2011 9:03:00', '22/07/2011 9:04:00', '22/07/2011 9:05:00', '22/07/2011 9:06:00', '22/07/2011 9:07:00', '22/07/2011 9:08:00', '22/07/2011 9:09:00', '22/07/2011 9:10:00', '22/07/2011 9:11:00', '22/07/2011 9:12:00', '22/07/2011 9:13:00', '22/07/2011 9:14:00', '22/07/2011 9:15:00', '22/07/2011 9:16:00', '22/07/2011 9:17:00', '22/07/2011 9:18:00', '22/07/2011 9:19:00', '22/07/2011 9:20:00', '22/07/2011 9:21:00', '22/07/2011 9:22:00', '22/07/2011 9:23:00', '22/07/2011 9:24:00', '22/07/2011 9:25:00', '22/07/2011 9:26:00', '22/07/2011 9:27:00', '22/07/2011 9:28:00', '22/07/2011 9:29:00', '22/07/2011 9:30:00', '22/07/2011 9:31:00', '22/07/2011 9:32:00', '22/07/2011 9:33:00', '22/07/2011 9:34:00', '22/07/2011 9:35:00', '22/07/2011 9:36:00', '22/07/2011 9:37:00', '22/07/2011 9:38:00', '22/07/2011 9:39:00', '22/07/2011 9:40:00', '22/07/2011 9:41:00', '22/07/2011 9:42:00', '22/07/2011 9:43:00', '22/07/2011 9:44:00', '22/07/2011 9:45:00', '22/07/2011 9:46:00', '22/07/2011 9:47:00', '22/07/2011 9:48:00', '22/07/2011 9:49:00', '22/07/2011 9:50:00', '22/07/2011 9:51:00', '22/07/2011 9:52:00', '22/07/2011 9:53:00', '22/07/2011 9:54:00', '22/07/2011 9:55:00', '22/07/2011 9:56:00', '22/07/2011 9:57:00', '22/07/2011 9:58:00', '22/07/2011 9:59:00', '22/07/2011 10:00:00', '22/07/2011 10:01:00', '22/07/2011 10:02:00', '22/07/2011 10:03:00', '22/07/2011 10:04:00', '22/07/2011 10:05:00', '22/07/2011 10:06:00', '22/07/2011 10:07:00', '22/07/2011 10:08:00', '22/07/2011 10:09:00', '22/07/2011 10:10:00', '22/07/2011 10:11:00', '22/07/2011 10:12:00', '22/07/2011 10:13:00', '22/07/2011 10:14:00', '22/07/2011 10:15:00', '22/07/2011 10:16:00', '22/07/2011 10:17:00', '22/07/2011 10:18:00', '22/07/2011 10:19:00', '22/07/2011 10:20:00', '22/07/2011 10:21:00', '22/07/2011 10:22:00', '22/07/2011 10:23:00', '22/07/2011 10:24:00', '22/07/2011 10:25:00', '22/07/2011 
10:26:00', '22/07/2011 10:27:00', '22/07/2011 10:28:00', '22/07/2011 10:29:00', '22/07/2011 10:30:00', '22/07/2011 10:31:00', '22/07/2011 10:32:00', '22/07/2011 10:33:00', '22/07/2011 10:34:00', '22/07/2011 10:35:00', '22/07/2011 10:36:00', '22/07/2011 10:37:00', '22/07/2011 10:38:00', '22/07/2011 10:39:00', '22/07/2011 10:40:00', '22/07/2011 10:41:00', '22/07/2011 10:42:00', '22/07/2011 10:43:00', '22/07/2011 10:44:00', '22/07/2011 10:45:00', '22/07/2011 10:46:00', '22/07/2011 10:47:00', '22/07/2011 10:48:00', '22/07/2011 10:49:00', '22/07/2011 10:50:00', '22/07/2011 10:51:00', '22/07/2011 10:52:00', '22/07/2011 10:53:00', '22/07/2011 10:54:00', '22/07/2011 10:55:00', '22/07/2011 10:56:00', '22/07/2011 10:57:00', '22/07/2011 10:58:00', '22/07/2011 10:59:00', '22/07/2011 11:00:00', '22/07/2011 11:01:00', '22/07/2011 11:02:00', '22/07/2011 11:03:00', '22/07/2011 11:04:00', '22/07/2011 11:05:00', '22/07/2011 11:06:00', '22/07/2011 11:07:00', '22/07/2011 11:08:00', '22/07/2011 11:09:00', '22/07/2011 11:10:00', '22/07/2011 11:11:00', '22/07/2011 11:12:00', '22/07/2011 11:13:00', '22/07/2011 11:14:00', '22/07/2011 11:15:00', '22/07/2011 11:16:00', '22/07/2011 11:17:00', '22/07/2011 11:18:00', '22/07/2011 11:19:00', '22/07/2011 11:20:00', '22/07/2011 11:21:00', '22/07/2011 11:22:00', '22/07/2011 11:23:00', '22/07/2011 11:24:00', '22/07/2011 11:25:00', '22/07/2011 11:26:00', '22/07/2011 11:27:00', '22/07/2011 11:28:00', '22/07/2011 11:29:00', '22/07/2011 11:30:00', '22/07/2011 11:31:00', '22/07/2011 11:32:00', '22/07/2011 11:33:00', '22/07/2011 11:34:00', '22/07/2011 11:35:00', '22/07/2011 11:36:00', '22/07/2011 11:37:00', '22/07/2011 11:38:00', '22/07/2011 11:39:00', '22/07/2011 11:40:00', '22/07/2011 11:41:00', '22/07/2011 11:42:00', '22/07/2011 11:43:00', '22/07/2011 11:44:00', '22/07/2011 11:45:00', '22/07/2011 11:46:00', '22/07/2011 11:47:00', '22/07/2011 11:48:00', '22/07/2011 11:49:00', '22/07/2011 11:50:00', '22/07/2011 11:51:00', '22/07/2011 11:52:00', 
'22/07/2011 11:53:00', '22/07/2011 11:54:00', '22/07/2011 11:55:00', '22/07/2011 11:56:00', '22/07/2011 11:57:00', '22/07/2011 11:58:00', '22/07/2011 11:59:00', '22/07/2011 12:00:00', '22/07/2011 12:01:00', '22/07/2011 12:02:00', '22/07/2011 12:03:00', '22/07/2011 12:04:00', '22/07/2011 12:05:00', '22/07/2011 12:06:00', '22/07/2011 12:07:00', '22/07/2011 12:08:00', '22/07/2011 12:09:00', '22/07/2011 12:10:00', '22/07/2011 12:11:00', '22/07/2011 12:12:00', '22/07/2011 12:13:00', '22/07/2011 12:14:00', '22/07/2011 12:15:00', '22/07/2011 12:16:00', '22/07/2011 12:17:00', '22/07/2011 12:18:00', '22/07/2011 12:19:00', '22/07/2011 12:20:00', '22/07/2011 12:21:00', '22/07/2011 12:22:00', '22/07/2011 12:23:00', '22/07/2011 12:24:00', '22/07/2011 12:25:00', '22/07/2011 12:26:00', '22/07/2011 12:27:00', '22/07/2011 12:28:00', '22/07/2011 12:29:00', '22/07/2011 12:30:00', '22/07/2011 12:31:00', '22/07/2011 12:32:00', '22/07/2011 12:33:00', '22/07/2011 12:34:00', '22/07/2011 12:35:00', '22/07/2011 12:36:00', '22/07/2011 12:37:00', '22/07/2011 12:38:00', '22/07/2011 12:39:00', '22/07/2011 12:40:00', '22/07/2011 12:41:00', '22/07/2011 12:42:00', '22/07/2011 12:43:00', '22/07/2011 12:44:00', '22/07/2011 12:45:00', '22/07/2011 12:46:00', '22/07/2011 12:47:00', '22/07/2011 12:48:00', '22/07/2011 12:49:00', '22/07/2011 12:50:00', '22/07/2011 12:51:00', '22/07/2011 12:52:00', '22/07/2011 12:53:00', '22/07/2011 12:54:00', '22/07/2011 12:55:00', '22/07/2011 12:56:00', '22/07/2011 12:57:00', '22/07/2011 12:58:00', '22/07/2011 12:59:00', '22/07/2011 13:00:00', '22/07/2011 13:01:00', '22/07/2011 13:02:00', '22/07/2011 13:03:00', '22/07/2011 13:04:00', '22/07/2011 13:05:00', '22/07/2011 13:06:00', '22/07/2011 13:07:00', '22/07/2011 13:08:00', '22/07/2011 13:09:00', '22/07/2011 13:10:00', '22/07/2011 13:11:00', '22/07/2011 13:12:00', '22/07/2011 13:13:00', '22/07/2011 13:14:00', '22/07/2011 13:15:00', '22/07/2011 13:16:00', '22/07/2011 13:17:00', '22/07/2011 13:18:00', '22/07/2011 
13:19:00', '22/07/2011 13:20:00', '22/07/2011 13:21:00', '22/07/2011 13:22:00', '22/07/2011 13:23:00', '22/07/2011 13:24:00', '22/07/2011 13:25:00', '22/07/2011 13:26:00', '22/07/2011 13:27:00', '22/07/2011 13:28:00', '22/07/2011 13:29:00', '22/07/2011 13:30:00', '22/07/2011 13:31:00', '22/07/2011 13:32:00', '22/07/2011 13:33:00', '22/07/2011 13:34:00', '22/07/2011 13:35:00', '22/07/2011 13:36:00', '22/07/2011 13:37:00', '22/07/2011 13:38:00', '22/07/2011 13:39:00', '22/07/2011 13:40:00', '22/07/2011 13:41:00', '22/07/2011 13:42:00', '22/07/2011 13:43:00', '22/07/2011 13:44:00', '22/07/2011 13:45:00', '22/07/2011 13:46:00', '22/07/2011 13:47:00', '22/07/2011 13:48:00', '22/07/2011 13:49:00', '22/07/2011 13:50:00', '22/07/2011 13:51:00', '22/07/2011 13:52:00', '22/07/2011 13:53:00', '22/07/2011 13:54:00', '22/07/2011 13:55:00', '22/07/2011 13:56:00', '22/07/2011 13:57:00', '22/07/2011 13:58:00', '22/07/2011 13:59:00', '22/07/2011 14:00:00', '22/07/2011 14:01:00', '22/07/2011 14:02:00', '22/07/2011 14:03:00', '22/07/2011 14:04:00', '22/07/2011 14:05:00', '22/07/2011 14:06:00', '22/07/2011 14:07:00', '22/07/2011 14:08:00', '22/07/2011 14:09:00', '22/07/2011 14:10:00', '22/07/2011 14:11:00', '22/07/2011 14:12:00', '22/07/2011 14:13:00', '22/07/2011 14:14:00', '22/07/2011 14:15:00', '22/07/2011 14:16:00', '22/07/2011 14:17:00', '22/07/2011 14:18:00', '22/07/2011 14:19:00', '22/07/2011 14:20:00', '22/07/2011 14:21:00', '22/07/2011 14:22:00', '22/07/2011 14:23:00', '22/07/2011 14:24:00', '22/07/2011 14:25:00', '22/07/2011 14:26:00', '22/07/2011 14:27:00', '22/07/2011 14:28:00', '22/07/2011 14:29:00', '22/07/2011 14:30:00', '22/07/2011 14:31:00', '22/07/2011 14:32:00', '22/07/2011 14:33:00', '22/07/2011 14:34:00', '22/07/2011 14:35:00', '22/07/2011 14:36:00', '22/07/2011 14:37:00', '22/07/2011 14:38:00', '22/07/2011 14:39:00', '22/07/2011 14:40:00', '22/07/2011 14:41:00', '22/07/2011 14:42:00', '22/07/2011 14:43:00', '22/07/2011 14:44:00', '22/07/2011 14:45:00', 
'22/07/2011 14:46:00', '22/07/2011 14:47:00', '22/07/2011 14:48:00', '22/07/2011 14:49:00', '22/07/2011 14:50:00', '22/07/2011 14:51:00', '22/07/2011 14:52:00', '22/07/2011 14:53:00', '22/07/2011 14:54:00', '22/07/2011 14:55:00', '22/07/2011 14:56:00', '22/07/2011 14:57:00', '22/07/2011 14:58:00', '22/07/2011 14:59:00', '22/07/2011 15:00:00', '22/07/2011 15:01:00', '22/07/2011 15:02:00', '22/07/2011 15:03:00', '22/07/2011 15:04:00', '22/07/2011 15:05:00', '22/07/2011 15:06:00', '22/07/2011 15:07:00', '22/07/2011 15:08:00', '22/07/2011 15:09:00', '22/07/2011 15:10:00', '22/07/2011 15:11:00', '22/07/2011 15:12:00', '22/07/2011 15:13:00', '22/07/2011 15:14:00', '22/07/2011 15:15:00', '22/07/2011 15:16:00', '22/07/2011 15:17:00', '22/07/2011 15:18:00', '22/07/2011 15:19:00', '22/07/2011 15:20:00', '22/07/2011 15:21:00', '22/07/2011 15:22:00', '22/07/2011 15:23:00', '22/07/2011 15:24:00', '22/07/2011 15:25:00', '22/07/2011 15:26:00', '22/07/2011 15:27:00', '22/07/2011 15:28:00', '22/07/2011 15:29:00', '22/07/2011 15:30:00', '22/07/2011 15:31:00', '22/07/2011 15:32:00', '22/07/2011 15:33:00', '22/07/2011 15:34:00', '22/07/2011 15:35:00', '22/07/2011 15:36:00', '22/07/2011 15:37:00', '22/07/2011 15:38:00', '22/07/2011 15:39:00', '22/07/2011 15:40:00', '22/07/2011 15:41:00', '22/07/2011 15:42:00', '22/07/2011 15:43:00', '22/07/2011 15:44:00', '22/07/2011 15:45:00', '22/07/2011 15:46:00', '22/07/2011 15:47:00', '22/07/2011 15:48:00', '22/07/2011 15:49:00', '22/07/2011 15:50:00', '22/07/2011 15:51:00', '22/07/2011 15:52:00', '22/07/2011 15:53:00', '22/07/2011 15:54:00', '22/07/2011 15:55:00', '22/07/2011 15:56:00', '22/07/2011 15:57:00', '22/07/2011 15:58:00', '22/07/2011 15:59:00', '22/07/2011 16:00:00', '22/07/2011 16:01:00', '22/07/2011 16:02:00', '22/07/2011 16:03:00', '22/07/2011 16:04:00', '22/07/2011 16:05:00', '22/07/2011 16:06:00', '22/07/2011 16:07:00', '22/07/2011 16:08:00', '22/07/2011 16:09:00', '22/07/2011 16:10:00', '22/07/2011 16:11:00', '22/07/2011 
16:12:00', '22/07/2011 16:13:00', '22/07/2011 16:14:00', '22/07/2011 16:15:00', '22/07/2011 16:16:00', '22/07/2011 16:17:00', '22/07/2011 16:18:00', '22/07/2011 16:19:00', '22/07/2011 16:20:00', '22/07/2011 16:21:00', '22/07/2011 16:22:00', '22/07/2011 16:23:00', '22/07/2011 16:24:00', '22/07/2011 16:25:00', '22/07/2011 16:26:00', '22/07/2011 16:27:00', '22/07/2011 16:28:00', '22/07/2011 16:29:00', '22/07/2011 16:30:00', '22/07/2011 16:31:00', '22/07/2011 16:32:00', '22/07/2011 16:33:00', '22/07/2011 16:34:00', '22/07/2011 16:35:00', '22/07/2011 16:36:00', '22/07/2011 16:37:00', '22/07/2011 16:38:00', '22/07/2011 16:39:00', '22/07/2011 16:40:00', '22/07/2011 16:41:00', '22/07/2011 16:42:00', '22/07/2011 16:43:00', '22/07/2011 16:44:00', '22/07/2011 16:45:00', '22/07/2011 16:46:00', '22/07/2011 16:47:00', '22/07/2011 16:48:00', '22/07/2011 16:49:00', '22/07/2011 16:50:00', '22/07/2011 16:51:00', '22/07/2011 16:52:00', '22/07/2011 16:53:00', '22/07/2011 16:54:00', '22/07/2011 16:55:00', '22/07/2011 16:56:00', '22/07/2011 16:57:00', '22/07/2011 16:58:00', '22/07/2011 16:59:00', '22/07/2011 17:00:00', '22/07/2011 17:01:00', '22/07/2011 17:02:00', '22/07/2011 17:03:00', '22/07/2011 17:04:00', '22/07/2011 17:05:00', '22/07/2011 17:06:00', '22/07/2011 17:07:00', '22/07/2011 17:08:00', '22/07/2011 17:09:00', '22/07/2011 17:10:00', '22/07/2011 17:11:00', '22/07/2011 17:12:00', '22/07/2011 17:13:00', '22/07/2011 17:14:00', '22/07/2011 17:15:00', '22/07/2011 17:16:00', '22/07/2011 17:17:00', '22/07/2011 17:18:00', '22/07/2011 17:19:00', '22/07/2011 17:20:00', '22/07/2011 17:21:00', '22/07/2011 17:22:00', '22/07/2011 17:23:00', '22/07/2011 17:24:00', '22/07/2011 17:25:00', '22/07/2011 17:26:00', '22/07/2011 17:27:00', '22/07/2011 17:28:00', '22/07/2011 17:29:00', '22/07/2011 17:30:00', '22/07/2011 17:31:00', '22/07/2011 17:32:00', '22/07/2011 17:33:00', '22/07/2011 17:34:00', '22/07/2011 17:35:00', '22/07/2011 17:36:00', '22/07/2011 17:37:00', '22/07/2011 17:38:00', 
'22/07/2011 17:39:00', '22/07/2011 17:40:00', '22/07/2011 17:41:00', '22/07/2011 17:42:00', '22/07/2011 17:43:00', '22/07/2011 17:44:00', '22/07/2011 17:45:00', '22/07/2011 17:46:00', '22/07/2011 17:47:00', '22/07/2011 17:48:00', '22/07/2011 17:49:00', '22/07/2011 17:50:00', '22/07/2011 17:51:00', '22/07/2011 17:52:00', '22/07/2011 17:53:00', '22/07/2011 17:54:00', '22/07/2011 17:55:00', '22/07/2011 17:56:00', '22/07/2011 17:57:00', '22/07/2011 17:58:00', '22/07/2011 17:59:00', '22/07/2011 18:00:00', '22/07/2011 18:01:00', '22/07/2011 18:02:00', '22/07/2011 18:03:00', '22/07/2011 18:04:00', '22/07/2011 18:05:00', '22/07/2011 18:06:00', '22/07/2011 18:07:00', '22/07/2011 18:08:00', '22/07/2011 18:09:00', '22/07/2011 18:10:00', '22/07/2011 18:11:00', '22/07/2011 18:12:00', '22/07/2011 18:13:00', '22/07/2011 18:14:00', '22/07/2011 18:15:00', '22/07/2011 18:16:00', '22/07/2011 18:17:00', '22/07/2011 18:18:00', '22/07/2011 18:19:00', '22/07/2011 18:20:00', '22/07/2011 18:21:00', '22/07/2011 18:22:00', '22/07/2011 18:23:00', '22/07/2011 18:24:00', '22/07/2011 18:25:00', '22/07/2011 18:26:00', '22/07/2011 18:27:00', '22/07/2011 18:28:00', '22/07/2011 18:29:00', '22/07/2011 18:30:00', '22/07/2011 18:31:00', '22/07/2011 18:32:00', '22/07/2011 18:33:00', '22/07/2011 18:34:00', '22/07/2011 18:35:00', '22/07/2011 18:36:00', '22/07/2011 18:37:00', '22/07/2011 18:38:00', '22/07/2011 18:39:00', '22/07/2011 18:40:00', '22/07/2011 18:41:00', '22/07/2011 18:42:00', '22/07/2011 18:43:00', '22/07/2011 18:44:00', '22/07/2011 18:45:00', '22/07/2011 18:46:00', '22/07/2011 18:47:00', '22/07/2011 18:48:00', '22/07/2011 18:49:00', '22/07/2011 18:50:00', '22/07/2011 18:51:00', '22/07/2011 18:52:00', '22/07/2011 18:53:00', '22/07/2011 18:54:00', '22/07/2011 18:55:00', '22/07/2011 18:56:00', '22/07/2011 18:57:00', '22/07/2011 18:58:00', '22/07/2011 18:59:00', '22/07/2011 19:00:00', '22/07/2011 19:01:00', '22/07/2011 19:02:00', '22/07/2011 19:03:00', '22/07/2011 19:04:00', '22/07/2011 
19:05:00', '22/07/2011 19:06:00', '22/07/2011 19:07:00', '22/07/2011 19:08:00', '22/07/2011 19:09:00', '22/07/2011 19:10:00', '22/07/2011 19:11:00', '22/07/2011 19:12:00', '22/07/2011 19:13:00', '22/07/2011 19:14:00', '22/07/2011 19:15:00', '22/07/2011 19:16:00', '22/07/2011 19:17:00', '22/07/2011 19:18:00', '22/07/2011 19:19:00', '22/07/2011 19:20:00', '22/07/2011 19:21:00', '22/07/2011 19:22:00', '22/07/2011 19:23:00', '22/07/2011 19:24:00', '22/07/2011 19:25:00', '22/07/2011 19:26:00', '22/07/2011 19:27:00', '22/07/2011 19:28:00', '22/07/2011 19:29:00', '22/07/2011 19:30:00', '22/07/2011 19:31:00', '22/07/2011 19:32:00', '22/07/2011 19:33:00', '22/07/2011 19:34:00', '22/07/2011 19:35:00', '22/07/2011 19:36:00', '22/07/2011 19:37:00', '22/07/2011 19:38:00', '22/07/2011 19:39:00', '22/07/2011 19:40:00', '22/07/2011 19:41:00', '22/07/2011 19:42:00', '22/07/2011 19:43:00', '22/07/2011 19:44:00', '22/07/2011 19:45:00', '22/07/2011 19:46:00', '22/07/2011 19:47:00', '22/07/2011 19:48:00', '22/07/2011 19:49:00', '22/07/2011 19:50:00', '22/07/2011 19:51:00', '22/07/2011 19:52:00', '22/07/2011 19:53:00', '22/07/2011 19:54:00', '22/07/2011 19:55:00', '22/07/2011 19:56:00', '22/07/2011 19:57:00', '22/07/2011 19:58:00', '22/07/2011 19:59:00', '22/07/2011 20:00:00', '22/07/2011 20:01:00', '22/07/2011 20:02:00', '22/07/2011 20:03:00', '22/07/2011 20:04:00', '22/07/2011 20:05:00', '22/07/2011 20:06:00', '22/07/2011 20:07:00', '22/07/2011 20:08:00', '22/07/2011 20:09:00', '22/07/2011 20:10:00', '22/07/2011 20:11:00', '22/07/2011 20:12:00', '22/07/2011 20:13:00', '22/07/2011 20:14:00', '22/07/2011 20:15:00', '22/07/2011 20:16:00', '22/07/2011 20:17:00', '22/07/2011 20:18:00', '22/07/2011 20:19:00', '22/07/2011 20:20:00', '22/07/2011 20:21:00', '22/07/2011 20:22:00', '22/07/2011 20:23:00', '22/07/2011 20:24:00', '22/07/2011 20:25:00', '22/07/2011 20:26:00', '22/07/2011 20:27:00', '22/07/2011 20:28:00', '22/07/2011 20:29:00', '22/07/2011 20:30:00', '22/07/2011 20:31:00', 
'22/07/2011 20:32:00', '22/07/2011 20:33:00', '22/07/2011 20:34:00', '22/07/2011 20:35:00', '22/07/2011 20:36:00', '22/07/2011 20:37:00', '22/07/2011 20:38:00', '22/07/2011 20:39:00', '22/07/2011 20:40:00', '22/07/2011 20:41:00', '22/07/2011 20:42:00', '22/07/2011 20:43:00', '22/07/2011 20:44:00', '22/07/2011 20:45:00', '22/07/2011 20:46:00', '22/07/2011 20:47:00', '22/07/2011 20:48:00', '22/07/2011 20:49:00', '22/07/2011 20:50:00', '22/07/2011 20:51:00', '22/07/2011 20:52:00', '22/07/2011 20:53:00', '22/07/2011 20:54:00', '22/07/2011 20:55:00', '22/07/2011 20:56:00', '22/07/2011 20:57:00', '22/07/2011 20:58:00', '22/07/2011 20:59:00', '22/07/2011 21:00:00', '22/07/2011 21:01:00', '22/07/2011 21:02:00', '22/07/2011 21:03:00', '22/07/2011 21:04:00', '22/07/2011 21:05:00', '22/07/2011 21:06:00', '22/07/2011 21:07:00', '22/07/2011 21:08:00', '22/07/2011 21:09:00', '22/07/2011 21:10:00', '22/07/2011 21:11:00', '22/07/2011 21:12:00', '22/07/2011 21:13:00', '22/07/2011 21:14:00', '22/07/2011 21:15:00', '22/07/2011 21:16:00', '22/07/2011 21:17:00', '22/07/2011 21:18:00', '22/07/2011 21:19:00', '22/07/2011 21:20:00', '22/07/2011 21:21:00', '22/07/2011 21:22:00', '22/07/2011 21:23:00', '22/07/2011 21:24:00', '22/07/2011 21:25:00', '22/07/2011 21:26:00', '22/07/2011 21:27:00', '22/07/2011 21:28:00', '22/07/2011 21:29:00', '22/07/2011 21:30:00', '22/07/2011 21:31:00', '22/07/2011 21:32:00', '22/07/2011 21:33:00', '22/07/2011 21:34:00', '22/07/2011 21:35:00', '22/07/2011 21:36:00', '22/07/2011 21:37:00', '22/07/2011 21:38:00', '22/07/2011 21:39:00', '22/07/2011 21:40:00', '22/07/2011 21:41:00', '22/07/2011 21:42:00', '22/07/2011 21:43:00', '22/07/2011 21:44:00', '22/07/2011 21:45:00', '22/07/2011 21:46:00', '22/07/2011 21:47:00', '22/07/2011 21:48:00', '22/07/2011 21:49:00', '22/07/2011 21:50:00', '22/07/2011 21:51:00', '22/07/2011 21:52:00', '22/07/2011 21:53:00', '22/07/2011 21:54:00', '22/07/2011 21:55:00', '22/07/2011 21:56:00', '22/07/2011 21:57:00', '22/07/2011 
21:58:00', '22/07/2011 21:59:00', '22/07/2011 22:00:00', '22/07/2011 22:01:00', '22/07/2011 22:02:00', '22/07/2011 22:03:00', '22/07/2011 22:04:00', '22/07/2011 22:05:00', '22/07/2011 22:06:00', '22/07/2011 22:07:00', '22/07/2011 22:08:00', '22/07/2011 22:09:00', '22/07/2011 22:10:00', '22/07/2011 22:11:00', '22/07/2011 22:12:00', '22/07/2011 22:13:00', '22/07/2011 22:14:00', '22/07/2011 22:15:00', '22/07/2011 22:16:00', '22/07/2011 22:17:00', '22/07/2011 22:18:00', '22/07/2011 22:19:00', '22/07/2011 22:20:00', '22/07/2011 22:21:00', '22/07/2011 22:22:00', '22/07/2011 22:23:00', '22/07/2011 22:24:00', '22/07/2011 22:25:00', '22/07/2011 22:26:00', '22/07/2011 22:27:00', '22/07/2011 22:28:00', '22/07/2011 22:29:00', '22/07/2011 22:30:00', '22/07/2011 22:31:00', '22/07/2011 22:32:00', '22/07/2011 22:33:00', '22/07/2011 22:34:00', '22/07/2011 22:35:00', '22/07/2011 22:36:00', '22/07/2011 22:37:00', '22/07/2011 22:38:00', '22/07/2011 22:39:00', '22/07/2011 22:40:00', '22/07/2011 22:41:00', '22/07/2011 22:42:00', '22/07/2011 22:43:00', '22/07/2011 22:44:00', '22/07/2011 22:45:00', '22/07/2011 22:46:00', '22/07/2011 22:47:00', '22/07/2011 22:48:00', '22/07/2011 22:49:00', '22/07/2011 22:50:00', '22/07/2011 22:51:00', '22/07/2011 22:52:00', '22/07/2011 22:53:00', '22/07/2011 22:54:00', '22/07/2011 22:55:00', '22/07/2011 22:56:00', '22/07/2011 22:57:00', '22/07/2011 22:58:00', '22/07/2011 22:59:00', '22/07/2011 23:00:00', '22/07/2011 23:01:00', '22/07/2011 23:02:00', '22/07/2011 23:03:00', '22/07/2011 23:04:00', '22/07/2011 23:05:00', '22/07/2011 23:06:00', '22/07/2011 23:07:00', '22/07/2011 23:08:00', '22/07/2011 23:09:00', '22/07/2011 23:10:00', '22/07/2011 23:11:00', '22/07/2011 23:12:00', '22/07/2011 23:13:00', '22/07/2011 23:14:00', '22/07/2011 23:15:00', '22/07/2011 23:16:00', '22/07/2011 23:17:00', '22/07/2011 23:18:00', '22/07/2011 23:19:00', '22/07/2011 23:20:00', '22/07/2011 23:21:00', '22/07/2011 23:22:00', '22/07/2011 23:23:00', '22/07/2011 23:24:00', 
'22/07/2011 23:25:00', '22/07/2011 23:26:00', '22/07/2011 23:27:00', '22/07/2011 23:28:00', '22/07/2011 23:29:00', '22/07/2011 23:30:00', '22/07/2011 23:31:00', '22/07/2011 23:32:00', '22/07/2011 23:33:00', '22/07/2011 23:34:00', '22/07/2011 23:35:00', '22/07/2011 23:36:00', '22/07/2011 23:37:00', '22/07/2011 23:38:00', '22/07/2011 23:39:00', '22/07/2011 23:40:00', '22/07/2011 23:41:00', '22/07/2011 23:42:00', '22/07/2011 23:43:00', '22/07/2011 23:44:00', '22/07/2011 23:45:00', '22/07/2011 23:46:00', '22/07/2011 23:47:00', '22/07/2011 23:48:00', '22/07/2011 23:49:00', '22/07/2011 23:50:00', '22/07/2011 23:51:00', '22/07/2011 23:52:00', '22/07/2011 23:53:00', '22/07/2011 23:54:00', '22/07/2011 23:55:00', '22/07/2011 23:56:00', '22/07/2011 23:57:00', '22/07/2011 23:58:00', '22/07/2011 23:59:00', '23/07/2011 0:00:00', '23/07/2011 0:01:00', '23/07/2011 0:02:00', '23/07/2011 0:03:00', '23/07/2011 0:04:00', '23/07/2011 0:05:00', '23/07/2011 0:06:00', '23/07/2011 0:07:00', '23/07/2011 0:08:00', '23/07/2011 0:09:00', '23/07/2011 0:10:00', '23/07/2011 0:11:00', '23/07/2011 0:12:00', '23/07/2011 0:13:00', '23/07/2011 0:14:00', '23/07/2011 0:15:00', '23/07/2011 0:16:00', '23/07/2011 0:17:00', '23/07/2011 0:18:00', '23/07/2011 0:19:00', '23/07/2011 0:20:00', '23/07/2011 0:21:00', '23/07/2011 0:22:00', '23/07/2011 0:23:00', '23/07/2011 0:24:00', '23/07/2011 0:25:00', '23/07/2011 0:26:00', '23/07/2011 0:27:00', '23/07/2011 0:28:00', '23/07/2011 0:29:00', '23/07/2011 0:30:00', '23/07/2011 0:31:00', '23/07/2011 0:32:00', '23/07/2011 0:33:00', '23/07/2011 0:34:00', '23/07/2011 0:35:00', '23/07/2011 0:36:00', '23/07/2011 0:37:00', '23/07/2011 0:38:00', '23/07/2011 0:39:00', '23/07/2011 0:40:00', '23/07/2011 0:41:00', '23/07/2011 0:42:00', '23/07/2011 0:43:00', '23/07/2011 0:44:00', '23/07/2011 0:45:00', '23/07/2011 0:46:00', '23/07/2011 0:47:00', '23/07/2011 0:48:00', '23/07/2011 0:49:00', '23/07/2011 0:50:00', '23/07/2011 0:51:00', '23/07/2011 0:52:00', '23/07/2011 0:53:00', 
'23/07/2011 0:54:00', '23/07/2011 0:55:00', '23/07/2011 0:56:00', '23/07/2011 0:57:00', '23/07/2011 0:58:00', '23/07/2011 0:59:00', '23/07/2011 1:00:00', '23/07/2011 1:01:00', '23/07/2011 1:02:00', '23/07/2011 1:03:00', '23/07/2011 1:04:00', '23/07/2011 1:05:00', '23/07/2011 1:06:00', '23/07/2011 1:07:00', '23/07/2011 1:08:00', '23/07/2011 1:09:00', '23/07/2011 1:10:00', '23/07/2011 1:11:00', '23/07/2011 1:12:00', '23/07/2011 1:13:00', '23/07/2011 1:14:00', '23/07/2011 1:15:00', '23/07/2011 1:16:00', '23/07/2011 1:17:00', '23/07/2011 1:18:00', '23/07/2011 1:19:00', '23/07/2011 1:20:00', '23/07/2011 1:21:00', '23/07/2011 1:22:00', '23/07/2011 1:23:00', '23/07/2011 1:24:00', '23/07/2011 1:25:00', '23/07/2011 1:26:00', '23/07/2011 1:27:00', '23/07/2011 1:28:00', '23/07/2011 1:29:00', '23/07/2011 1:30:00', '23/07/2011 1:31:00', '23/07/2011 1:32:00', '23/07/2011 1:33:00', '23/07/2011 1:34:00', '23/07/2011 1:35:00', '23/07/2011 1:36:00', '23/07/2011 1:37:00', '23/07/2011 1:38:00', '23/07/2011 1:39:00', '23/07/2011 1:40:00', '23/07/2011 1:41:00', '23/07/2011 1:42:00', '23/07/2011 1:43:00', '23/07/2011 1:44:00', '23/07/2011 1:45:00', '23/07/2011 1:46:00', '23/07/2011 1:47:00', '23/07/2011 1:48:00', '23/07/2011 1:49:00', '23/07/2011 1:50:00', '23/07/2011 1:51:00', '23/07/2011 1:52:00', '23/07/2011 1:53:00', '23/07/2011 1:54:00', '23/07/2011 1:55:00', '23/07/2011 1:56:00', '23/07/2011 1:57:00', '23/07/2011 1:58:00', '23/07/2011 1:59:00', '23/07/2011 2:00:00', '23/07/2011 2:01:00', '23/07/2011 2:02:00', '23/07/2011 2:03:00', '23/07/2011 2:04:00', '23/07/2011 2:05:00', '23/07/2011 2:06:00', '23/07/2011 2:07:00', '23/07/2011 2:08:00', '23/07/2011 2:09:00', '23/07/2011 2:10:00', '23/07/2011 2:11:00', '23/07/2011 2:12:00', '23/07/2011 2:13:00', '23/07/2011 2:14:00', '23/07/2011 2:15:00', '23/07/2011 2:16:00', '23/07/2011 2:17:00', '23/07/2011 2:18:00', '23/07/2011 2:19:00', '23/07/2011 2:20:00', '23/07/2011 2:21:00', '23/07/2011 2:22:00', '23/07/2011 2:23:00', '23/07/2011 
2:24:00', '23/07/2011 2:25:00', '23/07/2011 2:26:00', '23/07/2011 2:27:00', '23/07/2011 2:28:00', '23/07/2011 2:29:00', '23/07/2011 2:30:00', '23/07/2011 2:31:00', '23/07/2011 2:32:00', '23/07/2011 2:33:00', '23/07/2011 2:34:00', '23/07/2011 2:35:00', '23/07/2011 2:36:00', '23/07/2011 2:37:00', '23/07/2011 2:38:00', '23/07/2011 2:39:00', '23/07/2011 2:40:00', '23/07/2011 2:41:00', '23/07/2011 2:42:00', '23/07/2011 2:43:00', '23/07/2011 2:44:00', '23/07/2011 2:45:00', '23/07/2011 2:46:00', '23/07/2011 2:47:00', '23/07/2011 2:48:00', '23/07/2011 2:49:00', '23/07/2011 2:50:00', '23/07/2011 2:51:00', '23/07/2011 2:52:00', '23/07/2011 2:53:00', '23/07/2011 2:54:00', '23/07/2011 2:55:00', '23/07/2011 2:56:00', '23/07/2011 2:57:00', '23/07/2011 2:58:00', '23/07/2011 2:59:00', '23/07/2011 3:00:00', '23/07/2011 3:01:00', '23/07/2011 3:02:00', '23/07/2011 3:03:00', '23/07/2011 3:04:00', '23/07/2011 3:05:00', '23/07/2011 3:06:00', '23/07/2011 3:07:00', '23/07/2011 3:08:00', '23/07/2011 3:09:00', '23/07/2011 3:10:00', '23/07/2011 3:11:00', '23/07/2011 3:12:00', '23/07/2011 3:13:00', '23/07/2011 3:14:00', '23/07/2011 3:15:00', '23/07/2011 3:16:00', '23/07/2011 3:17:00', '23/07/2011 3:18:00', '23/07/2011 3:19:00', '23/07/2011 3:20:00', '23/07/2011 3:21:00', '23/07/2011 3:22:00', '23/07/2011 3:23:00', '23/07/2011 3:24:00', '23/07/2011 3:25:00', '23/07/2011 3:26:00', '23/07/2011 3:27:00', '23/07/2011 3:28:00', '23/07/2011 3:29:00', '23/07/2011 3:30:00', '23/07/2011 3:31:00', '23/07/2011 3:32:00', '23/07/2011 3:33:00', '23/07/2011 3:34:00', '23/07/2011 3:35:00', '23/07/2011 3:36:00', '23/07/2011 3:37:00', '23/07/2011 3:38:00', '23/07/2011 3:39:00', '23/07/2011 3:40:00', '23/07/2011 3:41:00', '23/07/2011 3:42:00', '23/07/2011 3:43:00', '23/07/2011 3:44:00', '23/07/2011 3:45:00', '23/07/2011 3:46:00', '23/07/2011 3:47:00', '23/07/2011 3:48:00', '23/07/2011 3:49:00', '23/07/2011 3:50:00', '23/07/2011 3:51:00', '23/07/2011 3:52:00', '23/07/2011 3:53:00', '23/07/2011 3:54:00', 
'23/07/2011 3:55:00', '23/07/2011 3:56:00', '23/07/2011 3:57:00', '23/07/2011 3:58:00', '23/07/2011 3:59:00', '23/07/2011 4:00:00', '23/07/2011 4:01:00', '23/07/2011 4:02:00', '23/07/2011 4:03:00', '23/07/2011 4:04:00', '23/07/2011 4:05:00', '23/07/2011 4:06:00', '23/07/2011 4:07:00', '23/07/2011 4:08:00', '23/07/2011 4:09:00', '23/07/2011 4:10:00', '23/07/2011 4:11:00', '23/07/2011 4:12:00', '23/07/2011 4:13:00', '23/07/2011 4:14:00', '23/07/2011 4:15:00', '23/07/2011 4:16:00', '23/07/2011 4:17:00', '23/07/2011 4:18:00', '23/07/2011 4:19:00', '23/07/2011 4:20:00', '23/07/2011 4:21:00', '23/07/2011 4:22:00', '23/07/2011 4:23:00', '23/07/2011 4:24:00', '23/07/2011 4:25:00', '23/07/2011 4:26:00', '23/07/2011 4:27:00', '23/07/2011 4:28:00', '23/07/2011 4:29:00', '23/07/2011 4:30:00', '23/07/2011 4:31:00', '23/07/2011 4:32:00', '23/07/2011 4:33:00', '23/07/2011 4:34:00', '23/07/2011 4:35:00', '23/07/2011 4:36:00', '23/07/2011 4:37:00', '23/07/2011 4:38:00', '23/07/2011 4:39:00', '23/07/2011 4:40:00', '23/07/2011 4:41:00', '23/07/2011 4:42:00', '23/07/2011 4:43:00', '23/07/2011 4:44:00', '23/07/2011 4:45:00', '23/07/2011 4:46:00', '23/07/2011 4:47:00', '23/07/2011 4:48:00', '23/07/2011 4:49:00', '23/07/2011 4:50:00', '23/07/2011 4:51:00', '23/07/2011 4:52:00', '23/07/2011 4:53:00', '23/07/2011 4:54:00', '23/07/2011 4:55:00', '23/07/2011 4:56:00', '23/07/2011 4:57:00', '23/07/2011 4:58:00', '23/07/2011 4:59:00', '23/07/2011 5:00:00', '23/07/2011 5:01:00', '23/07/2011 5:02:00', '23/07/2011 5:03:00', '23/07/2011 5:04:00', '23/07/2011 5:05:00', '23/07/2011 5:06:00', '23/07/2011 5:07:00', '23/07/2011 5:08:00', '23/07/2011 5:09:00', '23/07/2011 5:10:00', '23/07/2011 5:11:00', '23/07/2011 5:12:00', '23/07/2011 5:13:00', '23/07/2011 5:14:00', '23/07/2011 5:15:00', '23/07/2011 5:16:00', '23/07/2011 5:17:00', '23/07/2011 5:18:00', '23/07/2011 5:19:00', '23/07/2011 5:20:00', '23/07/2011 5:21:00', '23/07/2011 5:22:00', '23/07/2011 5:23:00', '23/07/2011 5:24:00', '23/07/2011 
5:25:00', '23/07/2011 5:26:00', '23/07/2011 5:27:00', '23/07/2011 5:28:00', '23/07/2011 5:29:00', '23/07/2011 5:30:00', '23/07/2011 5:31:00', '23/07/2011 5:32:00', '23/07/2011 5:33:00', '23/07/2011 5:34:00', '23/07/2011 5:35:00', '23/07/2011 5:36:00', '23/07/2011 5:37:00', '23/07/2011 5:38:00', '23/07/2011 5:39:00', '23/07/2011 5:40:00', '23/07/2011 5:41:00', '23/07/2011 5:42:00', '23/07/2011 5:43:00', '23/07/2011 5:44:00', '23/07/2011 5:45:00', '23/07/2011 5:46:00', '23/07/2011 5:47:00', '23/07/2011 5:48:00', '23/07/2011 5:49:00', '23/07/2011 5:50:00', '23/07/2011 5:51:00', '23/07/2011 5:52:00', '23/07/2011 5:53:00', '23/07/2011 5:54:00', '23/07/2011 5:55:00', '23/07/2011 5:56:00', '23/07/2011 5:57:00', '23/07/2011 5:58:00', '23/07/2011 5:59:00', '23/07/2011 6:00:00', '23/07/2011 6:01:00', '23/07/2011 6:02:00', '23/07/2011 6:03:00', '23/07/2011 6:04:00', '23/07/2011 6:05:00', '23/07/2011 6:06:00', '23/07/2011 6:07:00', '23/07/2011 6:08:00', '23/07/2011 6:09:00', '23/07/2011 6:10:00', '23/07/2011 6:11:00', '23/07/2011 6:12:00', '23/07/2011 6:13:00', '23/07/2011 6:14:00', '23/07/2011 6:15:00', '23/07/2011 6:16:00', '23/07/2011 6:17:00', '23/07/2011 6:18:00', '23/07/2011 6:19:00', '23/07/2011 6:20:00', '23/07/2011 6:21:00', '23/07/2011 6:22:00', '23/07/2011 6:23:00', '23/07/2011 6:24:00', '23/07/2011 6:25:00', '23/07/2011 6:26:00', '23/07/2011 6:27:00', '23/07/2011 6:28:00', '23/07/2011 6:29:00', '23/07/2011 6:30:00', '23/07/2011 6:31:00', '23/07/2011 6:32:00', '23/07/2011 6:33:00', '23/07/2011 6:34:00', '23/07/2011 6:35:00', '23/07/2011 6:36:00', '23/07/2011 6:37:00', '23/07/2011 6:38:00', '23/07/2011 6:39:00', '23/07/2011 6:40:00', '23/07/2011 6:41:00', '23/07/2011 6:42:00', '23/07/2011 6:43:00', '23/07/2011 6:44:00', '23/07/2011 6:45:00', '23/07/2011 6:46:00', '23/07/2011 6:47:00', '23/07/2011 6:48:00', '23/07/2011 6:49:00', '23/07/2011 6:50:00', '23/07/2011 6:51:00', '23/07/2011 6:52:00', '23/07/2011 6:53:00', '23/07/2011 6:54:00', '23/07/2011 6:55:00', 
'23/07/2011 6:56:00', '23/07/2011 6:57:00', '23/07/2011 6:58:00', '23/07/2011 6:59:00', '23/07/2011 7:00:00', '23/07/2011 7:01:00', '23/07/2011 7:02:00', '23/07/2011 7:03:00', '23/07/2011 7:04:00', '23/07/2011 7:05:00', '23/07/2011 7:06:00', '23/07/2011 7:07:00', '23/07/2011 7:08:00', '23/07/2011 7:09:00', '23/07/2011 7:10:00', '23/07/2011 7:11:00', '23/07/2011 7:12:00', '23/07/2011 7:13:00', '23/07/2011 7:14:00', '23/07/2011 7:15:00', '23/07/2011 7:16:00', '23/07/2011 7:17:00', '23/07/2011 7:18:00', '23/07/2011 7:19:00', '23/07/2011 7:20:00', '23/07/2011 7:21:00', '23/07/2011 7:22:00', '23/07/2011 7:23:00', '23/07/2011 7:24:00', '23/07/2011 7:25:00', '23/07/2011 7:26:00', '23/07/2011 7:27:00', '23/07/2011 7:28:00', '23/07/2011 7:29:00', '23/07/2011 7:30:00', '23/07/2011 7:31:00', '23/07/2011 7:32:00', '23/07/2011 7:33:00', '23/07/2011 7:34:00', '23/07/2011 7:35:00', '23/07/2011 7:36:00', '23/07/2011 7:37:00', '23/07/2011 7:38:00', '23/07/2011 7:39:00', '23/07/2011 7:40:00', '23/07/2011 7:41:00', '23/07/2011 7:42:00', '23/07/2011 7:43:00', '23/07/2011 7:44:00', '23/07/2011 7:45:00', '23/07/2011 7:46:00', '23/07/2011 7:47:00', '23/07/2011 7:48:00', '23/07/2011 7:49:00', '23/07/2011 7:50:00', '23/07/2011 7:51:00', '23/07/2011 7:52:00', '23/07/2011 7:53:00', '23/07/2011 7:54:00', '23/07/2011 7:55:00', '23/07/2011 7:56:00', '23/07/2011 7:57:00', '23/07/2011 7:58:00', '23/07/2011 7:59:00', '23/07/2011 8:00:00', '23/07/2011 8:01:00', '23/07/2011 8:02:00', '23/07/2011 8:03:00', '23/07/2011 8:04:00', '23/07/2011 8:05:00', '23/07/2011 8:06:00', '23/07/2011 8:07:00', '23/07/2011 8:08:00', '23/07/2011 8:09:00', '23/07/2011 8:10:00', '23/07/2011 8:11:00', '23/07/2011 8:12:00', '23/07/2011 8:13:00', '23/07/2011 8:14:00', '23/07/2011 8:15:00', '23/07/2011 8:16:00', '23/07/2011 8:17:00', '23/07/2011 8:18:00', '23/07/2011 8:19:00', '23/07/2011 8:20:00', '23/07/2011 8:21:00', '23/07/2011 8:22:00', '23/07/2011 8:23:00', '23/07/2011 8:24:00', '23/07/2011 8:25:00', '23/07/2011 
8:26:00', '23/07/2011 8:27:00', '23/07/2011 8:28:00', '23/07/2011 8:29:00', '23/07/2011 8:30:00', '23/07/2011 8:31:00', '23/07/2011 8:32:00', '23/07/2011 8:33:00', '23/07/2011 8:34:00', '23/07/2011 8:35:00', '23/07/2011 8:36:00', '23/07/2011 8:37:00', '23/07/2011 8:38:00', '23/07/2011 8:39:00', '23/07/2011 8:40:00', '23/07/2011 8:41:00', '23/07/2011 8:42:00', '23/07/2011 8:43:00', '23/07/2011 8:44:00', '23/07/2011 8:45:00', '23/07/2011 8:46:00', '23/07/2011 8:47:00', '23/07/2011 8:48:00', '23/07/2011 8:49:00', '23/07/2011 8:50:00', '23/07/2011 8:51:00', '23/07/2011 8:52:00', '23/07/2011 8:53:00', '23/07/2011 8:54:00', '23/07/2011 8:55:00', '23/07/2011 8:56:00', '23/07/2011 8:57:00', '23/07/2011 8:58:00', '23/07/2011 8:59:00', '23/07/2011 9:00:00', '23/07/2011 9:01:00', '23/07/2011 9:02:00', '23/07/2011 9:03:00', '23/07/2011 9:04:00', '23/07/2011 9:05:00', '23/07/2011 9:06:00', '23/07/2011 9:07:00', '23/07/2011 9:08:00', '23/07/2011 9:09:00', '23/07/2011 9:10:00', '23/07/2011 9:11:00', '23/07/2011 9:12:00', '23/07/2011 9:13:00', '23/07/2011 9:14:00', '23/07/2011 9:15:00', '23/07/2011 9:16:00', '23/07/2011 9:17:00', '23/07/2011 9:18:00', '23/07/2011 9:19:00', '23/07/2011 9:20:00', '23/07/2011 9:21:00', '23/07/2011 9:22:00', '23/07/2011 9:23:00', '23/07/2011 9:24:00', '23/07/2011 9:25:00', '23/07/2011 9:26:00', '23/07/2011 9:27:00', '23/07/2011 9:28:00', '23/07/2011 9:29:00', '23/07/2011 9:30:00', '23/07/2011 9:31:00', '23/07/2011 9:32:00', '23/07/2011 9:33:00', '23/07/2011 9:34:00', '23/07/2011 9:35:00', '23/07/2011 9:36:00', '23/07/2011 9:37:00', '23/07/2011 9:38:00', '23/07/2011 9:39:00', '23/07/2011 9:40:00', '23/07/2011 9:41:00', '23/07/2011 9:42:00', '23/07/2011 9:43:00', '23/07/2011 9:44:00', '23/07/2011 9:45:00', '23/07/2011 9:46:00', '23/07/2011 9:47:00', '23/07/2011 9:48:00', '23/07/2011 9:49:00', '23/07/2011 9:50:00', '23/07/2011 9:51:00', '23/07/2011 9:52:00', '23/07/2011 9:53:00', '23/07/2011 9:54:00', '23/07/2011 9:55:00', '23/07/2011 9:56:00', 
'23/07/2011 9:57:00', '23/07/2011 9:58:00', '23/07/2011 9:59:00', '23/07/2011 10:00:00', '23/07/2011 10:01:00', '23/07/2011 10:02:00', '23/07/2011 10:03:00', '23/07/2011 10:04:00', '23/07/2011 10:05:00', '23/07/2011 10:06:00', '23/07/2011 10:07:00', '23/07/2011 10:08:00', '23/07/2011 10:09:00', '23/07/2011 10:10:00', '23/07/2011 10:11:00', '23/07/2011 10:12:00', '23/07/2011 10:13:00', '23/07/2011 10:14:00', '23/07/2011 10:15:00', '23/07/2011 10:16:00', '23/07/2011 10:17:00', '23/07/2011 10:18:00', '23/07/2011 10:19:00', '23/07/2011 10:20:00', '23/07/2011 10:21:00', '23/07/2011 10:22:00', '23/07/2011 10:23:00', '23/07/2011 10:24:00', '23/07/2011 10:25:00', '23/07/2011 10:26:00', '23/07/2011 10:27:00', '23/07/2011 10:28:00', '23/07/2011 10:29:00', '23/07/2011 10:30:00', '23/07/2011 10:31:00', '23/07/2011 10:32:00', '23/07/2011 10:33:00', '23/07/2011 10:34:00', '23/07/2011 10:35:00', '23/07/2011 10:36:00', '23/07/2011 10:37:00', '23/07/2011 10:38:00', '23/07/2011 10:39:00', '23/07/2011 10:40:00', '23/07/2011 10:41:00', '23/07/2011 10:42:00', '23/07/2011 10:43:00', '23/07/2011 10:44:00', '23/07/2011 10:45:00', '23/07/2011 10:46:00', '23/07/2011 10:47:00', '23/07/2011 10:48:00', '23/07/2011 10:49:00', '23/07/2011 10:50:00', '23/07/2011 10:51:00', '23/07/2011 10:52:00', '23/07/2011 10:53:00', '23/07/2011 10:54:00', '23/07/2011 10:55:00', '23/07/2011 10:56:00', '23/07/2011 10:57:00', '23/07/2011 10:58:00', '23/07/2011 10:59:00', '23/07/2011 11:00:00', '23/07/2011 11:01:00', '23/07/2011 11:02:00', '23/07/2011 11:03:00', '23/07/2011 11:04:00', '23/07/2011 11:05:00', '23/07/2011 11:06:00', '23/07/2011 11:07:00', '23/07/2011 11:08:00', '23/07/2011 11:09:00', '23/07/2011 11:10:00', '23/07/2011 11:11:00', '23/07/2011 11:12:00', '23/07/2011 11:13:00', '23/07/2011 11:14:00', '23/07/2011 11:15:00', '23/07/2011 11:16:00', '23/07/2011 11:17:00', '23/07/2011 11:18:00', '23/07/2011 11:19:00', '23/07/2011 11:20:00', '23/07/2011 11:21:00', '23/07/2011 11:22:00', '23/07/2011 11:23:00', 
'23/07/2011 11:24:00', '23/07/2011 11:25:00', '23/07/2011 11:26:00', '23/07/2011 11:27:00', '23/07/2011 11:28:00', '23/07/2011 11:29:00', '23/07/2011 11:30:00', '23/07/2011 11:31:00', '23/07/2011 11:32:00', '23/07/2011 11:33:00', '23/07/2011 11:34:00', '23/07/2011 11:35:00', '23/07/2011 11:36:00', '23/07/2011 11:37:00', '23/07/2011 11:38:00', '23/07/2011 11:39:00', '23/07/2011 11:40:00', '23/07/2011 11:41:00', '23/07/2011 11:42:00', '23/07/2011 11:43:00', '23/07/2011 11:44:00', '23/07/2011 11:45:00', '23/07/2011 11:46:00', '23/07/2011 11:47:00', '23/07/2011 11:48:00', '23/07/2011 11:49:00', '23/07/2011 11:50:00', '23/07/2011 11:51:00', '23/07/2011 11:52:00', '23/07/2011 11:53:00', '23/07/2011 11:54:00', '23/07/2011 11:55:00', '23/07/2011 11:56:00', '23/07/2011 11:57:00', '23/07/2011 11:58:00', '23/07/2011 11:59:00', '23/07/2011 12:00:00', '23/07/2011 12:01:00', '23/07/2011 12:02:00', '23/07/2011 12:03:00', '23/07/2011 12:04:00', '23/07/2011 12:05:00', '23/07/2011 12:06:00', '23/07/2011 12:07:00', '23/07/2011 12:08:00', '23/07/2011 12:09:00', '23/07/2011 12:10:00', '23/07/2011 12:11:00', '23/07/2011 12:12:00', '23/07/2011 12:13:00', '23/07/2011 12:14:00', '23/07/2011 12:15:00', '23/07/2011 12:16:00', '23/07/2011 12:17:00', '23/07/2011 12:18:00', '23/07/2011 12:19:00', '23/07/2011 12:20:00', '23/07/2011 12:21:00', '23/07/2011 12:22:00', '23/07/2011 12:23:00', '23/07/2011 12:24:00', '23/07/2011 12:25:00', '23/07/2011 12:26:00', '23/07/2011 12:27:00', '23/07/2011 12:28:00', '23/07/2011 12:29:00', '23/07/2011 12:30:00', '23/07/2011 12:31:00', '23/07/2011 12:32:00', '23/07/2011 12:33:00', '23/07/2011 12:34:00', '23/07/2011 12:35:00', '23/07/2011 12:36:00', '23/07/2011 12:37:00', '23/07/2011 12:38:00', '23/07/2011 12:39:00', '23/07/2011 12:40:00', '23/07/2011 12:41:00', '23/07/2011 12:42:00', '23/07/2011 12:43:00', '23/07/2011 12:44:00', '23/07/2011 12:45:00', '23/07/2011 12:46:00', '23/07/2011 12:47:00', '23/07/2011 12:48:00', '23/07/2011 12:49:00', '23/07/2011 
12:50:00', '23/07/2011 12:51:00', '23/07/2011 12:52:00', '23/07/2011 12:53:00', '23/07/2011 12:54:00', '23/07/2011 12:55:00', '23/07/2011 12:56:00', '23/07/2011 12:57:00', '23/07/2011 12:58:00', '23/07/2011 12:59:00', '23/07/2011 13:00:00', '23/07/2011 13:01:00', '23/07/2011 13:02:00', '23/07/2011 13:03:00', '23/07/2011 13:04:00', '23/07/2011 13:05:00', '23/07/2011 13:06:00', '23/07/2011 13:07:00', '23/07/2011 13:08:00', '23/07/2011 13:09:00', '23/07/2011 13:10:00', '23/07/2011 13:11:00', '23/07/2011 13:12:00', '23/07/2011 13:13:00', '23/07/2011 13:14:00', '23/07/2011 13:15:00', '23/07/2011 13:16:00', '23/07/2011 13:17:00', '23/07/2011 13:18:00', '23/07/2011 13:19:00', '23/07/2011 13:20:00', '23/07/2011 13:21:00', '23/07/2011 13:22:00', '23/07/2011 13:23:00', '23/07/2011 13:24:00', '23/07/2011 13:25:00', '23/07/2011 13:26:00', '23/07/2011 13:27:00', '23/07/2011 13:28:00', '23/07/2011 13:29:00', '23/07/2011 13:30:00', '23/07/2011 13:31:00', '23/07/2011 13:32:00', '23/07/2011 13:33:00', '23/07/2011 13:34:00', '23/07/2011 13:35:00', '23/07/2011 13:36:00', '23/07/2011 13:37:00', '23/07/2011 13:38:00', '23/07/2011 13:39:00', '23/07/2011 13:40:00', '23/07/2011 13:41:00', '23/07/2011 13:42:00', '23/07/2011 13:43:00', '23/07/2011 13:44:00', '23/07/2011 13:45:00', '23/07/2011 13:46:00', '23/07/2011 13:47:00', '23/07/2011 13:48:00', '23/07/2011 13:49:00', '23/07/2011 13:50:00', '23/07/2011 13:51:00', '23/07/2011 13:52:00', '23/07/2011 13:53:00', '23/07/2011 13:54:00', '23/07/2011 13:55:00', '23/07/2011 13:56:00', '23/07/2011 13:57:00', '23/07/2011 13:58:00', '23/07/2011 13:59:00', '23/07/2011 14:00:00', '23/07/2011 14:01:00', '23/07/2011 14:02:00', '23/07/2011 14:03:00', '23/07/2011 14:04:00', '23/07/2011 14:05:00', '23/07/2011 14:06:00', '23/07/2011 14:07:00', '23/07/2011 14:08:00', '23/07/2011 14:09:00', '23/07/2011 14:10:00', '23/07/2011 14:11:00', '23/07/2011 14:12:00', '23/07/2011 14:13:00', '23/07/2011 14:14:00', '23/07/2011 14:15:00', '23/07/2011 14:16:00', 
'23/07/2011 14:17:00', '23/07/2011 14:18:00', '23/07/2011 14:19:00', '23/07/2011 14:20:00', '23/07/2011 14:21:00', '23/07/2011 14:22:00', '23/07/2011 14:23:00', '23/07/2011 14:24:00', '23/07/2011 14:25:00', '23/07/2011 14:26:00', '23/07/2011 14:27:00', '23/07/2011 14:28:00', '23/07/2011 14:29:00', '23/07/2011 14:30:00', '23/07/2011 14:31:00', '23/07/2011 14:32:00', '23/07/2011 14:33:00', '23/07/2011 14:34:00', '23/07/2011 14:35:00', '23/07/2011 14:36:00', '23/07/2011 14:37:00', '23/07/2011 14:38:00', '23/07/2011 14:39:00', '23/07/2011 14:40:00', '23/07/2011 14:41:00', '23/07/2011 14:42:00', '23/07/2011 14:43:00', '23/07/2011 14:44:00', '23/07/2011 14:45:00', '23/07/2011 14:46:00', '23/07/2011 14:47:00', '23/07/2011 14:48:00', '23/07/2011 14:49:00', '23/07/2011 14:50:00', '23/07/2011 14:51:00', '23/07/2011 14:52:00', '23/07/2011 14:53:00', '23/07/2011 14:54:00', '23/07/2011 14:55:00', '23/07/2011 14:56:00', '23/07/2011 14:57:00', '23/07/2011 14:58:00', '23/07/2011 14:59:00', '23/07/2011 15:00:00', '23/07/2011 15:01:00', '23/07/2011 15:02:00', '23/07/2011 15:03:00', '23/07/2011 15:04:00', '23/07/2011 15:05:00', '23/07/2011 15:06:00', '23/07/2011 15:07:00', '23/07/2011 15:08:00', '23/07/2011 15:09:00', '23/07/2011 15:10:00', '23/07/2011 15:11:00', '23/07/2011 15:12:00', '23/07/2011 15:13:00', '23/07/2011 15:14:00', '23/07/2011 15:15:00', '23/07/2011 15:16:00', '23/07/2011 15:17:00', '23/07/2011 15:18:00', '23/07/2011 15:19:00', '23/07/2011 15:20:00', '23/07/2011 15:21:00', '23/07/2011 15:22:00', '23/07/2011 15:23:00', '23/07/2011 15:24:00', '23/07/2011 15:25:00', '23/07/2011 15:26:00', '23/07/2011 15:27:00', '23/07/2011 15:28:00', '23/07/2011 15:29:00', '23/07/2011 15:30:00', '23/07/2011 15:31:00', '23/07/2011 15:32:00', '23/07/2011 15:33:00', '23/07/2011 15:34:00', '23/07/2011 15:35:00', '23/07/2011 15:36:00', '23/07/2011 15:37:00', '23/07/2011 15:38:00', '23/07/2011 15:39:00', '23/07/2011 15:40:00', '23/07/2011 15:41:00', '23/07/2011 15:42:00', '23/07/2011 
15:43:00', '23/07/2011 15:44:00', '23/07/2011 15:45:00', '23/07/2011 15:46:00', '23/07/2011 15:47:00', '23/07/2011 15:48:00', '23/07/2011 15:49:00', '23/07/2011 15:50:00', '23/07/2011 15:51:00', '23/07/2011 15:52:00', '23/07/2011 15:53:00', '23/07/2011 15:54:00', '23/07/2011 15:55:00', '23/07/2011 15:56:00', '23/07/2011 15:57:00', '23/07/2011 15:58:00', '23/07/2011 15:59:00', '23/07/2011 16:00:00', '23/07/2011 16:01:00', '23/07/2011 16:02:00', '23/07/2011 16:03:00', '23/07/2011 16:04:00', '23/07/2011 16:05:00', '23/07/2011 16:06:00', '23/07/2011 16:07:00', '23/07/2011 16:08:00', '23/07/2011 16:09:00', '23/07/2011 16:10:00', '23/07/2011 16:11:00', '23/07/2011 16:12:00', '23/07/2011 16:13:00', '23/07/2011 16:14:00', '23/07/2011 16:15:00', '23/07/2011 16:16:00', '23/07/2011 16:17:00', '23/07/2011 16:18:00', '23/07/2011 16:19:00', '23/07/2011 16:20:00', '23/07/2011 16:21:00', '23/07/2011 16:22:00', '23/07/2011 16:23:00', '23/07/2011 16:24:00', '23/07/2011 16:25:00', '23/07/2011 16:26:00', '23/07/2011 16:27:00', '23/07/2011 16:28:00', '23/07/2011 16:29:00', '23/07/2011 16:30:00', '23/07/2011 16:31:00', '23/07/2011 16:32:00', '23/07/2011 16:33:00', '23/07/2011 16:34:00', '23/07/2011 16:35:00', '23/07/2011 16:36:00', '23/07/2011 16:37:00', '23/07/2011 16:38:00', '23/07/2011 16:39:00', '23/07/2011 16:40:00', '23/07/2011 16:41:00', '23/07/2011 16:42:00', '23/07/2011 16:43:00', '23/07/2011 16:44:00', '23/07/2011 16:45:00', '23/07/2011 16:46:00', '23/07/2011 16:47:00', '23/07/2011 16:48:00', '23/07/2011 16:49:00', '23/07/2011 16:50:00', '23/07/2011 16:51:00', '23/07/2011 16:52:00', '23/07/2011 16:53:00', '23/07/2011 16:54:00', '23/07/2011 16:55:00', '23/07/2011 16:56:00', '23/07/2011 16:57:00', '23/07/2011 16:58:00', '23/07/2011 16:59:00', '23/07/2011 17:00:00', '23/07/2011 17:01:00', '23/07/2011 17:02:00', '23/07/2011 17:03:00', '23/07/2011 17:04:00', '23/07/2011 17:05:00', '23/07/2011 17:06:00', '23/07/2011 17:07:00', '23/07/2011 17:08:00', '23/07/2011 17:09:00', 
'23/07/2011 17:10:00', '23/07/2011 17:11:00', '23/07/2011 17:12:00', '23/07/2011 17:13:00', '23/07/2011 17:14:00', '23/07/2011 17:15:00', '23/07/2011 17:16:00', '23/07/2011 17:17:00', '23/07/2011 17:18:00', '23/07/2011 17:19:00', '23/07/2011 17:20:00', '23/07/2011 17:21:00', '23/07/2011 17:22:00', '23/07/2011 17:23:00', '23/07/2011 17:24:00', '23/07/2011 17:25:00', '23/07/2011 17:26:00', '23/07/2011 17:27:00', '23/07/2011 17:28:00', '23/07/2011 17:29:00', '23/07/2011 17:30:00', '23/07/2011 17:31:00', '23/07/2011 17:32:00', '23/07/2011 17:33:00', '23/07/2011 17:34:00', '23/07/2011 17:35:00', '23/07/2011 17:36:00', '23/07/2011 17:37:00', '23/07/2011 17:38:00', '23/07/2011 17:39:00', '23/07/2011 17:40:00', '23/07/2011 17:41:00', '23/07/2011 17:42:00', '23/07/2011 17:43:00', '23/07/2011 17:44:00', '23/07/2011 17:45:00', '23/07/2011 17:46:00', '23/07/2011 17:47:00', '23/07/2011 17:48:00', '23/07/2011 17:49:00', '23/07/2011 17:50:00', '23/07/2011 17:51:00', '23/07/2011 17:52:00', '23/07/2011 17:53:00', '23/07/2011 17:54:00', '23/07/2011 17:55:00', '23/07/2011 17:56:00', '23/07/2011 17:57:00', '23/07/2011 17:58:00', '23/07/2011 17:59:00', '23/07/2011 18:00:00', '23/07/2011 18:01:00', '23/07/2011 18:02:00', '23/07/2011 18:03:00', '23/07/2011 18:04:00', '23/07/2011 18:05:00', '23/07/2011 18:06:00', '23/07/2011 18:07:00', '23/07/2011 18:08:00', '23/07/2011 18:09:00', '23/07/2011 18:10:00', '23/07/2011 18:11:00', '23/07/2011 18:12:00', '23/07/2011 18:13:00', '23/07/2011 18:14:00', '23/07/2011 18:15:00', '23/07/2011 18:16:00', '23/07/2011 18:17:00', '23/07/2011 18:18:00', '23/07/2011 18:19:00', '23/07/2011 18:20:00', '23/07/2011 18:21:00', '23/07/2011 18:22:00', '23/07/2011 18:23:00', '23/07/2011 18:24:00', '23/07/2011 18:25:00', '23/07/2011 18:26:00', '23/07/2011 18:27:00', '23/07/2011 18:28:00', '23/07/2011 18:29:00', '23/07/2011 18:30:00', '23/07/2011 18:31:00', '23/07/2011 18:32:00', '23/07/2011 18:33:00', '23/07/2011 18:34:00', '23/07/2011 18:35:00', '23/07/2011 
18:36:00', '23/07/2011 18:37:00', '23/07/2011 18:38:00', '23/07/2011 18:39:00', '23/07/2011 18:40:00', '23/07/2011 18:41:00', '23/07/2011 18:42:00', '23/07/2011 18:43:00', '23/07/2011 18:44:00', '23/07/2011 18:45:00', '23/07/2011 18:46:00', '23/07/2011 18:47:00', '23/07/2011 18:48:00', '23/07/2011 18:49:00', '23/07/2011 18:50:00', '23/07/2011 18:51:00', '23/07/2011 18:52:00', '23/07/2011 18:53:00', '23/07/2011 18:54:00', '23/07/2011 18:55:00', '23/07/2011 18:56:00', '23/07/2011 18:57:00', '23/07/2011 18:58:00', '23/07/2011 18:59:00', '23/07/2011 19:00:00', '23/07/2011 19:01:00', '23/07/2011 19:02:00', '23/07/2011 19:03:00', '23/07/2011 19:04:00', '23/07/2011 19:05:00', '23/07/2011 19:06:00', '23/07/2011 19:07:00', '23/07/2011 19:08:00', '23/07/2011 19:09:00', '23/07/2011 19:10:00', '23/07/2011 19:11:00', '23/07/2011 19:12:00', '23/07/2011 19:13:00', '23/07/2011 19:14:00', '23/07/2011 19:15:00', '23/07/2011 19:16:00', '23/07/2011 19:17:00', '23/07/2011 19:18:00', '23/07/2011 19:19:00', '23/07/2011 19:20:00', '23/07/2011 19:21:00', '23/07/2011 19:22:00', '23/07/2011 19:23:00', '23/07/2011 19:24:00', '23/07/2011 19:25:00', '23/07/2011 19:26:00', '23/07/2011 19:27:00', '23/07/2011 19:28:00', '23/07/2011 19:29:00', '23/07/2011 19:30:00', '23/07/2011 19:31:00', '23/07/2011 19:32:00', '23/07/2011 19:33:00', '23/07/2011 19:34:00', '23/07/2011 19:35:00', '23/07/2011 19:36:00', '23/07/2011 19:37:00', '23/07/2011 19:38:00', '23/07/2011 19:39:00', '23/07/2011 19:40:00', '23/07/2011 19:41:00', '23/07/2011 19:42:00', '23/07/2011 19:43:00', '23/07/2011 19:44:00', '23/07/2011 19:45:00', '23/07/2011 19:46:00', '23/07/2011 19:47:00', '23/07/2011 19:48:00', '23/07/2011 19:49:00', '23/07/2011 19:50:00', '23/07/2011 19:51:00', '23/07/2011 19:52:00', '23/07/2011 19:53:00', '23/07/2011 19:54:00', '23/07/2011 19:55:00', '23/07/2011 19:56:00', '23/07/2011 19:57:00', '23/07/2011 19:58:00', '23/07/2011 19:59:00', '23/07/2011 20:00:00', '23/07/2011 20:01:00', '23/07/2011 20:02:00', 
'23/07/2011 20:03:00', '23/07/2011 20:04:00', '23/07/2011 20:05:00', '23/07/2011 20:06:00', '23/07/2011 20:07:00', '23/07/2011 20:08:00', '23/07/2011 20:09:00', '23/07/2011 20:10:00', '23/07/2011 20:11:00', '23/07/2011 20:12:00', '23/07/2011 20:13:00', '23/07/2011 20:14:00', '23/07/2011 20:15:00', '23/07/2011 20:16:00', '23/07/2011 20:17:00', '23/07/2011 20:18:00', '23/07/2011 20:19:00', '23/07/2011 20:20:00', '23/07/2011 20:21:00', '23/07/2011 20:22:00', '23/07/2011 20:23:00', '23/07/2011 20:24:00', '23/07/2011 20:25:00', '23/07/2011 20:26:00', '23/07/2011 20:27:00', '23/07/2011 20:28:00', '23/07/2011 20:29:00', '23/07/2011 20:30:00', '23/07/2011 20:31:00', '23/07/2011 20:32:00', '23/07/2011 20:33:00', '23/07/2011 20:34:00', '23/07/2011 20:35:00', '23/07/2011 20:36:00', '23/07/2011 20:37:00', '23/07/2011 20:38:00', '23/07/2011 20:39:00', '23/07/2011 20:40:00', '23/07/2011 20:41:00', '23/07/2011 20:42:00', '23/07/2011 20:43:00', '23/07/2011 20:44:00', '23/07/2011 20:45:00', '23/07/2011 20:46:00', '23/07/2011 20:47:00', '23/07/2011 20:48:00', '23/07/2011 20:49:00', '23/07/2011 20:50:00', '23/07/2011 20:51:00', '23/07/2011 20:52:00', '23/07/2011 20:53:00', '23/07/2011 20:54:00', '23/07/2011 20:55:00', '23/07/2011 20:56:00', '23/07/2011 20:57:00', '23/07/2011 20:58:00', '23/07/2011 20:59:00', '23/07/2011 21:00:00', '23/07/2011 21:01:00', '23/07/2011 21:02:00', '23/07/2011 21:03:00', '23/07/2011 21:04:00', '23/07/2011 21:05:00', '23/07/2011 21:06:00', '23/07/2011 21:07:00', '23/07/2011 21:08:00', '23/07/2011 21:09:00', '23/07/2011 21:10:00', '23/07/2011 21:11:00', '23/07/2011 21:12:00', '23/07/2011 21:13:00', '23/07/2011 21:14:00', '23/07/2011 21:15:00', '23/07/2011 21:16:00', '23/07/2011 21:17:00', '23/07/2011 21:18:00', '23/07/2011 21:19:00', '23/07/2011 21:20:00', '23/07/2011 21:21:00', '23/07/2011 21:22:00', '23/07/2011 21:23:00', '23/07/2011 21:24:00', '23/07/2011 21:25:00', '23/07/2011 21:26:00', '23/07/2011 21:27:00', '23/07/2011 21:28:00', '23/07/2011 
21:29:00', '23/07/2011 21:30:00', '23/07/2011 21:31:00', '23/07/2011 21:32:00', '23/07/2011 21:33:00', '23/07/2011 21:34:00', '23/07/2011 21:35:00', '23/07/2011 21:36:00', '23/07/2011 21:37:00', '23/07/2011 21:38:00', '23/07/2011 21:39:00', '23/07/2011 21:40:00', '23/07/2011 21:41:00', '23/07/2011 21:42:00', '23/07/2011 21:43:00', '23/07/2011 21:44:00', '23/07/2011 21:45:00', '23/07/2011 21:46:00', '23/07/2011 21:47:00', '23/07/2011 21:48:00', '23/07/2011 21:49:00', '23/07/2011 21:50:00', '23/07/2011 21:51:00', '23/07/2011 21:52:00', '23/07/2011 21:53:00', '23/07/2011 21:54:00', '23/07/2011 21:55:00', '23/07/2011 21:56:00', '23/07/2011 21:57:00', '23/07/2011 21:58:00', '23/07/2011 21:59:00', '23/07/2011 22:00:00', '23/07/2011 22:01:00', '23/07/2011 22:02:00', '23/07/2011 22:03:00', '23/07/2011 22:04:00', '23/07/2011 22:05:00', '23/07/2011 22:06:00', '23/07/2011 22:07:00', '23/07/2011 22:08:00', '23/07/2011 22:09:00', '23/07/2011 22:10:00', '23/07/2011 22:11:00', '23/07/2011 22:12:00', '23/07/2011 22:13:00', '23/07/2011 22:14:00', '23/07/2011 22:15:00', '23/07/2011 22:16:00', '23/07/2011 22:17:00', '23/07/2011 22:18:00', '23/07/2011 22:19:00', '23/07/2011 22:20:00', '23/07/2011 22:21:00', '23/07/2011 22:22:00', '23/07/2011 22:23:00', '23/07/2011 22:24:00', '23/07/2011 22:25:00', '23/07/2011 22:26:00', '23/07/2011 22:27:00', '23/07/2011 22:28:00', '23/07/2011 22:29:00', '23/07/2011 22:30:00', '23/07/2011 22:31:00', '23/07/2011 22:32:00', '23/07/2011 22:33:00', '23/07/2011 22:34:00', '23/07/2011 22:35:00', '23/07/2011 22:36:00', '23/07/2011 22:37:00', '23/07/2011 22:38:00', '23/07/2011 22:39:00', '23/07/2011 22:40:00', '23/07/2011 22:41:00', '23/07/2011 22:42:00', '23/07/2011 22:43:00', '23/07/2011 22:44:00', '23/07/2011 22:45:00', '23/07/2011 22:46:00', '23/07/2011 22:47:00', '23/07/2011 22:48:00', '23/07/2011 22:49:00', '23/07/2011 22:50:00', '23/07/2011 22:51:00', '23/07/2011 22:52:00', '23/07/2011 22:53:00', '23/07/2011 22:54:00', '23/07/2011 22:55:00', 
'23/07/2011 22:56:00', '23/07/2011 22:57:00', '23/07/2011 22:58:00', '23/07/2011 22:59:00', '23/07/2011 23:00:00', '23/07/2011 23:01:00', '23/07/2011 23:02:00', '23/07/2011 23:03:00', '23/07/2011 23:04:00', '23/07/2011 23:05:00', '23/07/2011 23:06:00', '23/07/2011 23:07:00', '23/07/2011 23:08:00', '23/07/2011 23:09:00', '23/07/2011 23:10:00', '23/07/2011 23:11:00', '23/07/2011 23:12:00', '23/07/2011 23:13:00', '23/07/2011 23:14:00', '23/07/2011 23:15:00', '23/07/2011 23:16:00', '23/07/2011 23:17:00', '23/07/2011 23:18:00', '23/07/2011 23:19:00', '23/07/2011 23:20:00', '23/07/2011 23:21:00', '23/07/2011 23:22:00', '23/07/2011 23:23:00', '23/07/2011 23:24:00', '23/07/2011 23:25:00', '23/07/2011 23:26:00', '23/07/2011 23:27:00', '23/07/2011 23:28:00', '23/07/2011 23:29:00', '23/07/2011 23:30:00', '23/07/2011 23:31:00', '23/07/2011 23:32:00', '23/07/2011 23:33:00', '23/07/2011 23:34:00', '23/07/2011 23:35:00', '23/07/2011 23:36:00', '23/07/2011 23:37:00', '23/07/2011 23:38:00', '23/07/2011 23:39:00', '23/07/2011 23:40:00', '23/07/2011 23:41:00', '23/07/2011 23:42:00', '23/07/2011 23:43:00', '23/07/2011 23:44:00', '23/07/2011 23:45:00', '23/07/2011 23:46:00', '23/07/2011 23:47:00', '23/07/2011 23:48:00', '23/07/2011 23:49:00', '23/07/2011 23:50:00', '23/07/2011 23:51:00', '23/07/2011 23:52:00', '23/07/2011 23:53:00', '23/07/2011 23:54:00', '23/07/2011 23:55:00', '23/07/2011 23:56:00', '23/07/2011 23:57:00', '23/07/2011 23:58:00', '23/07/2011 23:59:00', '24/07/2011 0:00:00', '24/07/2011 0:01:00', '24/07/2011 0:02:00', '24/07/2011 0:03:00', '24/07/2011 0:04:00', '24/07/2011 0:05:00', '24/07/2011 0:06:00', '24/07/2011 0:07:00', '24/07/2011 0:08:00', '24/07/2011 0:09:00', '24/07/2011 0:10:00', '24/07/2011 0:11:00', '24/07/2011 0:12:00', '24/07/2011 0:13:00', '24/07/2011 0:14:00', '24/07/2011 0:15:00', '24/07/2011 0:16:00', '24/07/2011 0:17:00', '24/07/2011 0:18:00', '24/07/2011 0:19:00', '24/07/2011 0:20:00', '24/07/2011 0:21:00', '24/07/2011 0:22:00', '24/07/2011 0:23:00', 
'24/07/2011 0:24:00', '24/07/2011 0:25:00', '24/07/2011 0:26:00', '24/07/2011 0:27:00', '24/07/2011 0:28:00', '24/07/2011 0:29:00', '24/07/2011 0:30:00', '24/07/2011 0:31:00', '24/07/2011 0:32:00', '24/07/2011 0:33:00', '24/07/2011 0:34:00', '24/07/2011 0:35:00', '24/07/2011 0:36:00', '24/07/2011 0:37:00', '24/07/2011 0:38:00', '24/07/2011 0:39:00', '24/07/2011 0:40:00', '24/07/2011 0:41:00', '24/07/2011 0:42:00', '24/07/2011 0:43:00', '24/07/2011 0:44:00', '24/07/2011 0:45:00', '24/07/2011 0:46:00', '24/07/2011 0:47:00', '24/07/2011 0:48:00', '24/07/2011 0:49:00', '24/07/2011 0:50:00', '24/07/2011 0:51:00', '24/07/2011 0:52:00', '24/07/2011 0:53:00', '24/07/2011 0:54:00', '24/07/2011 0:55:00', '24/07/2011 0:56:00', '24/07/2011 0:57:00', '24/07/2011 0:58:00', '24/07/2011 0:59:00', '24/07/2011 1:00:00', '24/07/2011 1:01:00', '24/07/2011 1:02:00', '24/07/2011 1:03:00', '24/07/2011 1:04:00', '24/07/2011 1:05:00', '24/07/2011 1:06:00', '24/07/2011 1:07:00', '24/07/2011 1:08:00', '24/07/2011 1:09:00', '24/07/2011 1:10:00', '24/07/2011 1:11:00', '24/07/2011 1:12:00', '24/07/2011 1:13:00', '24/07/2011 1:14:00', '24/07/2011 1:15:00', '24/07/2011 1:16:00', '24/07/2011 1:17:00', '24/07/2011 1:18:00', '24/07/2011 1:19:00', '24/07/2011 1:20:00', '24/07/2011 1:21:00', '24/07/2011 1:22:00', '24/07/2011 1:23:00', '24/07/2011 1:24:00', '24/07/2011 1:25:00', '24/07/2011 1:26:00', '24/07/2011 1:27:00', '24/07/2011 1:28:00', '24/07/2011 1:29:00', '24/07/2011 1:30:00', '24/07/2011 1:31:00', '24/07/2011 1:32:00', '24/07/2011 1:33:00', '24/07/2011 1:34:00', '24/07/2011 1:35:00', '24/07/2011 1:36:00', '24/07/2011 1:37:00', '24/07/2011 1:38:00', '24/07/2011 1:39:00', '24/07/2011 1:40:00', '24/07/2011 1:41:00', '24/07/2011 1:42:00', '24/07/2011 1:43:00', '24/07/2011 1:44:00', '24/07/2011 1:45:00', '24/07/2011 1:46:00', '24/07/2011 1:47:00', '24/07/2011 1:48:00', '24/07/2011 1:49:00', '24/07/2011 1:50:00', '24/07/2011 1:51:00', '24/07/2011 1:52:00', '24/07/2011 1:53:00', '24/07/2011 
1:54:00', '24/07/2011 1:55:00', '24/07/2011 1:56:00', '24/07/2011 1:57:00', '24/07/2011 1:58:00', '24/07/2011 1:59:00', '24/07/2011 2:00:00', '24/07/2011 2:01:00', '24/07/2011 2:02:00', '24/07/2011 2:03:00', '24/07/2011 2:04:00', '24/07/2011 2:05:00', '24/07/2011 2:06:00', '24/07/2011 2:07:00', '24/07/2011 2:08:00', '24/07/2011 2:09:00', '24/07/2011 2:10:00', '24/07/2011 2:11:00', '24/07/2011 2:12:00', '24/07/2011 2:13:00', '24/07/2011 2:14:00', '24/07/2011 2:15:00', '24/07/2011 2:16:00', '24/07/2011 2:17:00', '24/07/2011 2:18:00', '24/07/2011 2:19:00', '24/07/2011 2:20:00', '24/07/2011 2:21:00', '24/07/2011 2:22:00', '24/07/2011 2:23:00', '24/07/2011 2:24:00', '24/07/2011 2:25:00', '24/07/2011 2:26:00', '24/07/2011 2:27:00', '24/07/2011 2:28:00', '24/07/2011 2:29:00', '24/07/2011 2:30:00', '24/07/2011 2:31:00', '24/07/2011 2:32:00', '24/07/2011 2:33:00', '24/07/2011 2:34:00', '24/07/2011 2:35:00', '24/07/2011 2:36:00', '24/07/2011 2:37:00', '24/07/2011 2:38:00', '24/07/2011 2:39:00', '24/07/2011 2:40:00', '24/07/2011 2:41:00', '24/07/2011 2:42:00', '24/07/2011 2:43:00', '24/07/2011 2:44:00', '24/07/2011 2:45:00', '24/07/2011 2:46:00', '24/07/2011 2:47:00', '24/07/2011 2:48:00', '24/07/2011 2:49:00', '24/07/2011 2:50:00', '24/07/2011 2:51:00', '24/07/2011 2:52:00', '24/07/2011 2:53:00', '24/07/2011 2:54:00', '24/07/2011 2:55:00', '24/07/2011 2:56:00', '24/07/2011 2:57:00', '24/07/2011 2:58:00', '24/07/2011 2:59:00', '24/07/2011 3:00:00', '24/07/2011 3:01:00', '24/07/2011 3:02:00', '24/07/2011 3:03:00', '24/07/2011 3:04:00', '24/07/2011 3:05:00', '24/07/2011 3:06:00', '24/07/2011 3:07:00', '24/07/2011 3:08:00', '24/07/2011 3:09:00', '24/07/2011 3:10:00', '24/07/2011 3:11:00', '24/07/2011 3:12:00', '24/07/2011 3:13:00', '24/07/2011 3:14:00', '24/07/2011 3:15:00', '24/07/2011 3:16:00', '24/07/2011 3:17:00', '24/07/2011 3:18:00', '24/07/2011 3:19:00', '24/07/2011 3:20:00', '24/07/2011 3:21:00', '24/07/2011 3:22:00', '24/07/2011 3:23:00', '24/07/2011 3:24:00', 
'24/07/2011 3:25:00', '24/07/2011 3:26:00', '24/07/2011 3:27:00', '24/07/2011 3:28:00', '24/07/2011 3:29:00', '24/07/2011 3:30:00', '24/07/2011 3:31:00', '24/07/2011 3:32:00', '24/07/2011 3:33:00', '24/07/2011 3:34:00', '24/07/2011 3:35:00', '24/07/2011 3:36:00', '24/07/2011 3:37:00', '24/07/2011 3:38:00', '24/07/2011 3:39:00', '24/07/2011 3:40:00', '24/07/2011 3:41:00', '24/07/2011 3:42:00', '24/07/2011 3:43:00', '24/07/2011 3:44:00', '24/07/2011 3:45:00', '24/07/2011 3:46:00', '24/07/2011 3:47:00', '24/07/2011 3:48:00', '24/07/2011 3:49:00', '24/07/2011 3:50:00', '24/07/2011 3:51:00', '24/07/2011 3:52:00', '24/07/2011 3:53:00', '24/07/2011 3:54:00', '24/07/2011 3:55:00', '24/07/2011 3:56:00', '24/07/2011 3:57:00', '24/07/2011 3:58:00', '24/07/2011 3:59:00', '24/07/2011 4:00:00', '24/07/2011 4:01:00', '24/07/2011 4:02:00', '24/07/2011 4:03:00', '24/07/2011 4:04:00', '24/07/2011 4:05:00', '24/07/2011 4:06:00', '24/07/2011 4:07:00', '24/07/2011 4:08:00', '24/07/2011 4:09:00', '24/07/2011 4:10:00', '24/07/2011 4:11:00', '24/07/2011 4:12:00', '24/07/2011 4:13:00', '24/07/2011 4:14:00', '24/07/2011 4:15:00', '24/07/2011 4:16:00', '24/07/2011 4:17:00', '24/07/2011 4:18:00', '24/07/2011 4:19:00', '24/07/2011 4:20:00', '24/07/2011 4:21:00', '24/07/2011 4:22:00', '24/07/2011 4:23:00', '24/07/2011 4:24:00', '24/07/2011 4:25:00', '24/07/2011 4:26:00', '24/07/2011 4:27:00', '24/07/2011 4:28:00', '24/07/2011 4:29:00', '24/07/2011 4:30:00', '24/07/2011 4:31:00', '24/07/2011 4:32:00', '24/07/2011 4:33:00', '24/07/2011 4:34:00', '24/07/2011 4:35:00', '24/07/2011 4:36:00', '24/07/2011 4:37:00', '24/07/2011 4:38:00', '24/07/2011 4:39:00', '24/07/2011 4:40:00', '24/07/2011 4:41:00', '24/07/2011 4:42:00', '24/07/2011 4:43:00', '24/07/2011 4:44:00', '24/07/2011 4:45:00', '24/07/2011 4:46:00', '24/07/2011 4:47:00', '24/07/2011 4:48:00', '24/07/2011 4:49:00', '24/07/2011 4:50:00', '24/07/2011 4:51:00', '24/07/2011 4:52:00', '24/07/2011 4:53:00', '24/07/2011 4:54:00', '24/07/2011 
4:55:00', '24/07/2011 4:56:00', '24/07/2011 4:57:00', '24/07/2011 4:58:00', '24/07/2011 4:59:00', '24/07/2011 5:00:00', '24/07/2011 5:01:00', '24/07/2011 5:02:00', '24/07/2011 5:03:00', '24/07/2011 5:04:00', '24/07/2011 5:05:00', '24/07/2011 5:06:00', '24/07/2011 5:07:00', '24/07/2011 5:08:00', '24/07/2011 5:09:00', '24/07/2011 5:10:00', '24/07/2011 5:11:00', '24/07/2011 5:12:00', '24/07/2011 5:13:00', '24/07/2011 5:14:00', '24/07/2011 5:15:00', '24/07/2011 5:16:00', '24/07/2011 5:17:00', '24/07/2011 5:18:00', '24/07/2011 5:19:00', '24/07/2011 5:20:00', '24/07/2011 5:21:00', '24/07/2011 5:22:00', '24/07/2011 5:23:00', '24/07/2011 5:24:00', '24/07/2011 5:25:00', '24/07/2011 5:26:00', '24/07/2011 5:27:00', '24/07/2011 5:28:00', '24/07/2011 5:29:00', '24/07/2011 5:30:00', '24/07/2011 5:31:00', '24/07/2011 5:32:00', '24/07/2011 5:33:00', '24/07/2011 5:34:00', '24/07/2011 5:35:00', '24/07/2011 5:36:00', '24/07/2011 5:37:00', '24/07/2011 5:38:00', '24/07/2011 5:39:00', '24/07/2011 5:40:00', '24/07/2011 5:41:00', '24/07/2011 5:42:00', '24/07/2011 5:43:00', '24/07/2011 5:44:00', '24/07/2011 5:45:00', '24/07/2011 5:46:00', '24/07/2011 5:47:00', '24/07/2011 5:48:00', '24/07/2011 5:49:00', '24/07/2011 5:50:00', '24/07/2011 5:51:00', '24/07/2011 5:52:00', '24/07/2011 5:53:00', '24/07/2011 5:54:00', '24/07/2011 5:55:00', '24/07/2011 5:56:00', '24/07/2011 5:57:00', '24/07/2011 5:58:00', '24/07/2011 5:59:00', '24/07/2011 6:00:00', '24/07/2011 6:01:00', '24/07/2011 6:02:00', '24/07/2011 6:03:00', '24/07/2011 6:04:00', '24/07/2011 6:05:00', '24/07/2011 6:06:00', '24/07/2011 6:07:00', '24/07/2011 6:08:00', '24/07/2011 6:09:00', '24/07/2011 6:10:00', '24/07/2011 6:11:00', '24/07/2011 6:12:00', '24/07/2011 6:13:00', '24/07/2011 6:14:00', '24/07/2011 6:15:00', '24/07/2011 6:16:00', '24/07/2011 6:17:00', '24/07/2011 6:18:00', '24/07/2011 6:19:00', '24/07/2011 6:20:00', '24/07/2011 6:21:00', '24/07/2011 6:22:00', '24/07/2011 6:23:00', '24/07/2011 6:24:00', '24/07/2011 6:25:00', 
'24/07/2011 6:26:00', '24/07/2011 6:27:00', '24/07/2011 6:28:00', '24/07/2011 6:29:00', '24/07/2011 6:30:00', '24/07/2011 6:31:00', '24/07/2011 6:32:00', '24/07/2011 6:33:00', '24/07/2011 6:34:00', '24/07/2011 6:35:00', '24/07/2011 6:36:00', '24/07/2011 6:37:00', '24/07/2011 6:38:00', '24/07/2011 6:39:00', '24/07/2011 6:40:00', '24/07/2011 6:41:00', '24/07/2011 6:42:00', '24/07/2011 6:43:00', '24/07/2011 6:44:00', '24/07/2011 6:45:00', '24/07/2011 6:46:00', '24/07/2011 6:47:00', '24/07/2011 6:48:00', '24/07/2011 6:49:00', '24/07/2011 6:50:00', '24/07/2011 6:51:00', '24/07/2011 6:52:00', '24/07/2011 6:53:00', '24/07/2011 6:54:00', '24/07/2011 6:55:00', '24/07/2011 6:56:00', '24/07/2011 6:57:00', '24/07/2011 6:58:00', '24/07/2011 6:59:00', '24/07/2011 7:00:00', '24/07/2011 7:01:00', '24/07/2011 7:02:00', '24/07/2011 7:03:00', '24/07/2011 7:04:00', '24/07/2011 7:05:00', '24/07/2011 7:06:00', '24/07/2011 7:07:00', '24/07/2011 7:08:00', '24/07/2011 7:09:00', '24/07/2011 7:10:00', '24/07/2011 7:11:00', '24/07/2011 7:12:00', '24/07/2011 7:13:00', '24/07/2011 7:14:00', '24/07/2011 7:15:00', '24/07/2011 7:16:00', '24/07/2011 7:17:00', '24/07/2011 7:18:00', '24/07/2011 7:19:00', '24/07/2011 7:20:00', '24/07/2011 7:21:00', '24/07/2011 7:22:00', '24/07/2011 7:23:00', '24/07/2011 7:24:00', '24/07/2011 7:25:00', '24/07/2011 7:26:00', '24/07/2011 7:27:00', '24/07/2011 7:28:00', '24/07/2011 7:29:00', '24/07/2011 7:30:00', '24/07/2011 7:31:00', '24/07/2011 7:32:00', '24/07/2011 7:33:00', '24/07/2011 7:34:00', '24/07/2011 7:35:00', '24/07/2011 7:36:00', '24/07/2011 7:37:00', '24/07/2011 7:38:00', '24/07/2011 7:39:00', '24/07/2011 7:40:00', '24/07/2011 7:41:00', '24/07/2011 7:42:00', '24/07/2011 7:43:00', '24/07/2011 7:44:00', '24/07/2011 7:45:00', '24/07/2011 7:46:00', '24/07/2011 7:47:00', '24/07/2011 7:48:00', '24/07/2011 7:49:00', '24/07/2011 7:50:00', '24/07/2011 7:51:00', '24/07/2011 7:52:00', '24/07/2011 7:53:00', '24/07/2011 7:54:00', '24/07/2011 7:55:00', '24/07/2011 
7:56:00', '24/07/2011 7:57:00', '24/07/2011 7:58:00', '24/07/2011 7:59:00', '24/07/2011 8:00:00', '24/07/2011 8:01:00', '24/07/2011 8:02:00', '24/07/2011 8:03:00', '24/07/2011 8:04:00', '24/07/2011 8:05:00', '24/07/2011 8:06:00', '24/07/2011 8:07:00', '24/07/2011 8:08:00', '24/07/2011 8:09:00', '24/07/2011 8:10:00', '24/07/2011 8:11:00', '24/07/2011 8:12:00', '24/07/2011 8:13:00', '24/07/2011 8:14:00', '24/07/2011 8:15:00', '24/07/2011 8:16:00', '24/07/2011 8:17:00', '24/07/2011 8:18:00', '24/07/2011 8:19:00', '24/07/2011 8:20:00', '24/07/2011 8:21:00', '24/07/2011 8:22:00', '24/07/2011 8:23:00', '24/07/2011 8:24:00', '24/07/2011 8:25:00', '24/07/2011 8:26:00', '24/07/2011 8:27:00', '24/07/2011 8:28:00', '24/07/2011 8:29:00', '24/07/2011 8:30:00', '24/07/2011 8:31:00', '24/07/2011 8:32:00', '24/07/2011 8:33:00', '24/07/2011 8:34:00', '24/07/2011 8:35:00', '24/07/2011 8:36:00', '24/07/2011 8:37:00', '24/07/2011 8:38:00', '24/07/2011 8:39:00', '24/07/2011 8:40:00', '24/07/2011 8:41:00', '24/07/2011 8:42:00', '24/07/2011 8:43:00', '24/07/2011 8:44:00', '24/07/2011 8:45:00', '24/07/2011 8:46:00', '24/07/2011 8:47:00', '24/07/2011 8:48:00', '24/07/2011 8:49:00', '24/07/2011 8:50:00', '24/07/2011 8:51:00', '24/07/2011 8:52:00', '24/07/2011 8:53:00', '24/07/2011 8:54:00', '24/07/2011 8:55:00', '24/07/2011 8:56:00', '24/07/2011 8:57:00', '24/07/2011 8:58:00', '24/07/2011 8:59:00', '24/07/2011 9:00:00', '24/07/2011 9:01:00', '24/07/2011 9:02:00', '24/07/2011 9:03:00', '24/07/2011 9:04:00', '24/07/2011 9:05:00', '24/07/2011 9:06:00', '24/07/2011 9:07:00', '24/07/2011 9:08:00', '24/07/2011 9:09:00', '24/07/2011 9:10:00', '24/07/2011 9:11:00', '24/07/2011 9:12:00', '24/07/2011 9:13:00', '24/07/2011 9:14:00', '24/07/2011 9:15:00', '24/07/2011 9:16:00', '24/07/2011 9:17:00', '24/07/2011 9:18:00', '24/07/2011 9:19:00', '24/07/2011 9:20:00', '24/07/2011 9:21:00', '24/07/2011 9:22:00', '24/07/2011 9:23:00', '24/07/2011 9:24:00', '24/07/2011 9:25:00', '24/07/2011 9:26:00', 
'24/07/2011 9:27:00', '24/07/2011 9:28:00', '24/07/2011 9:29:00', '24/07/2011 9:30:00', '24/07/2011 9:31:00', '24/07/2011 9:32:00', '24/07/2011 9:33:00', '24/07/2011 9:34:00', '24/07/2011 9:35:00', '24/07/2011 9:36:00', '24/07/2011 9:37:00', '24/07/2011 9:38:00', '24/07/2011 9:39:00', '24/07/2011 9:40:00', '24/07/2011 9:41:00', '24/07/2011 9:42:00', '24/07/2011 9:43:00', '24/07/2011 9:44:00', '24/07/2011 9:45:00', '24/07/2011 9:46:00', '24/07/2011 9:47:00', '24/07/2011 9:48:00', '24/07/2011 9:49:00', '24/07/2011 9:50:00', '24/07/2011 9:51:00', '24/07/2011 9:52:00', '24/07/2011 9:53:00', '24/07/2011 9:54:00', '24/07/2011 9:55:00', '24/07/2011 9:56:00', '24/07/2011 9:57:00', '24/07/2011 9:58:00', '24/07/2011 9:59:00', '24/07/2011 10:00:00', '24/07/2011 10:01:00', '24/07/2011 10:02:00', '24/07/2011 10:03:00', '24/07/2011 10:04:00', '24/07/2011 10:05:00', '24/07/2011 10:06:00', '24/07/2011 10:07:00', '24/07/2011 10:08:00', '24/07/2011 10:09:00', '24/07/2011 10:10:00', '24/07/2011 10:11:00', '24/07/2011 10:12:00', '24/07/2011 10:13:00', '24/07/2011 10:14:00', '24/07/2011 10:15:00', '24/07/2011 10:16:00', '24/07/2011 10:17:00', '24/07/2011 10:18:00', '24/07/2011 10:19:00', '24/07/2011 10:20:00', '24/07/2011 10:21:00', '24/07/2011 10:22:00', '24/07/2011 10:23:00', '24/07/2011 10:24:00', '24/07/2011 10:25:00', '24/07/2011 10:26:00', '24/07/2011 10:27:00', '24/07/2011 10:28:00', '24/07/2011 10:29:00', '24/07/2011 10:30:00', '24/07/2011 10:31:00', '24/07/2011 10:32:00', '24/07/2011 10:33:00', '24/07/2011 10:34:00', '24/07/2011 10:35:00', '24/07/2011 10:36:00', '24/07/2011 10:37:00', '24/07/2011 10:38:00', '24/07/2011 10:39:00', '24/07/2011 10:40:00', '24/07/2011 10:41:00', '24/07/2011 10:42:00', '24/07/2011 10:43:00', '24/07/2011 10:44:00', '24/07/2011 10:45:00', '24/07/2011 10:46:00', '24/07/2011 10:47:00', '24/07/2011 10:48:00', '24/07/2011 10:49:00', '24/07/2011 10:50:00', '24/07/2011 10:51:00', '24/07/2011 10:52:00', '24/07/2011 10:53:00', '24/07/2011 10:54:00', 
'24/07/2011 10:55:00', '24/07/2011 10:56:00', '24/07/2011 10:57:00', '24/07/2011 10:58:00', '24/07/2011 10:59:00', '24/07/2011 11:00:00', '24/07/2011 11:01:00', '24/07/2011 11:02:00', '24/07/2011 11:03:00', '24/07/2011 11:04:00', '24/07/2011 11:05:00', '24/07/2011 11:06:00', '24/07/2011 11:07:00', '24/07/2011 11:08:00', '24/07/2011 11:09:00', '24/07/2011 11:10:00', '24/07/2011 11:11:00', '24/07/2011 11:12:00', '24/07/2011 11:13:00', '24/07/2011 11:14:00', '24/07/2011 11:15:00', '24/07/2011 11:16:00', '24/07/2011 11:17:00', '24/07/2011 11:18:00', '24/07/2011 11:19:00', '24/07/2011 11:20:00', '24/07/2011 11:21:00', '24/07/2011 11:22:00', '24/07/2011 11:23:00', '24/07/2011 11:24:00', '24/07/2011 11:25:00', '24/07/2011 11:26:00', '24/07/2011 11:27:00', '24/07/2011 11:28:00', '24/07/2011 11:29:00', '24/07/2011 11:30:00', '24/07/2011 11:31:00', '24/07/2011 11:32:00', '24/07/2011 11:33:00', '24/07/2011 11:34:00', '24/07/2011 11:35:00', '24/07/2011 11:36:00', '24/07/2011 11:37:00', '24/07/2011 11:38:00', '24/07/2011 11:39:00', '24/07/2011 11:40:00', '24/07/2011 11:41:00', '24/07/2011 11:42:00', '24/07/2011 11:43:00', '24/07/2011 11:44:00', '24/07/2011 11:45:00', '24/07/2011 11:46:00', '24/07/2011 11:47:00', '24/07/2011 11:48:00', '24/07/2011 11:49:00', '24/07/2011 11:50:00', '24/07/2011 11:51:00', '24/07/2011 11:52:00', '24/07/2011 11:53:00', '24/07/2011 11:54:00', '24/07/2011 11:55:00', '24/07/2011 11:56:00', '24/07/2011 11:57:00', '24/07/2011 11:58:00', '24/07/2011 11:59:00', '24/07/2011 12:00:00', '24/07/2011 12:01:00', '24/07/2011 12:02:00', '24/07/2011 12:03:00', '24/07/2011 12:04:00', '24/07/2011 12:05:00', '24/07/2011 12:06:00', '24/07/2011 12:07:00', '24/07/2011 12:08:00', '24/07/2011 12:09:00', '24/07/2011 12:10:00', '24/07/2011 12:11:00', '24/07/2011 12:12:00', '24/07/2011 12:13:00', '24/07/2011 12:14:00', '24/07/2011 12:15:00', '24/07/2011 12:16:00', '24/07/2011 12:17:00', '24/07/2011 12:18:00', '24/07/2011 12:19:00', '24/07/2011 12:20:00', '24/07/2011 
12:21:00', '24/07/2011 12:22:00', '24/07/2011 12:23:00', '24/07/2011 12:24:00', '24/07/2011 12:25:00', '24/07/2011 12:26:00', '24/07/2011 12:27:00', '24/07/2011 12:28:00', '24/07/2011 12:29:00', '24/07/2011 12:30:00', '24/07/2011 12:31:00', '24/07/2011 12:32:00', '24/07/2011 12:33:00', '24/07/2011 12:34:00', '24/07/2011 12:35:00', '24/07/2011 12:36:00', '24/07/2011 12:37:00', '24/07/2011 12:38:00', '24/07/2011 12:39:00', '24/07/2011 12:40:00', '24/07/2011 12:41:00', '24/07/2011 12:42:00', '24/07/2011 12:43:00', '24/07/2011 12:44:00', '24/07/2011 12:45:00', '24/07/2011 12:46:00', '24/07/2011 12:47:00', '24/07/2011 12:48:00', '24/07/2011 12:49:00', '24/07/2011 12:50:00', '24/07/2011 12:51:00', '24/07/2011 12:52:00', '24/07/2011 12:53:00', '24/07/2011 12:54:00', '24/07/2011 12:55:00', '24/07/2011 12:56:00', '24/07/2011 12:57:00', '24/07/2011 12:58:00', '24/07/2011 12:59:00', '24/07/2011 13:00:00', '24/07/2011 13:01:00', '24/07/2011 13:02:00', '24/07/2011 13:03:00', '24/07/2011 13:04:00', '24/07/2011 13:05:00', '24/07/2011 13:06:00', '24/07/2011 13:07:00', '24/07/2011 13:08:00', '24/07/2011 13:09:00', '24/07/2011 13:10:00', '24/07/2011 13:11:00', '24/07/2011 13:12:00', '24/07/2011 13:13:00', '24/07/2011 13:14:00', '24/07/2011 13:15:00', '24/07/2011 13:16:00', '24/07/2011 13:17:00', '24/07/2011 13:18:00', '24/07/2011 13:19:00', '24/07/2011 13:20:00', '24/07/2011 13:21:00', '24/07/2011 13:22:00', '24/07/2011 13:23:00', '24/07/2011 13:24:00', '24/07/2011 13:25:00', '24/07/2011 13:26:00', '24/07/2011 13:27:00', '24/07/2011 13:28:00', '24/07/2011 13:29:00', '24/07/2011 13:30:00', '24/07/2011 13:31:00', '24/07/2011 13:32:00', '24/07/2011 13:33:00', '24/07/2011 13:34:00', '24/07/2011 13:35:00', '24/07/2011 13:36:00', '24/07/2011 13:37:00', '24/07/2011 13:38:00', '24/07/2011 13:39:00', '24/07/2011 13:40:00', '24/07/2011 13:41:00', '24/07/2011 13:42:00', '24/07/2011 13:43:00', '24/07/2011 13:44:00', '24/07/2011 13:45:00', '24/07/2011 13:46:00', '24/07/2011 13:47:00', 
'24/07/2011 13:48:00', '24/07/2011 13:49:00', '24/07/2011 13:50:00', '24/07/2011 13:51:00', '24/07/2011 13:52:00', '24/07/2011 13:53:00', '24/07/2011 13:54:00', '24/07/2011 13:55:00', '24/07/2011 13:56:00', '24/07/2011 13:57:00', '24/07/2011 13:58:00', '24/07/2011 13:59:00', '24/07/2011 14:00:00', '24/07/2011 14:01:00', '24/07/2011 14:02:00', '24/07/2011 14:03:00', '24/07/2011 14:04:00', '24/07/2011 14:05:00', '24/07/2011 14:06:00', '24/07/2011 14:07:00', '24/07/2011 14:08:00', '24/07/2011 14:09:00', '24/07/2011 14:10:00', '24/07/2011 14:11:00', '24/07/2011 14:12:00', '24/07/2011 14:13:00', '24/07/2011 14:14:00', '24/07/2011 14:15:00', '24/07/2011 14:16:00', '24/07/2011 14:17:00', '24/07/2011 14:18:00', '24/07/2011 14:19:00', '24/07/2011 14:20:00', '24/07/2011 14:21:00', '24/07/2011 14:22:00', '24/07/2011 14:23:00', '24/07/2011 14:24:00', '24/07/2011 14:25:00', '24/07/2011 14:26:00', '24/07/2011 14:27:00', '24/07/2011 14:28:00', '24/07/2011 14:29:00', '24/07/2011 14:30:00', '24/07/2011 14:31:00', '24/07/2011 14:32:00', '24/07/2011 14:33:00', '24/07/2011 14:34:00', '24/07/2011 14:35:00', '24/07/2011 14:36:00', '24/07/2011 14:37:00', '24/07/2011 14:38:00', '24/07/2011 14:39:00', '24/07/2011 14:40:00', '24/07/2011 14:41:00', '24/07/2011 14:42:00', '24/07/2011 14:43:00', '24/07/2011 14:44:00', '24/07/2011 14:45:00', '24/07/2011 14:46:00', '24/07/2011 14:47:00', '24/07/2011 14:48:00', '24/07/2011 14:49:00', '24/07/2011 14:50:00', '24/07/2011 14:51:00', '24/07/2011 14:52:00', '24/07/2011 14:53:00', '24/07/2011 14:54:00', '24/07/2011 14:55:00', '24/07/2011 14:56:00', '24/07/2011 14:57:00', '24/07/2011 14:58:00', '24/07/2011 14:59:00', '24/07/2011 15:00:00', '24/07/2011 15:01:00', '24/07/2011 15:02:00', '24/07/2011 15:03:00', '24/07/2011 15:04:00', '24/07/2011 15:05:00', '24/07/2011 15:06:00', '24/07/2011 15:07:00', '24/07/2011 15:08:00', '24/07/2011 15:09:00', '24/07/2011 15:10:00', '24/07/2011 15:11:00', '24/07/2011 15:12:00', '24/07/2011 15:13:00', '24/07/2011 
15:14:00', '24/07/2011 15:15:00', '24/07/2011 15:16:00', '24/07/2011 15:17:00', '24/07/2011 15:18:00', '24/07/2011 15:19:00', '24/07/2011 15:20:00', '24/07/2011 15:21:00', '24/07/2011 15:22:00', '24/07/2011 15:23:00', '24/07/2011 15:24:00', '24/07/2011 15:25:00', '24/07/2011 15:26:00', '24/07/2011 15:27:00', '24/07/2011 15:28:00', '24/07/2011 15:29:00', '24/07/2011 15:30:00', '24/07/2011 15:31:00', '24/07/2011 15:32:00', '24/07/2011 15:33:00', '24/07/2011 15:34:00', '24/07/2011 15:35:00', '24/07/2011 15:36:00', '24/07/2011 15:37:00', '24/07/2011 15:38:00', '24/07/2011 15:39:00', '24/07/2011 15:40:00', '24/07/2011 15:41:00', '24/07/2011 15:42:00', '24/07/2011 15:43:00', '24/07/2011 15:44:00', '24/07/2011 15:45:00', '24/07/2011 15:46:00', '24/07/2011 15:47:00', '24/07/2011 15:48:00', '24/07/2011 15:49:00', '24/07/2011 15:50:00', '24/07/2011 15:51:00', '24/07/2011 15:52:00', '24/07/2011 15:53:00', '24/07/2011 15:54:00', '24/07/2011 15:55:00', '24/07/2011 15:56:00', '24/07/2011 15:57:00', '24/07/2011 15:58:00', '24/07/2011 15:59:00', '24/07/2011 16:00:00', '24/07/2011 16:01:00', '24/07/2011 16:02:00', '24/07/2011 16:03:00', '24/07/2011 16:04:00', '24/07/2011 16:05:00', '24/07/2011 16:06:00', '24/07/2011 16:07:00', '24/07/2011 16:08:00', '24/07/2011 16:09:00', '24/07/2011 16:10:00', '24/07/2011 16:11:00', '24/07/2011 16:12:00', '24/07/2011 16:13:00', '24/07/2011 16:14:00', '24/07/2011 16:15:00', '24/07/2011 16:16:00', '24/07/2011 16:17:00', '24/07/2011 16:18:00', '24/07/2011 16:19:00', '24/07/2011 16:20:00', '24/07/2011 16:21:00', '24/07/2011 16:22:00', '24/07/2011 16:23:00', '24/07/2011 16:24:00', '24/07/2011 16:25:00', '24/07/2011 16:26:00', '24/07/2011 16:27:00', '24/07/2011 16:28:00', '24/07/2011 16:29:00', '24/07/2011 16:30:00', '24/07/2011 16:31:00', '24/07/2011 16:32:00', '24/07/2011 16:33:00', '24/07/2011 16:34:00', '24/07/2011 16:35:00', '24/07/2011 16:36:00', '24/07/2011 16:37:00', '24/07/2011 16:38:00', '24/07/2011 16:39:00', '24/07/2011 16:40:00', 
'24/07/2011 16:41:00', '24/07/2011 16:42:00', '24/07/2011 16:43:00', '24/07/2011 16:44:00', '24/07/2011 16:45:00', '24/07/2011 16:46:00', '24/07/2011 16:47:00', '24/07/2011 16:48:00', '24/07/2011 16:49:00', '24/07/2011 16:50:00', '24/07/2011 16:51:00', '24/07/2011 16:52:00', '24/07/2011 16:53:00', '24/07/2011 16:54:00', '24/07/2011 16:55:00', '24/07/2011 16:56:00', '24/07/2011 16:57:00', '24/07/2011 16:58:00', '24/07/2011 16:59:00', '24/07/2011 17:00:00', '24/07/2011 17:01:00', '24/07/2011 17:02:00', '24/07/2011 17:03:00', '24/07/2011 17:04:00', '24/07/2011 17:05:00', '24/07/2011 17:06:00', '24/07/2011 17:07:00', '24/07/2011 17:08:00', '24/07/2011 17:09:00', '24/07/2011 17:10:00', '24/07/2011 17:11:00', '24/07/2011 17:12:00', '24/07/2011 17:13:00', '24/07/2011 17:14:00', '24/07/2011 17:15:00', '24/07/2011 17:16:00', '24/07/2011 17:17:00', '24/07/2011 17:18:00', '24/07/2011 17:19:00', '24/07/2011 17:20:00', '24/07/2011 17:21:00', '24/07/2011 17:22:00', '24/07/2011 17:23:00', '24/07/2011 17:24:00', '24/07/2011 17:25:00', '24/07/2011 17:26:00', '24/07/2011 17:27:00', '24/07/2011 17:28:00', '24/07/2011 17:29:00', '24/07/2011 17:30:00', '24/07/2011 17:31:00', '24/07/2011 17:32:00', '24/07/2011 17:33:00', '24/07/2011 17:34:00', '24/07/2011 17:35:00', '24/07/2011 17:36:00', '24/07/2011 17:37:00', '24/07/2011 17:38:00', '24/07/2011 17:39:00', '24/07/2011 17:40:00', '24/07/2011 17:41:00', '24/07/2011 17:42:00', '24/07/2011 17:43:00', '24/07/2011 17:44:00', '24/07/2011 17:45:00', '24/07/2011 17:46:00', '24/07/2011 17:47:00', '24/07/2011 17:48:00', '24/07/2011 17:49:00', '24/07/2011 17:50:00', '24/07/2011 17:51:00', '24/07/2011 17:52:00', '24/07/2011 17:53:00', '24/07/2011 17:54:00', '24/07/2011 17:55:00', '24/07/2011 17:56:00', '24/07/2011 17:57:00', '24/07/2011 17:58:00', '24/07/2011 17:59:00', '24/07/2011 18:00:00', '24/07/2011 18:01:00', '24/07/2011 18:02:00', '24/07/2011 18:03:00', '24/07/2011 18:04:00', '24/07/2011 18:05:00', '24/07/2011 18:06:00', '24/07/2011 
18:07:00', '24/07/2011 18:08:00', '24/07/2011 18:09:00', '24/07/2011 18:10:00', '24/07/2011 18:11:00', '24/07/2011 18:12:00', '24/07/2011 18:13:00', '24/07/2011 18:14:00', '24/07/2011 18:15:00', '24/07/2011 18:16:00', '24/07/2011 18:17:00', '24/07/2011 18:18:00', '24/07/2011 18:19:00', '24/07/2011 18:20:00', '24/07/2011 18:21:00', '24/07/2011 18:22:00', '24/07/2011 18:23:00', '24/07/2011 18:24:00', '24/07/2011 18:25:00', '24/07/2011 18:26:00', '24/07/2011 18:27:00', '24/07/2011 18:28:00', '24/07/2011 18:29:00', '24/07/2011 18:30:00', '24/07/2011 18:31:00', '24/07/2011 18:32:00', '24/07/2011 18:33:00', '24/07/2011 18:34:00', '24/07/2011 18:35:00', '24/07/2011 18:36:00', '24/07/2011 18:37:00', '24/07/2011 18:38:00', '24/07/2011 18:39:00', '24/07/2011 18:40:00', '24/07/2011 18:41:00', '24/07/2011 18:42:00', '24/07/2011 18:43:00', '24/07/2011 18:44:00', '24/07/2011 18:45:00', '24/07/2011 18:46:00', '24/07/2011 18:47:00', '24/07/2011 18:48:00', '24/07/2011 18:49:00', '24/07/2011 18:50:00', '24/07/2011 18:51:00', '24/07/2011 18:52:00', '24/07/2011 18:53:00', '24/07/2011 18:54:00', '24/07/2011 18:55:00', '24/07/2011 18:56:00', '24/07/2011 18:57:00', '24/07/2011 18:58:00', '24/07/2011 18:59:00', '24/07/2011 19:00:00', '24/07/2011 19:01:00', '24/07/2011 19:02:00', '24/07/2011 19:03:00', '24/07/2011 19:04:00', '24/07/2011 19:05:00', '24/07/2011 19:06:00', '24/07/2011 19:07:00', '24/07/2011 19:08:00', '24/07/2011 19:09:00', '24/07/2011 19:10:00', '24/07/2011 19:11:00', '24/07/2011 19:12:00', '24/07/2011 19:13:00', '24/07/2011 19:14:00', '24/07/2011 19:15:00', '24/07/2011 19:16:00', '24/07/2011 19:17:00', '24/07/2011 19:18:00', '24/07/2011 19:19:00', '24/07/2011 19:20:00', '24/07/2011 19:21:00', '24/07/2011 19:22:00', '24/07/2011 19:23:00', '24/07/2011 19:24:00', '24/07/2011 19:25:00', '24/07/2011 19:26:00', '24/07/2011 19:27:00', '24/07/2011 19:28:00', '24/07/2011 19:29:00', '24/07/2011 19:30:00', '24/07/2011 19:31:00', '24/07/2011 19:32:00', '24/07/2011 19:33:00', 
'24/07/2011 19:34:00', '24/07/2011 19:35:00', '24/07/2011 19:36:00', '24/07/2011 19:37:00', '24/07/2011 19:38:00', '24/07/2011 19:39:00', '24/07/2011 19:40:00', '24/07/2011 19:41:00', '24/07/2011 19:42:00', '24/07/2011 19:43:00', '24/07/2011 19:44:00', '24/07/2011 19:45:00', '24/07/2011 19:46:00', '24/07/2011 19:47:00', '24/07/2011 19:48:00', '24/07/2011 19:49:00', '24/07/2011 19:50:00', '24/07/2011 19:51:00', '24/07/2011 19:52:00', '24/07/2011 19:53:00', '24/07/2011 19:54:00', '24/07/2011 19:55:00', '24/07/2011 19:56:00', '24/07/2011 19:57:00', '24/07/2011 19:58:00', '24/07/2011 19:59:00', '24/07/2011 20:00:00', '24/07/2011 20:01:00', '24/07/2011 20:02:00', '24/07/2011 20:03:00', '24/07/2011 20:04:00', '24/07/2011 20:05:00', '24/07/2011 20:06:00', '24/07/2011 20:07:00', '24/07/2011 20:08:00', '24/07/2011 20:09:00', '24/07/2011 20:10:00', '24/07/2011 20:11:00', '24/07/2011 20:12:00', '24/07/2011 20:13:00', '24/07/2011 20:14:00', '24/07/2011 20:15:00', '24/07/2011 20:16:00', '24/07/2011 20:17:00', '24/07/2011 20:18:00', '24/07/2011 20:19:00', '24/07/2011 20:20:00', '24/07/2011 20:21:00', '24/07/2011 20:22:00', '24/07/2011 20:23:00', '24/07/2011 20:24:00', '24/07/2011 20:25:00', '24/07/2011 20:26:00', '24/07/2011 20:27:00', '24/07/2011 20:28:00', '24/07/2011 20:29:00', '24/07/2011 20:30:00', '24/07/2011 20:31:00', '24/07/2011 20:32:00', '24/07/2011 20:33:00', '24/07/2011 20:34:00', '24/07/2011 20:35:00', '24/07/2011 20:36:00', '24/07/2011 20:37:00', '24/07/2011 20:38:00', '24/07/2011 20:39:00', '24/07/2011 20:40:00', '24/07/2011 20:41:00', '24/07/2011 20:42:00', '24/07/2011 20:43:00', '24/07/2011 20:44:00', '24/07/2011 20:45:00', '24/07/2011 20:46:00', '24/07/2011 20:47:00', '24/07/2011 20:48:00', '24/07/2011 20:49:00', '24/07/2011 20:50:00', '24/07/2011 20:51:00', '24/07/2011 20:52:00', '24/07/2011 20:53:00', '24/07/2011 20:54:00', '24/07/2011 20:55:00', '24/07/2011 20:56:00', '24/07/2011 20:57:00', '24/07/2011 20:58:00', '24/07/2011 20:59:00', '24/07/2011 
21:00:00', '24/07/2011 21:01:00', '24/07/2011 21:02:00', '24/07/2011 21:03:00', '24/07/2011 21:04:00', '24/07/2011 21:05:00', '24/07/2011 21:06:00', '24/07/2011 21:07:00', '24/07/2011 21:08:00', '24/07/2011 21:09:00', '24/07/2011 21:10:00', '24/07/2011 21:11:00', '24/07/2011 21:12:00', '24/07/2011 21:13:00', '24/07/2011 21:14:00', '24/07/2011 21:15:00', '24/07/2011 21:16:00', '24/07/2011 21:17:00', '24/07/2011 21:18:00', '24/07/2011 21:19:00', '24/07/2011 21:20:00', '24/07/2011 21:21:00', '24/07/2011 21:22:00', '24/07/2011 21:23:00', '24/07/2011 21:24:00', '24/07/2011 21:25:00', '24/07/2011 21:26:00', '24/07/2011 21:27:00', '24/07/2011 21:28:00', '24/07/2011 21:29:00', '24/07/2011 21:30:00', '24/07/2011 21:31:00', '24/07/2011 21:32:00', '24/07/2011 21:33:00', '24/07/2011 21:34:00', '24/07/2011 21:35:00', '24/07/2011 21:36:00', '24/07/2011 21:37:00', '24/07/2011 21:38:00', '24/07/2011 21:39:00', '24/07/2011 21:40:00', '24/07/2011 21:41:00', '24/07/2011 21:42:00', '24/07/2011 21:43:00', '24/07/2011 21:44:00', '24/07/2011 21:45:00', '24/07/2011 21:46:00', '24/07/2011 21:47:00', '24/07/2011 21:48:00', '24/07/2011 21:49:00', '24/07/2011 21:50:00', '24/07/2011 21:51:00', '24/07/2011 21:52:00', '24/07/2011 21:53:00', '24/07/2011 21:54:00', '24/07/2011 21:55:00', '24/07/2011 21:56:00', '24/07/2011 21:57:00', '24/07/2011 21:58:00', '24/07/2011 21:59:00', '24/07/2011 22:00:00', '24/07/2011 22:01:00', '24/07/2011 22:02:00', '24/07/2011 22:03:00', '24/07/2011 22:04:00', '24/07/2011 22:05:00', '24/07/2011 22:06:00', '24/07/2011 22:07:00', '24/07/2011 22:08:00', '24/07/2011 22:09:00', '24/07/2011 22:10:00', '24/07/2011 22:11:00', '24/07/2011 22:12:00', '24/07/2011 22:13:00', '24/07/2011 22:14:00', '24/07/2011 22:15:00', '24/07/2011 22:16:00', '24/07/2011 22:17:00', '24/07/2011 22:18:00', '24/07/2011 22:19:00', '24/07/2011 22:20:00', '24/07/2011 22:21:00', '24/07/2011 22:22:00', '24/07/2011 22:23:00', '24/07/2011 22:24:00', '24/07/2011 22:25:00', '24/07/2011 22:26:00', 
'24/07/2011 22:27:00', '24/07/2011 22:28:00', '24/07/2011 22:29:00', '24/07/2011 22:30:00', '24/07/2011 22:31:00', '24/07/2011 22:32:00', '24/07/2011 22:33:00', '24/07/2011 22:34:00', '24/07/2011 22:35:00', '24/07/2011 22:36:00', '24/07/2011 22:37:00', '24/07/2011 22:38:00', '24/07/2011 22:39:00', '24/07/2011 22:40:00', '24/07/2011 22:41:00', '24/07/2011 22:42:00', '24/07/2011 22:43:00', '24/07/2011 22:44:00', '24/07/2011 22:45:00', '24/07/2011 22:46:00', '24/07/2011 22:47:00', '24/07/2011 22:48:00', '24/07/2011 22:49:00', '24/07/2011 22:50:00', '24/07/2011 22:51:00', '24/07/2011 22:52:00', '24/07/2011 22:53:00', '24/07/2011 22:54:00', '24/07/2011 22:55:00', '24/07/2011 22:56:00', '24/07/2011 22:57:00', '24/07/2011 22:58:00', '24/07/2011 22:59:00', '24/07/2011 23:00:00', '24/07/2011 23:01:00', '24/07/2011 23:02:00', '24/07/2011 23:03:00', '24/07/2011 23:04:00', '24/07/2011 23:05:00', '24/07/2011 23:06:00', '24/07/2011 23:07:00', '24/07/2011 23:08:00', '24/07/2011 23:09:00', '24/07/2011 23:10:00', '24/07/2011 23:11:00', '24/07/2011 23:12:00', '24/07/2011 23:13:00', '24/07/2011 23:14:00', '24/07/2011 23:15:00', '24/07/2011 23:16:00', '24/07/2011 23:17:00', '24/07/2011 23:18:00', '24/07/2011 23:19:00', '24/07/2011 23:20:00', '24/07/2011 23:21:00', '24/07/2011 23:22:00', '24/07/2011 23:23:00', '24/07/2011 23:24:00', '24/07/2011 23:25:00', '24/07/2011 23:26:00', '24/07/2011 23:27:00', '24/07/2011 23:28:00', '24/07/2011 23:29:00', '24/07/2011 23:30:00', '24/07/2011 23:31:00', '24/07/2011 23:32:00', '24/07/2011 23:33:00', '24/07/2011 23:34:00', '24/07/2011 23:35:00', '24/07/2011 23:36:00', '24/07/2011 23:37:00', '24/07/2011 23:38:00', '24/07/2011 23:39:00', '24/07/2011 23:40:00', '24/07/2011 23:41:00', '24/07/2011 23:42:00', '24/07/2011 23:43:00', '24/07/2011 23:44:00', '24/07/2011 23:45:00', '24/07/2011 23:46:00', '24/07/2011 23:47:00', '24/07/2011 23:48:00', '24/07/2011 23:49:00', '24/07/2011 23:50:00', '24/07/2011 23:51:00', '24/07/2011 23:52:00', '24/07/2011 
23:53:00', '24/07/2011 23:54:00', '24/07/2011 23:55:00', '24/07/2011 23:56:00', '24/07/2011 23:57:00', '24/07/2011 23:58:00', '24/07/2011 23:59:00', '25/07/2011 0:00:00', '25/07/2011 0:01:00', '25/07/2011 0:02:00', '25/07/2011 0:03:00', '25/07/2011 0:04:00', '25/07/2011 0:05:00', '25/07/2011 0:06:00', '25/07/2011 0:07:00', '25/07/2011 0:08:00', '25/07/2011 0:09:00', '25/07/2011 0:10:00', '25/07/2011 0:11:00', '25/07/2011 0:12:00', '25/07/2011 0:13:00', '25/07/2011 0:14:00', '25/07/2011 0:15:00', '25/07/2011 0:16:00', '25/07/2011 0:17:00', '25/07/2011 0:18:00', '25/07/2011 0:19:00', '25/07/2011 0:20:00', '25/07/2011 0:21:00', '25/07/2011 0:22:00', '25/07/2011 0:23:00', '25/07/2011 0:24:00', '25/07/2011 0:25:00', '25/07/2011 0:26:00', '25/07/2011 0:27:00', '25/07/2011 0:28:00', '25/07/2011 0:29:00', '25/07/2011 0:30:00', '25/07/2011 0:31:00', '25/07/2011 0:32:00', '25/07/2011 0:33:00', '25/07/2011 0:34:00', '25/07/2011 0:35:00', '25/07/2011 0:36:00', '25/07/2011 0:37:00', '25/07/2011 0:38:00', '25/07/2011 0:39:00', '25/07/2011 0:40:00', '25/07/2011 0:41:00', '25/07/2011 0:42:00', '25/07/2011 0:43:00', '25/07/2011 0:44:00', '25/07/2011 0:45:00', '25/07/2011 0:46:00', '25/07/2011 0:47:00', '25/07/2011 0:48:00', '25/07/2011 0:49:00', '25/07/2011 0:50:00', '25/07/2011 0:51:00', '25/07/2011 0:52:00', '25/07/2011 0:53:00', '25/07/2011 0:54:00', '25/07/2011 0:55:00', '25/07/2011 0:56:00', '25/07/2011 0:57:00', '25/07/2011 0:58:00', '25/07/2011 0:59:00', '25/07/2011 1:00:00', '25/07/2011 1:01:00', '25/07/2011 1:02:00', '25/07/2011 1:03:00', '25/07/2011 1:04:00', '25/07/2011 1:05:00', '25/07/2011 1:06:00', '25/07/2011 1:07:00', '25/07/2011 1:08:00', '25/07/2011 1:09:00', '25/07/2011 1:10:00', '25/07/2011 1:11:00', '25/07/2011 1:12:00', '25/07/2011 1:13:00', '25/07/2011 1:14:00', '25/07/2011 1:15:00', '25/07/2011 1:16:00', '25/07/2011 1:17:00', '25/07/2011 1:18:00', '25/07/2011 1:19:00', '25/07/2011 1:20:00', '25/07/2011 1:21:00', '25/07/2011 1:22:00', '25/07/2011 1:23:00', 
'25/07/2011 1:24:00', '25/07/2011 1:25:00', '25/07/2011 1:26:00', '25/07/2011 1:27:00', '25/07/2011 1:28:00', '25/07/2011 1:29:00', '25/07/2011 1:30:00', '25/07/2011 1:31:00', '25/07/2011 1:32:00', '25/07/2011 1:33:00', '25/07/2011 1:34:00', '25/07/2011 1:35:00', '25/07/2011 1:36:00', '25/07/2011 1:37:00', '25/07/2011 1:38:00', '25/07/2011 1:39:00', '25/07/2011 1:40:00', '25/07/2011 1:41:00', '25/07/2011 1:42:00', '25/07/2011 1:43:00', '25/07/2011 1:44:00', '25/07/2011 1:45:00', '25/07/2011 1:46:00', '25/07/2011 1:47:00', '25/07/2011 1:48:00', '25/07/2011 1:49:00', '25/07/2011 1:50:00', '25/07/2011 1:51:00', '25/07/2011 1:52:00', '25/07/2011 1:53:00', '25/07/2011 1:54:00', '25/07/2011 1:55:00', '25/07/2011 1:56:00', '25/07/2011 1:57:00', '25/07/2011 1:58:00', '25/07/2011 1:59:00', '25/07/2011 2:00:00', '25/07/2011 2:01:00', '25/07/2011 2:02:00', '25/07/2011 2:03:00', '25/07/2011 2:04:00', '25/07/2011 2:05:00', '25/07/2011 2:06:00', '25/07/2011 2:07:00', '25/07/2011 2:08:00', '25/07/2011 2:09:00', '25/07/2011 2:10:00', '25/07/2011 2:11:00', '25/07/2011 2:12:00', '25/07/2011 2:13:00', '25/07/2011 2:14:00', '25/07/2011 2:15:00', '25/07/2011 2:16:00', '25/07/2011 2:17:00', '25/07/2011 2:18:00', '25/07/2011 2:19:00', '25/07/2011 2:20:00', '25/07/2011 2:21:00', '25/07/2011 2:22:00', '25/07/2011 2:23:00', '25/07/2011 2:24:00', '25/07/2011 2:25:00', '25/07/2011 2:26:00', '25/07/2011 2:27:00', '25/07/2011 2:28:00', '25/07/2011 2:29:00', '25/07/2011 2:30:00', '25/07/2011 2:31:00', '25/07/2011 2:32:00', '25/07/2011 2:33:00', '25/07/2011 2:34:00', '25/07/2011 2:35:00', '25/07/2011 2:36:00', '25/07/2011 2:37:00', '25/07/2011 2:38:00', '25/07/2011 2:39:00', '25/07/2011 2:40:00', '25/07/2011 2:41:00', '25/07/2011 2:42:00', '25/07/2011 2:43:00', '25/07/2011 2:44:00', '25/07/2011 2:45:00', '25/07/2011 2:46:00', '25/07/2011 2:47:00', '25/07/2011 2:48:00', '25/07/2011 2:49:00', '25/07/2011 2:50:00', '25/07/2011 2:51:00', '25/07/2011 2:52:00', '25/07/2011 2:53:00', '25/07/2011 
2:54:00', '25/07/2011 2:55:00', '25/07/2011 2:56:00', '25/07/2011 2:57:00', '25/07/2011 2:58:00', '25/07/2011 2:59:00', '25/07/2011 3:00:00', '25/07/2011 3:01:00', '25/07/2011 3:02:00', '25/07/2011 3:03:00', '25/07/2011 3:04:00', '25/07/2011 3:05:00', '25/07/2011 3:06:00', '25/07/2011 3:07:00', '25/07/2011 3:08:00', '25/07/2011 3:09:00', '25/07/2011 3:10:00', '25/07/2011 3:11:00', '25/07/2011 3:12:00', '25/07/2011 3:13:00', '25/07/2011 3:14:00', '25/07/2011 3:15:00', '25/07/2011 3:16:00', '25/07/2011 3:17:00', '25/07/2011 3:18:00', '25/07/2011 3:19:00', '25/07/2011 3:20:00', '25/07/2011 3:21:00', '25/07/2011 3:22:00', '25/07/2011 3:23:00', '25/07/2011 3:24:00', '25/07/2011 3:25:00', '25/07/2011 3:26:00', '25/07/2011 3:27:00', '25/07/2011 3:28:00', '25/07/2011 3:29:00', '25/07/2011 3:30:00', '25/07/2011 3:31:00', '25/07/2011 3:32:00', '25/07/2011 3:33:00', '25/07/2011 3:34:00', '25/07/2011 3:35:00', '25/07/2011 3:36:00', '25/07/2011 3:37:00', '25/07/2011 3:38:00', '25/07/2011 3:39:00', '25/07/2011 3:40:00', '25/07/2011 3:41:00', '25/07/2011 3:42:00', '25/07/2011 3:43:00', '25/07/2011 3:44:00', '25/07/2011 3:45:00', '25/07/2011 3:46:00', '25/07/2011 3:47:00', '25/07/2011 3:48:00', '25/07/2011 3:49:00', '25/07/2011 3:50:00', '25/07/2011 3:51:00', '25/07/2011 3:52:00', '25/07/2011 3:53:00', '25/07/2011 3:54:00', '25/07/2011 3:55:00', '25/07/2011 3:56:00', '25/07/2011 3:57:00', '25/07/2011 3:58:00', '25/07/2011 3:59:00', '25/07/2011 4:00:00', '25/07/2011 4:01:00', '25/07/2011 4:02:00', '25/07/2011 4:03:00', '25/07/2011 4:04:00', '25/07/2011 4:05:00', '25/07/2011 4:06:00', '25/07/2011 4:07:00', '25/07/2011 4:08:00', '25/07/2011 4:09:00', '25/07/2011 4:10:00', '25/07/2011 4:11:00', '25/07/2011 4:12:00', '25/07/2011 4:13:00', '25/07/2011 4:14:00', '25/07/2011 4:15:00', '25/07/2011 4:16:00', '25/07/2011 4:17:00', '25/07/2011 4:18:00', '25/07/2011 4:19:00', '25/07/2011 4:20:00', '25/07/2011 4:21:00', '25/07/2011 4:22:00', '25/07/2011 4:23:00', '25/07/2011 4:24:00', 
'25/07/2011 4:25:00', '25/07/2011 4:26:00', '25/07/2011 4:27:00', '25/07/2011 4:28:00', '25/07/2011 4:29:00', '25/07/2011 4:30:00', '25/07/2011 4:31:00', '25/07/2011 4:32:00', '25/07/2011 4:33:00', '25/07/2011 4:34:00', '25/07/2011 4:35:00', '25/07/2011 4:36:00', '25/07/2011 4:37:00', '25/07/2011 4:38:00', '25/07/2011 4:39:00', '25/07/2011 4:40:00', '25/07/2011 4:41:00', '25/07/2011 4:42:00', '25/07/2011 4:43:00', '25/07/2011 4:44:00', '25/07/2011 4:45:00', '25/07/2011 4:46:00', '25/07/2011 4:47:00', '25/07/2011 4:48:00', '25/07/2011 4:49:00', '25/07/2011 4:50:00', '25/07/2011 4:51:00', '25/07/2011 4:52:00', '25/07/2011 4:53:00', '25/07/2011 4:54:00', '25/07/2011 4:55:00', '25/07/2011 4:56:00', '25/07/2011 4:57:00', '25/07/2011 4:58:00', '25/07/2011 4:59:00', '25/07/2011 5:00:00', '25/07/2011 5:01:00', '25/07/2011 5:02:00', '25/07/2011 5:03:00', '25/07/2011 5:04:00', '25/07/2011 5:05:00', '25/07/2011 5:06:00', '25/07/2011 5:07:00', '25/07/2011 5:08:00', '25/07/2011 5:09:00', '25/07/2011 5:10:00', '25/07/2011 5:11:00', '25/07/2011 5:12:00', '25/07/2011 5:13:00', '25/07/2011 5:14:00', '25/07/2011 5:15:00', '25/07/2011 5:16:00', '25/07/2011 5:17:00', '25/07/2011 5:18:00', '25/07/2011 5:19:00', '25/07/2011 5:20:00', '25/07/2011 5:21:00', '25/07/2011 5:22:00', '25/07/2011 5:23:00', '25/07/2011 5:24:00', '25/07/2011 5:25:00', '25/07/2011 5:26:00', '25/07/2011 5:27:00', '25/07/2011 5:28:00', '25/07/2011 5:29:00', '25/07/2011 5:30:00', '25/07/2011 5:31:00', '25/07/2011 5:32:00', '25/07/2011 5:33:00', '25/07/2011 5:34:00', '25/07/2011 5:35:00', '25/07/2011 5:36:00', '25/07/2011 5:37:00', '25/07/2011 5:38:00', '25/07/2011 5:39:00', '25/07/2011 5:40:00', '25/07/2011 5:41:00', '25/07/2011 5:42:00', '25/07/2011 5:43:00', '25/07/2011 5:44:00', '25/07/2011 5:45:00', '25/07/2011 5:46:00', '25/07/2011 5:47:00', '25/07/2011 5:48:00', '25/07/2011 5:49:00', '25/07/2011 5:50:00', '25/07/2011 5:51:00', '25/07/2011 5:52:00', '25/07/2011 5:53:00', '25/07/2011 5:54:00', '25/07/2011 
5:55:00', '25/07/2011 5:56:00', '25/07/2011 5:57:00', '25/07/2011 5:58:00', '25/07/2011 5:59:00', '25/07/2011 6:00:00', '25/07/2011 6:01:00', '25/07/2011 6:02:00', '25/07/2011 6:03:00', '25/07/2011 6:04:00', '25/07/2011 6:05:00', '25/07/2011 6:06:00', '25/07/2011 6:07:00', '25/07/2011 6:08:00', '25/07/2011 6:09:00', '25/07/2011 6:10:00', '25/07/2011 6:11:00', '25/07/2011 6:12:00', '25/07/2011 6:13:00', '25/07/2011 6:14:00', '25/07/2011 6:15:00', '25/07/2011 6:16:00', '25/07/2011 6:17:00', '25/07/2011 6:18:00', '25/07/2011 6:19:00', '25/07/2011 6:20:00', '25/07/2011 6:21:00', '25/07/2011 6:22:00', '25/07/2011 6:23:00', '25/07/2011 6:24:00', '25/07/2011 6:25:00', '25/07/2011 6:26:00', '25/07/2011 6:27:00', '25/07/2011 6:28:00', '25/07/2011 6:29:00', '25/07/2011 6:30:00', '25/07/2011 6:31:00', '25/07/2011 6:32:00', '25/07/2011 6:33:00', '25/07/2011 6:34:00', '25/07/2011 6:35:00', '25/07/2011 6:36:00', '25/07/2011 6:37:00', '25/07/2011 6:38:00', '25/07/2011 6:39:00', '25/07/2011 6:40:00', '25/07/2011 6:41:00', '25/07/2011 6:42:00', '25/07/2011 6:43:00', '25/07/2011 6:44:00', '25/07/2011 6:45:00', '25/07/2011 6:46:00', '25/07/2011 6:47:00', '25/07/2011 6:48:00', '25/07/2011 6:49:00', '25/07/2011 6:50:00', '25/07/2011 6:51:00', '25/07/2011 6:52:00', '25/07/2011 6:53:00', '25/07/2011 6:54:00', '25/07/2011 6:55:00', '25/07/2011 6:56:00', '25/07/2011 6:57:00', '25/07/2011 6:58:00', '25/07/2011 6:59:00', '25/07/2011 7:00:00', '25/07/2011 7:01:00', '25/07/2011 7:02:00', '25/07/2011 7:03:00', '25/07/2011 7:04:00', '25/07/2011 7:05:00', '25/07/2011 7:06:00', '25/07/2011 7:07:00', '25/07/2011 7:08:00', '25/07/2011 7:09:00', '25/07/2011 7:10:00', '25/07/2011 7:11:00', '25/07/2011 7:12:00', '25/07/2011 7:13:00', '25/07/2011 7:14:00', '25/07/2011 7:15:00', '25/07/2011 7:16:00', '25/07/2011 7:17:00', '25/07/2011 7:18:00', '25/07/2011 7:19:00', '25/07/2011 7:20:00', '25/07/2011 7:21:00', '25/07/2011 7:22:00', '25/07/2011 7:23:00', '25/07/2011 7:24:00', '25/07/2011 7:25:00', 
'25/07/2011 7:26:00', '25/07/2011 7:27:00', '25/07/2011 7:28:00', '25/07/2011 7:29:00', '25/07/2011 7:30:00', '25/07/2011 7:31:00', '25/07/2011 7:32:00', '25/07/2011 7:33:00', '25/07/2011 7:34:00', '25/07/2011 7:35:00', '25/07/2011 7:36:00', '25/07/2011 7:37:00', '25/07/2011 7:38:00', '25/07/2011 7:39:00', '25/07/2011 7:40:00', '25/07/2011 7:41:00', '25/07/2011 7:42:00', '25/07/2011 7:43:00', '25/07/2011 7:44:00', '25/07/2011 7:45:00', '25/07/2011 7:46:00', '25/07/2011 7:47:00', '25/07/2011 7:48:00', '25/07/2011 7:49:00', '25/07/2011 7:50:00', '25/07/2011 7:51:00', '25/07/2011 7:52:00', '25/07/2011 7:53:00', '25/07/2011 7:54:00', '25/07/2011 7:55:00', '25/07/2011 7:56:00', '25/07/2011 7:57:00', '25/07/2011 7:58:00', '25/07/2011 7:59:00', '25/07/2011 8:00:00', '25/07/2011 8:01:00', '25/07/2011 8:02:00', '25/07/2011 8:03:00', '25/07/2011 8:04:00', '25/07/2011 8:05:00', '25/07/2011 8:06:00', '25/07/2011 8:07:00', '25/07/2011 8:08:00', '25/07/2011 8:09:00', '25/07/2011 8:10:00', '25/07/2011 8:11:00', '25/07/2011 8:12:00', '25/07/2011 8:13:00', '25/07/2011 8:14:00', '25/07/2011 8:15:00', '25/07/2011 8:16:00', '25/07/2011 8:17:00', '25/07/2011 8:18:00', '25/07/2011 8:19:00', '25/07/2011 8:20:00', '25/07/2011 8:21:00', '25/07/2011 8:22:00', '25/07/2011 8:23:00', '25/07/2011 8:24:00', '25/07/2011 8:25:00', '25/07/2011 8:26:00', '25/07/2011 8:27:00', '25/07/2011 8:28:00', '25/07/2011 8:29:00', '25/07/2011 8:30:00', '25/07/2011 8:31:00', '25/07/2011 8:32:00', '25/07/2011 8:33:00', '25/07/2011 8:34:00', '25/07/2011 8:35:00', '25/07/2011 8:36:00', '25/07/2011 8:37:00', '25/07/2011 8:38:00', '25/07/2011 8:39:00', '25/07/2011 8:40:00', '25/07/2011 8:41:00', '25/07/2011 8:42:00', '25/07/2011 8:43:00', '25/07/2011 8:44:00', '25/07/2011 8:45:00', '25/07/2011 8:46:00', '25/07/2011 8:47:00', '25/07/2011 8:48:00', '25/07/2011 8:49:00', '25/07/2011 8:50:00', '25/07/2011 8:51:00', '25/07/2011 8:52:00', '25/07/2011 8:53:00', '25/07/2011 8:54:00', '25/07/2011 8:55:00', '25/07/2011 
8:56:00', '25/07/2011 8:57:00', '25/07/2011 8:58:00', '25/07/2011 8:59:00', '25/07/2011 9:00:00', '25/07/2011 9:01:00', '25/07/2011 9:02:00', '25/07/2011 9:03:00', '25/07/2011 9:04:00', '25/07/2011 9:05:00', '25/07/2011 9:06:00', '25/07/2011 9:07:00', '25/07/2011 9:08:00', '25/07/2011 9:09:00', '25/07/2011 9:10:00', '25/07/2011 9:11:00', '25/07/2011 9:12:00', '25/07/2011 9:13:00', '25/07/2011 9:14:00', '25/07/2011 9:15:00', '25/07/2011 9:16:00', '25/07/2011 9:17:00', '25/07/2011 9:18:00', '25/07/2011 9:19:00', '25/07/2011 9:20:00', '25/07/2011 9:21:00', '25/07/2011 9:22:00', '25/07/2011 9:23:00', '25/07/2011 9:24:00', '25/07/2011 9:25:00', '25/07/2011 9:26:00', '25/07/2011 9:27:00', '25/07/2011 9:28:00', '25/07/2011 9:29:00', '25/07/2011 9:30:00', '25/07/2011 9:31:00', '25/07/2011 9:32:00', '25/07/2011 9:33:00', '25/07/2011 9:34:00', '25/07/2011 9:35:00', '25/07/2011 9:36:00', '25/07/2011 9:37:00', '25/07/2011 9:38:00', '25/07/2011 9:39:00', '25/07/2011 9:40:00', '25/07/2011 9:41:00', '25/07/2011 9:42:00', '25/07/2011 9:43:00', '25/07/2011 9:44:00', '25/07/2011 9:45:00', '25/07/2011 9:46:00', '25/07/2011 9:47:00', '25/07/2011 9:48:00', '25/07/2011 9:49:00', '25/07/2011 9:50:00', '25/07/2011 9:51:00', '25/07/2011 9:52:00', '25/07/2011 9:53:00', '25/07/2011 9:54:00', '25/07/2011 9:55:00', '25/07/2011 9:56:00', '25/07/2011 9:57:00', '25/07/2011 9:58:00', '25/07/2011 9:59:00', '25/07/2011 10:00:00', '25/07/2011 10:01:00', '25/07/2011 10:02:00', '25/07/2011 10:03:00', '25/07/2011 10:04:00', '25/07/2011 10:05:00', '25/07/2011 10:06:00', '25/07/2011 10:07:00', '25/07/2011 10:08:00', '25/07/2011 10:09:00', '25/07/2011 10:10:00', '25/07/2011 10:11:00', '25/07/2011 10:12:00', '25/07/2011 10:13:00', '25/07/2011 10:14:00', '25/07/2011 10:15:00', '25/07/2011 10:16:00', '25/07/2011 10:17:00', '25/07/2011 10:18:00', '25/07/2011 10:19:00', '25/07/2011 10:20:00', '25/07/2011 10:21:00', '25/07/2011 10:22:00', '25/07/2011 10:23:00', '25/07/2011 10:24:00', '25/07/2011 10:25:00', 
'25/07/2011 10:26:00', '25/07/2011 10:27:00', '25/07/2011 10:28:00', '25/07/2011 10:29:00', '25/07/2011 10:30:00', '25/07/2011 10:31:00', '25/07/2011 10:32:00', '25/07/2011 10:33:00', '25/07/2011 10:34:00', '25/07/2011 10:35:00', '25/07/2011 10:36:00', '25/07/2011 10:37:00', '25/07/2011 10:38:00', '25/07/2011 10:39:00', '25/07/2011 10:40:00', '25/07/2011 10:41:00', '25/07/2011 10:42:00', '25/07/2011 10:43:00', '25/07/2011 10:44:00', '25/07/2011 10:45:00', '25/07/2011 10:46:00', '25/07/2011 10:47:00', '25/07/2011 10:48:00', '25/07/2011 10:49:00', '25/07/2011 10:50:00', '25/07/2011 10:51:00', '25/07/2011 10:52:00', '25/07/2011 10:53:00', '25/07/2011 10:54:00', '25/07/2011 10:55:00', '25/07/2011 10:56:00', '25/07/2011 10:57:00', '25/07/2011 10:58:00', '25/07/2011 10:59:00', '25/07/2011 11:00:00', '25/07/2011 11:01:00', '25/07/2011 11:02:00', '25/07/2011 11:03:00', '25/07/2011 11:04:00', '25/07/2011 11:05:00', '25/07/2011 11:06:00', '25/07/2011 11:07:00', '25/07/2011 11:08:00', '25/07/2011 11:09:00', '25/07/2011 11:10:00', '25/07/2011 11:11:00', '25/07/2011 11:12:00', '25/07/2011 11:13:00', '25/07/2011 11:14:00', '25/07/2011 11:15:00', '25/07/2011 11:16:00', '25/07/2011 11:17:00', '25/07/2011 11:18:00', '25/07/2011 11:19:00', '25/07/2011 11:20:00', '25/07/2011 11:21:00', '25/07/2011 11:22:00', '25/07/2011 11:23:00', '25/07/2011 11:24:00', '25/07/2011 11:25:00', '25/07/2011 11:26:00', '25/07/2011 11:27:00', '25/07/2011 11:28:00', '25/07/2011 11:29:00', '25/07/2011 11:30:00', '25/07/2011 11:31:00', '25/07/2011 11:32:00', '25/07/2011 11:33:00', '25/07/2011 11:34:00', '25/07/2011 11:35:00', '25/07/2011 11:36:00', '25/07/2011 11:37:00', '25/07/2011 11:38:00', '25/07/2011 11:39:00', '25/07/2011 11:40:00', '25/07/2011 11:41:00', '25/07/2011 11:42:00', '25/07/2011 11:43:00', '25/07/2011 11:44:00', '25/07/2011 11:45:00', '25/07/2011 11:46:00', '25/07/2011 11:47:00', '25/07/2011 11:48:00', '25/07/2011 11:49:00', '25/07/2011 11:50:00', '25/07/2011 11:51:00', '25/07/2011 
11:52:00', '25/07/2011 11:53:00', '25/07/2011 11:54:00', '25/07/2011 11:55:00', '25/07/2011 11:56:00', '25/07/2011 11:57:00', '25/07/2011 11:58:00', '25/07/2011 11:59:00', '25/07/2011 12:00:00', '25/07/2011 12:01:00', '25/07/2011 12:02:00', '25/07/2011 12:03:00', '25/07/2011 12:04:00', '25/07/2011 12:05:00', '25/07/2011 12:06:00', '25/07/2011 12:07:00', '25/07/2011 12:08:00', '25/07/2011 12:09:00', '25/07/2011 12:10:00', '25/07/2011 12:11:00', '25/07/2011 12:12:00', '25/07/2011 12:13:00', '25/07/2011 12:14:00', '25/07/2011 12:15:00', '25/07/2011 12:16:00', '25/07/2011 12:17:00', '25/07/2011 12:18:00', '25/07/2011 12:19:00', '25/07/2011 12:20:00', '25/07/2011 12:21:00', '25/07/2011 12:22:00', '25/07/2011 12:23:00', '25/07/2011 12:24:00', '25/07/2011 12:25:00', '25/07/2011 12:26:00', '25/07/2011 12:27:00', '25/07/2011 12:28:00', '25/07/2011 12:29:00', '25/07/2011 12:30:00', '25/07/2011 12:31:00', '25/07/2011 12:32:00', '25/07/2011 12:33:00', '25/07/2011 12:34:00', '25/07/2011 12:35:00', '25/07/2011 12:36:00', '25/07/2011 12:37:00', '25/07/2011 12:38:00', '25/07/2011 12:39:00', '25/07/2011 12:40:00', '25/07/2011 12:41:00', '25/07/2011 12:42:00', '25/07/2011 12:43:00', '25/07/2011 12:44:00', '25/07/2011 12:45:00', '25/07/2011 12:46:00', '25/07/2011 12:47:00', '25/07/2011 12:48:00', '25/07/2011 12:49:00', '25/07/2011 12:50:00', '25/07/2011 12:51:00', '25/07/2011 12:52:00', '25/07/2011 12:53:00', '25/07/2011 12:54:00', '25/07/2011 12:55:00', '25/07/2011 12:56:00', '25/07/2011 12:57:00', '25/07/2011 12:58:00', '25/07/2011 12:59:00', '25/07/2011 13:00:00', '25/07/2011 13:01:00', '25/07/2011 13:02:00', '25/07/2011 13:03:00', '25/07/2011 13:04:00', '25/07/2011 13:05:00', '25/07/2011 13:06:00', '25/07/2011 13:07:00', '25/07/2011 13:08:00', '25/07/2011 13:09:00', '25/07/2011 13:10:00', '25/07/2011 13:11:00', '25/07/2011 13:12:00', '25/07/2011 13:13:00', '25/07/2011 13:14:00', '25/07/2011 13:15:00', '25/07/2011 13:16:00', '25/07/2011 13:17:00', '25/07/2011 13:18:00', 
'25/07/2011 13:19:00', '25/07/2011 13:20:00', '25/07/2011 13:21:00', '25/07/2011 13:22:00', '25/07/2011 13:23:00', '25/07/2011 13:24:00', '25/07/2011 13:25:00', '25/07/2011 13:26:00', '25/07/2011 13:27:00', '25/07/2011 13:28:00', '25/07/2011 13:29:00', '25/07/2011 13:30:00', '25/07/2011 13:31:00', '25/07/2011 13:32:00', '25/07/2011 13:33:00', '25/07/2011 13:34:00', '25/07/2011 13:35:00', '25/07/2011 13:36:00', '25/07/2011 13:37:00', '25/07/2011 13:38:00', '25/07/2011 13:39:00', '25/07/2011 13:40:00', '25/07/2011 13:41:00', '25/07/2011 13:42:00', '25/07/2011 13:43:00', '25/07/2011 13:44:00', '25/07/2011 13:45:00', '25/07/2011 13:46:00', '25/07/2011 13:47:00', '25/07/2011 13:48:00', '25/07/2011 13:49:00', '25/07/2011 13:50:00', '25/07/2011 13:51:00', '25/07/2011 13:52:00', '25/07/2011 13:53:00', '25/07/2011 13:54:00', '25/07/2011 13:55:00', '25/07/2011 13:56:00', '25/07/2011 13:57:00', '25/07/2011 13:58:00', '25/07/2011 13:59:00', '25/07/2011 14:00:00', '25/07/2011 14:01:00', '25/07/2011 14:02:00', '25/07/2011 14:03:00', '25/07/2011 14:04:00', '25/07/2011 14:05:00', '25/07/2011 14:06:00', '25/07/2011 14:07:00', '25/07/2011 14:08:00', '25/07/2011 14:09:00', '25/07/2011 14:10:00', '25/07/2011 14:11:00', '25/07/2011 14:12:00', '25/07/2011 14:13:00', '25/07/2011 14:14:00', '25/07/2011 14:15:00', '25/07/2011 14:16:00', '25/07/2011 14:17:00', '25/07/2011 14:18:00', '25/07/2011 14:19:00', '25/07/2011 14:20:00', '25/07/2011 14:21:00', '25/07/2011 14:22:00', '25/07/2011 14:23:00', '25/07/2011 14:24:00', '25/07/2011 14:25:00', '25/07/2011 14:26:00', '25/07/2011 14:27:00', '25/07/2011 14:28:00', '25/07/2011 14:29:00', '25/07/2011 14:30:00', '25/07/2011 14:31:00', '25/07/2011 14:32:00', '25/07/2011 14:33:00', '25/07/2011 14:34:00', '25/07/2011 14:35:00', '25/07/2011 14:36:00', '25/07/2011 14:37:00', '25/07/2011 14:38:00', '25/07/2011 14:39:00', '25/07/2011 14:40:00', '25/07/2011 14:41:00', '25/07/2011 14:42:00', '25/07/2011 14:43:00', '25/07/2011 14:44:00', '25/07/2011 
14:45:00', '25/07/2011 14:46:00', '25/07/2011 14:47:00', '25/07/2011 14:48:00', '25/07/2011 14:49:00', '25/07/2011 14:50:00', '25/07/2011 14:51:00', '25/07/2011 14:52:00', '25/07/2011 14:53:00', '25/07/2011 14:54:00', '25/07/2011 14:55:00', '25/07/2011 14:56:00', '25/07/2011 14:57:00', '25/07/2011 14:58:00', '25/07/2011 14:59:00', '25/07/2011 15:00:00', '25/07/2011 15:01:00', '25/07/2011 15:02:00', '25/07/2011 15:03:00', '25/07/2011 15:04:00', '25/07/2011 15:05:00', '25/07/2011 15:06:00', '25/07/2011 15:07:00', '25/07/2011 15:08:00', '25/07/2011 15:09:00', '25/07/2011 15:10:00', '25/07/2011 15:11:00', '25/07/2011 15:12:00', '25/07/2011 15:13:00', '25/07/2011 15:14:00', '25/07/2011 15:15:00', '25/07/2011 15:16:00', '25/07/2011 15:17:00', '25/07/2011 15:18:00', '25/07/2011 15:19:00', '25/07/2011 15:20:00', '25/07/2011 15:21:00', '25/07/2011 15:22:00', '25/07/2011 15:23:00', '25/07/2011 15:24:00', '25/07/2011 15:25:00', '25/07/2011 15:26:00', '25/07/2011 15:27:00', '25/07/2011 15:28:00', '25/07/2011 15:29:00', '25/07/2011 15:30:00', '25/07/2011 15:31:00', '25/07/2011 15:32:00', '25/07/2011 15:33:00', '25/07/2011 15:34:00', '25/07/2011 15:35:00', '25/07/2011 15:36:00', '25/07/2011 15:37:00', '25/07/2011 15:38:00', '25/07/2011 15:39:00', '25/07/2011 15:40:00', '25/07/2011 15:41:00', '25/07/2011 15:42:00', '25/07/2011 15:43:00', '25/07/2011 15:44:00', '25/07/2011 15:45:00', '25/07/2011 15:46:00', '25/07/2011 15:47:00', '25/07/2011 15:48:00', '25/07/2011 15:49:00', '25/07/2011 15:50:00', '25/07/2011 15:51:00', '25/07/2011 15:52:00', '25/07/2011 15:53:00', '25/07/2011 15:54:00', '25/07/2011 15:55:00', '25/07/2011 15:56:00', '25/07/2011 15:57:00', '25/07/2011 15:58:00', '25/07/2011 15:59:00', '25/07/2011 16:00:00', '25/07/2011 16:01:00', '25/07/2011 16:02:00', '25/07/2011 16:03:00', '25/07/2011 16:04:00', '25/07/2011 16:05:00', '25/07/2011 16:06:00', '25/07/2011 16:07:00', '25/07/2011 16:08:00', '25/07/2011 16:09:00', '25/07/2011 16:10:00', '25/07/2011 16:11:00', 
'25/07/2011 16:12:00', '25/07/2011 16:13:00', '25/07/2011 16:14:00', '25/07/2011 16:15:00', '25/07/2011 16:16:00', '25/07/2011 16:17:00', '25/07/2011 16:18:00', '25/07/2011 16:19:00', '25/07/2011 16:20:00', '25/07/2011 16:21:00', '25/07/2011 16:22:00', '25/07/2011 16:23:00', '25/07/2011 16:24:00', '25/07/2011 16:25:00', '25/07/2011 16:26:00', '25/07/2011 16:27:00', '25/07/2011 16:28:00', '25/07/2011 16:29:00', '25/07/2011 16:30:00', '25/07/2011 16:31:00', '25/07/2011 16:32:00', '25/07/2011 16:33:00', '25/07/2011 16:34:00', '25/07/2011 16:35:00', '25/07/2011 16:36:00', '25/07/2011 16:37:00', '25/07/2011 16:38:00', '25/07/2011 16:39:00', '25/07/2011 16:40:00', '25/07/2011 16:41:00', '25/07/2011 16:42:00', '25/07/2011 16:43:00', '25/07/2011 16:44:00', '25/07/2011 16:45:00', '25/07/2011 16:46:00', '25/07/2011 16:47:00', '25/07/2011 16:48:00', '25/07/2011 16:49:00', '25/07/2011 16:50:00', '25/07/2011 16:51:00', '25/07/2011 16:52:00', '25/07/2011 16:53:00', '25/07/2011 16:54:00', '25/07/2011 16:55:00', '25/07/2011 16:56:00', '25/07/2011 16:57:00', '25/07/2011 16:58:00', '25/07/2011 16:59:00', '25/07/2011 17:00:00', '25/07/2011 17:01:00', '25/07/2011 17:02:00', '25/07/2011 17:03:00', '25/07/2011 17:04:00', '25/07/2011 17:05:00', '25/07/2011 17:06:00', '25/07/2011 17:07:00', '25/07/2011 17:08:00', '25/07/2011 17:09:00', '25/07/2011 17:10:00', '25/07/2011 17:11:00', '25/07/2011 17:12:00', '25/07/2011 17:13:00', '25/07/2011 17:14:00', '25/07/2011 17:15:00', '25/07/2011 17:16:00', '25/07/2011 17:17:00', '25/07/2011 17:18:00', '25/07/2011 17:19:00', '25/07/2011 17:20:00', '25/07/2011 17:21:00', '25/07/2011 17:22:00', '25/07/2011 17:23:00', '25/07/2011 17:24:00', '25/07/2011 17:25:00', '25/07/2011 17:26:00', '25/07/2011 17:27:00', '25/07/2011 17:28:00', '25/07/2011 17:29:00', '25/07/2011 17:30:00', '25/07/2011 17:31:00', '25/07/2011 17:32:00', '25/07/2011 17:33:00', '25/07/2011 17:34:00', '25/07/2011 17:35:00', '25/07/2011 17:36:00', '25/07/2011 17:37:00', '25/07/2011 
17:38:00', '25/07/2011 17:39:00', '25/07/2011 17:40:00', '25/07/2011 17:41:00', '25/07/2011 17:42:00', '25/07/2011 17:43:00', '25/07/2011 17:44:00', '25/07/2011 17:45:00', '25/07/2011 17:46:00', '25/07/2011 17:47:00', '25/07/2011 17:48:00', '25/07/2011 17:49:00', '25/07/2011 17:50:00', '25/07/2011 17:51:00', '25/07/2011 17:52:00', '25/07/2011 17:53:00', '25/07/2011 17:54:00', '25/07/2011 17:55:00', '25/07/2011 17:56:00', '25/07/2011 17:57:00', '25/07/2011 17:58:00', '25/07/2011 17:59:00', '25/07/2011 18:00:00', '25/07/2011 18:01:00', '25/07/2011 18:02:00', '25/07/2011 18:03:00', '25/07/2011 18:04:00', '25/07/2011 18:05:00', '25/07/2011 18:06:00', '25/07/2011 18:07:00', '25/07/2011 18:08:00', '25/07/2011 18:09:00', '25/07/2011 18:10:00', '25/07/2011 18:11:00', '25/07/2011 18:12:00', '25/07/2011 18:13:00', '25/07/2011 18:14:00', '25/07/2011 18:15:00', '25/07/2011 18:16:00', '25/07/2011 18:17:00', '25/07/2011 18:18:00', '25/07/2011 18:19:00', '25/07/2011 18:20:00', '25/07/2011 18:21:00', '25/07/2011 18:22:00', '25/07/2011 18:23:00', '25/07/2011 18:24:00', '25/07/2011 18:25:00', '25/07/2011 18:26:00', '25/07/2011 18:27:00', '25/07/2011 18:28:00', '25/07/2011 18:29:00', '25/07/2011 18:30:00', '25/07/2011 18:31:00', '25/07/2011 18:32:00', '25/07/2011 18:33:00', '25/07/2011 18:34:00', '25/07/2011 18:35:00', '25/07/2011 18:36:00', '25/07/2011 18:37:00', '25/07/2011 18:38:00', '25/07/2011 18:39:00', '25/07/2011 18:40:00', '25/07/2011 18:41:00', '25/07/2011 18:42:00', '25/07/2011 18:43:00', '25/07/2011 18:44:00', '25/07/2011 18:45:00', '25/07/2011 18:46:00', '25/07/2011 18:47:00', '25/07/2011 18:48:00', '25/07/2011 18:49:00', '25/07/2011 18:50:00', '25/07/2011 18:51:00', '25/07/2011 18:52:00', '25/07/2011 18:53:00', '25/07/2011 18:54:00', '25/07/2011 18:55:00', '25/07/2011 18:56:00', '25/07/2011 18:57:00', '25/07/2011 18:58:00', '25/07/2011 18:59:00', '25/07/2011 19:00:00', '25/07/2011 19:01:00', '25/07/2011 19:02:00', '25/07/2011 19:03:00', '25/07/2011 19:04:00', 
'25/07/2011 19:05:00', '25/07/2011 19:06:00', '25/07/2011 19:07:00', '25/07/2011 19:08:00', '25/07/2011 19:09:00', '25/07/2011 19:10:00', '25/07/2011 19:11:00', '25/07/2011 19:12:00', '25/07/2011 19:13:00', '25/07/2011 19:14:00', '25/07/2011 19:15:00', '25/07/2011 19:16:00', '25/07/2011 19:17:00', '25/07/2011 19:18:00', '25/07/2011 19:19:00', '25/07/2011 19:20:00', '25/07/2011 19:21:00', '25/07/2011 19:22:00', '25/07/2011 19:23:00', '25/07/2011 19:24:00', '25/07/2011 19:25:00', '25/07/2011 19:26:00', '25/07/2011 19:27:00', '25/07/2011 19:28:00', '25/07/2011 19:29:00', '25/07/2011 19:30:00', '25/07/2011 19:31:00', '25/07/2011 19:32:00', '25/07/2011 19:33:00', '25/07/2011 19:34:00', '25/07/2011 19:35:00', '25/07/2011 19:36:00', '25/07/2011 19:37:00', '25/07/2011 19:38:00', '25/07/2011 19:39:00', '25/07/2011 19:40:00', '25/07/2011 19:41:00', '25/07/2011 19:42:00', '25/07/2011 19:43:00', '25/07/2011 19:44:00', '25/07/2011 19:45:00', '25/07/2011 19:46:00', '25/07/2011 19:47:00', '25/07/2011 19:48:00', '25/07/2011 19:49:00', '25/07/2011 19:50:00', '25/07/2011 19:51:00', '25/07/2011 19:52:00', '25/07/2011 19:53:00', '25/07/2011 19:54:00', '25/07/2011 19:55:00', '25/07/2011 19:56:00', '25/07/2011 19:57:00', '25/07/2011 19:58:00', '25/07/2011 19:59:00', '25/07/2011 20:00:00', '25/07/2011 20:01:00', '25/07/2011 20:02:00', '25/07/2011 20:03:00', '25/07/2011 20:04:00', '25/07/2011 20:05:00', '25/07/2011 20:06:00', '25/07/2011 20:07:00', '25/07/2011 20:08:00', '25/07/2011 20:09:00', '25/07/2011 20:10:00', '25/07/2011 20:11:00', '25/07/2011 20:12:00', '25/07/2011 20:13:00', '25/07/2011 20:14:00', '25/07/2011 20:15:00', '25/07/2011 20:16:00', '25/07/2011 20:17:00', '25/07/2011 20:18:00', '25/07/2011 20:19:00', '25/07/2011 20:20:00', '25/07/2011 20:21:00', '25/07/2011 20:22:00', '25/07/2011 20:23:00', '25/07/2011 20:24:00', '25/07/2011 20:25:00', '25/07/2011 20:26:00', '25/07/2011 20:27:00', '25/07/2011 20:28:00', '25/07/2011 20:29:00', '25/07/2011 20:30:00', '25/07/2011 
20:31:00', '25/07/2011 20:32:00', '25/07/2011 20:33:00', '25/07/2011 20:34:00', '25/07/2011 20:35:00', '25/07/2011 20:36:00', '25/07/2011 20:37:00', '25/07/2011 20:38:00', '25/07/2011 20:39:00', '25/07/2011 20:40:00', '25/07/2011 20:41:00', '25/07/2011 20:42:00', '25/07/2011 20:43:00', '25/07/2011 20:44:00', '25/07/2011 20:45:00', '25/07/2011 20:46:00', '25/07/2011 20:47:00', '25/07/2011 20:48:00', '25/07/2011 20:49:00', '25/07/2011 20:50:00', '25/07/2011 20:51:00', '25/07/2011 20:52:00', '25/07/2011 20:53:00', '25/07/2011 20:54:00', '25/07/2011 20:55:00', '25/07/2011 20:56:00', '25/07/2011 20:57:00', '25/07/2011 20:58:00', '25/07/2011 20:59:00', '25/07/2011 21:00:00', '25/07/2011 21:01:00', '25/07/2011 21:02:00', '25/07/2011 21:03:00', '25/07/2011 21:04:00', '25/07/2011 21:05:00', '25/07/2011 21:06:00', '25/07/2011 21:07:00', '25/07/2011 21:08:00', '25/07/2011 21:09:00', '25/07/2011 21:10:00', '25/07/2011 21:11:00', '25/07/2011 21:12:00', '25/07/2011 21:13:00', '25/07/2011 21:14:00', '25/07/2011 21:15:00', '25/07/2011 21:16:00', '25/07/2011 21:17:00', '25/07/2011 21:18:00', '25/07/2011 21:19:00', '25/07/2011 21:20:00', '25/07/2011 21:21:00', '25/07/2011 21:22:00', '25/07/2011 21:23:00', '25/07/2011 21:24:00', '25/07/2011 21:25:00', '25/07/2011 21:26:00', '25/07/2011 21:27:00', '25/07/2011 21:28:00', '25/07/2011 21:29:00', '25/07/2011 21:30:00', '25/07/2011 21:31:00', '25/07/2011 21:32:00', '25/07/2011 21:33:00', '25/07/2011 21:34:00', '25/07/2011 21:35:00', '25/07/2011 21:36:00', '25/07/2011 21:37:00', '25/07/2011 21:38:00', '25/07/2011 21:39:00', '25/07/2011 21:40:00', '25/07/2011 21:41:00', '25/07/2011 21:42:00', '25/07/2011 21:43:00', '25/07/2011 21:44:00', '25/07/2011 21:45:00', '25/07/2011 21:46:00', '25/07/2011 21:47:00', '25/07/2011 21:48:00', '25/07/2011 21:49:00', '25/07/2011 21:50:00', '25/07/2011 21:51:00', '25/07/2011 21:52:00', '25/07/2011 21:53:00', '25/07/2011 21:54:00', '25/07/2011 21:55:00', '25/07/2011 21:56:00', '25/07/2011 21:57:00', 
'25/07/2011 21:58:00', '25/07/2011 21:59:00', '25/07/2011 22:00:00', '25/07/2011 22:01:00', '25/07/2011 22:02:00', '25/07/2011 22:03:00', '25/07/2011 22:04:00', '25/07/2011 22:05:00', '25/07/2011 22:06:00', '25/07/2011 22:07:00', '25/07/2011 22:08:00', '25/07/2011 22:09:00', '25/07/2011 22:10:00', '25/07/2011 22:11:00', '25/07/2011 22:12:00', '25/07/2011 22:13:00', '25/07/2011 22:14:00', '25/07/2011 22:15:00', '25/07/2011 22:16:00', '25/07/2011 22:17:00', '25/07/2011 22:18:00', '25/07/2011 22:19:00', '25/07/2011 22:20:00', '25/07/2011 22:21:00', '25/07/2011 22:22:00', '25/07/2011 22:23:00', '25/07/2011 22:24:00', '25/07/2011 22:25:00', '25/07/2011 22:26:00', '25/07/2011 22:27:00', '25/07/2011 22:28:00', '25/07/2011 22:29:00', '25/07/2011 22:30:00', '25/07/2011 22:31:00', '25/07/2011 22:32:00', '25/07/2011 22:33:00', '25/07/2011 22:34:00', '25/07/2011 22:35:00', '25/07/2011 22:36:00', '25/07/2011 22:37:00', '25/07/2011 22:38:00', '25/07/2011 22:39:00', '25/07/2011 22:40:00', '25/07/2011 22:41:00', '25/07/2011 22:42:00', '25/07/2011 22:43:00', '25/07/2011 22:44:00', '25/07/2011 22:45:00', '25/07/2011 22:46:00', '25/07/2011 22:47:00', '25/07/2011 22:48:00', '25/07/2011 22:49:00', '25/07/2011 22:50:00', '25/07/2011 22:51:00', '25/07/2011 22:52:00', '25/07/2011 22:53:00', '25/07/2011 22:54:00', '25/07/2011 22:55:00', '25/07/2011 22:56:00', '25/07/2011 22:57:00', '25/07/2011 22:58:00', '25/07/2011 22:59:00', '25/07/2011 23:00:00', '25/07/2011 23:01:00', '25/07/2011 23:02:00', '25/07/2011 23:03:00', '25/07/2011 23:04:00', '25/07/2011 23:05:00', '25/07/2011 23:06:00', '25/07/2011 23:07:00', '25/07/2011 23:08:00', '25/07/2011 23:09:00', '25/07/2011 23:10:00', '25/07/2011 23:11:00', '25/07/2011 23:12:00', '25/07/2011 23:13:00', '25/07/2011 23:14:00', '25/07/2011 23:15:00', '25/07/2011 23:16:00', '25/07/2011 23:17:00', '25/07/2011 23:18:00', '25/07/2011 23:19:00', '25/07/2011 23:20:00', '25/07/2011 23:21:00', '25/07/2011 23:22:00', '25/07/2011 23:23:00', '25/07/2011 
23:24:00', '25/07/2011 23:25:00', '25/07/2011 23:26:00', '25/07/2011 23:27:00', '25/07/2011 23:28:00', '25/07/2011 23:29:00', '25/07/2011 23:30:00', '25/07/2011 23:31:00', '25/07/2011 23:32:00', '25/07/2011 23:33:00', '25/07/2011 23:34:00', '25/07/2011 23:35:00', '25/07/2011 23:36:00', '25/07/2011 23:37:00', '25/07/2011 23:38:00', '25/07/2011 23:39:00', '25/07/2011 23:40:00', '25/07/2011 23:41:00', '25/07/2011 23:42:00', '25/07/2011 23:43:00', '25/07/2011 23:44:00', '25/07/2011 23:45:00', '25/07/2011 23:46:00', '25/07/2011 23:47:00', '25/07/2011 23:48:00', '25/07/2011 23:49:00', '25/07/2011 23:50:00', '25/07/2011 23:51:00', '25/07/2011 23:52:00', '25/07/2011 23:53:00', '25/07/2011 23:54:00', '25/07/2011 23:55:00', '25/07/2011 23:56:00', '25/07/2011 23:57:00', '25/07/2011 23:58:00', '25/07/2011 23:59:00', '26/07/2011 0:00:00', '26/07/2011 0:01:00', '26/07/2011 0:02:00', '26/07/2011 0:03:00', '26/07/2011 0:04:00', '26/07/2011 0:05:00', '26/07/2011 0:06:00', '26/07/2011 0:07:00', '26/07/2011 0:08:00', '26/07/2011 0:09:00', '26/07/2011 0:10:00', '26/07/2011 0:11:00', '26/07/2011 0:12:00', '26/07/2011 0:13:00', '26/07/2011 0:14:00', '26/07/2011 0:15:00', '26/07/2011 0:16:00', '26/07/2011 0:17:00', '26/07/2011 0:18:00', '26/07/2011 0:19:00', '26/07/2011 0:20:00', '26/07/2011 0:21:00', '26/07/2011 0:22:00', '26/07/2011 0:23:00', '26/07/2011 0:24:00', '26/07/2011 0:25:00', '26/07/2011 0:26:00', '26/07/2011 0:27:00', '26/07/2011 0:28:00', '26/07/2011 0:29:00', '26/07/2011 0:30:00', '26/07/2011 0:31:00', '26/07/2011 0:32:00', '26/07/2011 0:33:00', '26/07/2011 0:34:00', '26/07/2011 0:35:00', '26/07/2011 0:36:00', '26/07/2011 0:37:00', '26/07/2011 0:38:00', '26/07/2011 0:39:00', '26/07/2011 0:40:00', '26/07/2011 0:41:00', '26/07/2011 0:42:00', '26/07/2011 0:43:00', '26/07/2011 0:44:00', '26/07/2011 0:45:00', '26/07/2011 0:46:00', '26/07/2011 0:47:00', '26/07/2011 0:48:00', '26/07/2011 0:49:00', '26/07/2011 0:50:00', '26/07/2011 0:51:00', '26/07/2011 0:52:00', '26/07/2011 
0:53:00', '26/07/2011 0:54:00', '26/07/2011 0:55:00', '26/07/2011 0:56:00', '26/07/2011 0:57:00', '26/07/2011 0:58:00', '26/07/2011 0:59:00', '26/07/2011 1:00:00', '26/07/2011 1:01:00', '26/07/2011 1:02:00', '26/07/2011 1:03:00', '26/07/2011 1:04:00', '26/07/2011 1:05:00', '26/07/2011 1:06:00', '26/07/2011 1:07:00', '26/07/2011 1:08:00', '26/07/2011 1:09:00', '26/07/2011 1:10:00', '26/07/2011 1:11:00', '26/07/2011 1:12:00', '26/07/2011 1:13:00', '26/07/2011 1:14:00', '26/07/2011 1:15:00', '26/07/2011 1:16:00', '26/07/2011 1:17:00', '26/07/2011 1:18:00', '26/07/2011 1:19:00', '26/07/2011 1:20:00', '26/07/2011 1:21:00', '26/07/2011 1:22:00', '26/07/2011 1:23:00', '26/07/2011 1:24:00', '26/07/2011 1:25:00', '26/07/2011 1:26:00', '26/07/2011 1:27:00', '26/07/2011 1:28:00', '26/07/2011 1:29:00', '26/07/2011 1:30:00', '26/07/2011 1:31:00', '26/07/2011 1:32:00', '26/07/2011 1:33:00', '26/07/2011 1:34:00', '26/07/2011 1:35:00', '26/07/2011 1:36:00', '26/07/2011 1:37:00', '26/07/2011 1:38:00', '26/07/2011 1:39:00', '26/07/2011 1:40:00', '26/07/2011 1:41:00', '26/07/2011 1:42:00', '26/07/2011 1:43:00', '26/07/2011 1:44:00', '26/07/2011 1:45:00', '26/07/2011 1:46:00', '26/07/2011 1:47:00', '26/07/2011 1:48:00', '26/07/2011 1:49:00', '26/07/2011 1:50:00', '26/07/2011 1:51:00', '26/07/2011 1:52:00', '26/07/2011 1:53:00', '26/07/2011 1:54:00', '26/07/2011 1:55:00', '26/07/2011 1:56:00', '26/07/2011 1:57:00', '26/07/2011 1:58:00', '26/07/2011 1:59:00', '26/07/2011 2:00:00', '26/07/2011 2:01:00', '26/07/2011 2:02:00', '26/07/2011 2:03:00', '26/07/2011 2:04:00', '26/07/2011 2:05:00', '26/07/2011 2:06:00', '26/07/2011 2:07:00', '26/07/2011 2:08:00', '26/07/2011 2:09:00', '26/07/2011 2:10:00', '26/07/2011 2:11:00', '26/07/2011 2:12:00', '26/07/2011 2:13:00', '26/07/2011 2:14:00', '26/07/2011 2:15:00', '26/07/2011 2:16:00', '26/07/2011 2:17:00', '26/07/2011 2:18:00', '26/07/2011 2:19:00', '26/07/2011 2:20:00', '26/07/2011 2:21:00', '26/07/2011 2:22:00', '26/07/2011 2:23:00', 
'26/07/2011 2:24:00', '26/07/2011 2:25:00', '26/07/2011 2:26:00', '26/07/2011 2:27:00', '26/07/2011 2:28:00', '26/07/2011 2:29:00', '26/07/2011 2:30:00', '26/07/2011 2:31:00', '26/07/2011 2:32:00', '26/07/2011 2:33:00', '26/07/2011 2:34:00', '26/07/2011 2:35:00', '26/07/2011 2:36:00', '26/07/2011 2:37:00', '26/07/2011 2:38:00', '26/07/2011 2:39:00', '26/07/2011 2:40:00', '26/07/2011 2:41:00', '26/07/2011 2:42:00', '26/07/2011 2:43:00', '26/07/2011 2:44:00', '26/07/2011 2:45:00', '26/07/2011 2:46:00', '26/07/2011 2:47:00', '26/07/2011 2:48:00', '26/07/2011 2:49:00', '26/07/2011 2:50:00', '26/07/2011 2:51:00', '26/07/2011 2:52:00', '26/07/2011 2:53:00', '26/07/2011 2:54:00', '26/07/2011 2:55:00', '26/07/2011 2:56:00', '26/07/2011 2:57:00', '26/07/2011 2:58:00', '26/07/2011 2:59:00', '26/07/2011 3:00:00', '26/07/2011 3:01:00', '26/07/2011 3:02:00', '26/07/2011 3:03:00', '26/07/2011 3:04:00', '26/07/2011 3:05:00', '26/07/2011 3:06:00', '26/07/2011 3:07:00', '26/07/2011 3:08:00', '26/07/2011 3:09:00', '26/07/2011 3:10:00', '26/07/2011 3:11:00', '26/07/2011 3:12:00', '26/07/2011 3:13:00', '26/07/2011 3:14:00', '26/07/2011 3:15:00', '26/07/2011 3:16:00', '26/07/2011 3:17:00', '26/07/2011 3:18:00', '26/07/2011 3:19:00', '26/07/2011 3:20:00', '26/07/2011 3:21:00', '26/07/2011 3:22:00', '26/07/2011 3:23:00', '26/07/2011 3:24:00', '26/07/2011 3:25:00', '26/07/2011 3:26:00', '26/07/2011 3:27:00', '26/07/2011 3:28:00', '26/07/2011 3:29:00', '26/07/2011 3:30:00', '26/07/2011 3:31:00', '26/07/2011 3:32:00', '26/07/2011 3:33:00', '26/07/2011 3:34:00', '26/07/2011 3:35:00', '26/07/2011 3:36:00', '26/07/2011 3:37:00', '26/07/2011 3:38:00', '26/07/2011 3:39:00', '26/07/2011 3:40:00', '26/07/2011 3:41:00', '26/07/2011 3:42:00', '26/07/2011 3:43:00', '26/07/2011 3:44:00', '26/07/2011 3:45:00', '26/07/2011 3:46:00', '26/07/2011 3:47:00', '26/07/2011 3:48:00', '26/07/2011 3:49:00', '26/07/2011 3:50:00', '26/07/2011 3:51:00', '26/07/2011 3:52:00', '26/07/2011 3:53:00', '26/07/2011 
3:54:00', '26/07/2011 3:55:00', '26/07/2011 3:56:00', '26/07/2011 3:57:00', '26/07/2011 3:58:00', '26/07/2011 3:59:00', '26/07/2011 4:00:00', '26/07/2011 4:01:00', '26/07/2011 4:02:00', '26/07/2011 4:03:00', '26/07/2011 4:04:00', '26/07/2011 4:05:00', '26/07/2011 4:06:00', '26/07/2011 4:07:00', '26/07/2011 4:08:00', '26/07/2011 4:09:00', '26/07/2011 4:10:00', '26/07/2011 4:11:00', '26/07/2011 4:12:00', '26/07/2011 4:13:00', '26/07/2011 4:14:00', '26/07/2011 4:15:00', '26/07/2011 4:16:00', '26/07/2011 4:17:00', '26/07/2011 4:18:00', '26/07/2011 4:19:00', '26/07/2011 4:20:00', '26/07/2011 4:21:00', '26/07/2011 4:22:00', '26/07/2011 4:23:00', '26/07/2011 4:24:00', '26/07/2011 4:25:00', '26/07/2011 4:26:00', '26/07/2011 4:27:00', '26/07/2011 4:28:00', '26/07/2011 4:29:00', '26/07/2011 4:30:00', '26/07/2011 4:31:00', '26/07/2011 4:32:00', '26/07/2011 4:33:00', '26/07/2011 4:34:00', '26/07/2011 4:35:00', '26/07/2011 4:36:00', '26/07/2011 4:37:00', '26/07/2011 4:38:00', '26/07/2011 4:39:00', '26/07/2011 4:40:00', '26/07/2011 4:41:00', '26/07/2011 4:42:00', '26/07/2011 4:43:00', '26/07/2011 4:44:00', '26/07/2011 4:45:00', '26/07/2011 4:46:00', '26/07/2011 4:47:00', '26/07/2011 4:48:00', '26/07/2011 4:49:00', '26/07/2011 4:50:00', '26/07/2011 4:51:00', '26/07/2011 4:52:00', '26/07/2011 4:53:00', '26/07/2011 4:54:00', '26/07/2011 4:55:00', '26/07/2011 4:56:00', '26/07/2011 4:57:00', '26/07/2011 4:58:00', '26/07/2011 4:59:00', '26/07/2011 5:00:00', '26/07/2011 5:01:00', '26/07/2011 5:02:00', '26/07/2011 5:03:00', '26/07/2011 5:04:00', '26/07/2011 5:05:00', '26/07/2011 5:06:00', '26/07/2011 5:07:00', '26/07/2011 5:08:00', '26/07/2011 5:09:00', '26/07/2011 5:10:00', '26/07/2011 5:11:00', '26/07/2011 5:12:00', '26/07/2011 5:13:00', '26/07/2011 5:14:00', '26/07/2011 5:15:00', '26/07/2011 5:16:00', '26/07/2011 5:17:00', '26/07/2011 5:18:00', '26/07/2011 5:19:00', '26/07/2011 5:20:00', '26/07/2011 5:21:00', '26/07/2011 5:22:00', '26/07/2011 5:23:00', '26/07/2011 5:24:00', 
'26/07/2011 5:25:00', '26/07/2011 5:26:00', '26/07/2011 5:27:00', '26/07/2011 5:28:00', '26/07/2011 5:29:00', '26/07/2011 5:30:00', '26/07/2011 5:31:00', '26/07/2011 5:32:00', '26/07/2011 5:33:00', '26/07/2011 5:34:00', '26/07/2011 5:35:00', '26/07/2011 5:36:00', '26/07/2011 5:37:00', '26/07/2011 5:38:00', '26/07/2011 5:39:00', '26/07/2011 5:40:00', '26/07/2011 5:41:00', '26/07/2011 5:42:00', '26/07/2011 5:43:00', '26/07/2011 5:44:00', '26/07/2011 5:45:00', '26/07/2011 5:46:00', '26/07/2011 5:47:00', '26/07/2011 5:48:00', '26/07/2011 5:49:00', '26/07/2011 5:50:00', '26/07/2011 5:51:00', '26/07/2011 5:52:00', '26/07/2011 5:53:00', '26/07/2011 5:54:00', '26/07/2011 5:55:00', '26/07/2011 5:56:00', '26/07/2011 5:57:00', '26/07/2011 5:58:00', '26/07/2011 5:59:00', '26/07/2011 6:00:00', '26/07/2011 6:01:00', '26/07/2011 6:02:00', '26/07/2011 6:03:00', '26/07/2011 6:04:00', '26/07/2011 6:05:00', '26/07/2011 6:06:00', '26/07/2011 6:07:00', '26/07/2011 6:08:00', '26/07/2011 6:09:00', '26/07/2011 6:10:00', '26/07/2011 6:11:00', '26/07/2011 6:12:00', '26/07/2011 6:13:00', '26/07/2011 6:14:00', '26/07/2011 6:15:00', '26/07/2011 6:16:00', '26/07/2011 6:17:00', '26/07/2011 6:18:00', '26/07/2011 6:19:00', '26/07/2011 6:20:00', '26/07/2011 6:21:00', '26/07/2011 6:22:00', '26/07/2011 6:23:00', '26/07/2011 6:24:00', '26/07/2011 6:25:00', '26/07/2011 6:26:00', '26/07/2011 6:27:00', '26/07/2011 6:28:00', '26/07/2011 6:29:00', '26/07/2011 6:30:00', '26/07/2011 6:31:00', '26/07/2011 6:32:00', '26/07/2011 6:33:00', '26/07/2011 6:34:00', '26/07/2011 6:35:00', '26/07/2011 6:36:00', '26/07/2011 6:37:00', '26/07/2011 6:38:00', '26/07/2011 6:39:00', '26/07/2011 6:40:00', '26/07/2011 6:41:00', '26/07/2011 6:42:00', '26/07/2011 6:43:00', '26/07/2011 6:44:00', '26/07/2011 6:45:00', '26/07/2011 6:46:00', '26/07/2011 6:47:00', '26/07/2011 6:48:00', '26/07/2011 6:49:00', '26/07/2011 6:50:00', '26/07/2011 6:51:00', '26/07/2011 6:52:00', '26/07/2011 6:53:00', '26/07/2011 6:54:00', '26/07/2011 
6:55:00', '26/07/2011 6:56:00', '26/07/2011 6:57:00', '26/07/2011 6:58:00', '26/07/2011 6:59:00', '26/07/2011 7:00:00', '26/07/2011 7:01:00', '26/07/2011 7:02:00', '26/07/2011 7:03:00', '26/07/2011 7:04:00', '26/07/2011 7:05:00', '26/07/2011 7:06:00', '26/07/2011 7:07:00', '26/07/2011 7:08:00', '26/07/2011 7:09:00', '26/07/2011 7:10:00', '26/07/2011 7:11:00', '26/07/2011 7:12:00', '26/07/2011 7:13:00', '26/07/2011 7:14:00', '26/07/2011 7:15:00', '26/07/2011 7:16:00', '26/07/2011 7:17:00', '26/07/2011 7:18:00', '26/07/2011 7:19:00', '26/07/2011 7:20:00', '26/07/2011 7:21:00', '26/07/2011 7:22:00', '26/07/2011 7:23:00', '26/07/2011 7:24:00', '26/07/2011 7:25:00', '26/07/2011 7:26:00', '26/07/2011 7:27:00', '26/07/2011 7:28:00', '26/07/2011 7:29:00', '26/07/2011 7:30:00', '26/07/2011 7:31:00', '26/07/2011 7:32:00', '26/07/2011 7:33:00', '26/07/2011 7:34:00', '26/07/2011 7:35:00', '26/07/2011 7:36:00', '26/07/2011 7:37:00', '26/07/2011 7:38:00', '26/07/2011 7:39:00', '26/07/2011 7:40:00', '26/07/2011 7:41:00', '26/07/2011 7:42:00', '26/07/2011 7:43:00', '26/07/2011 7:44:00', '26/07/2011 7:45:00', '26/07/2011 7:46:00', '26/07/2011 7:47:00', '26/07/2011 7:48:00', '26/07/2011 7:49:00', '26/07/2011 7:50:00', '26/07/2011 7:51:00', '26/07/2011 7:52:00', '26/07/2011 7:53:00', '26/07/2011 7:54:00', '26/07/2011 7:55:00', '26/07/2011 7:56:00', '26/07/2011 7:57:00', '26/07/2011 7:58:00', '26/07/2011 7:59:00', '26/07/2011 8:00:00', '26/07/2011 8:01:00', '26/07/2011 8:02:00', '26/07/2011 8:03:00', '26/07/2011 8:04:00', '26/07/2011 8:05:00', '26/07/2011 8:06:00', '26/07/2011 8:07:00', '26/07/2011 8:08:00', '26/07/2011 8:09:00', '26/07/2011 8:10:00', '26/07/2011 8:11:00', '26/07/2011 8:12:00', '26/07/2011 8:13:00', '26/07/2011 8:14:00', '26/07/2011 8:15:00', '26/07/2011 8:16:00', '26/07/2011 8:17:00', '26/07/2011 8:18:00', '26/07/2011 8:19:00', '26/07/2011 8:20:00', '26/07/2011 8:21:00', '26/07/2011 8:22:00', '26/07/2011 8:23:00', '26/07/2011 8:24:00', '26/07/2011 8:25:00', 
'26/07/2011 8:26:00', '26/07/2011 8:27:00', '26/07/2011 8:28:00', '26/07/2011 8:29:00', '26/07/2011 8:30:00', '26/07/2011 8:31:00', '26/07/2011 8:32:00', '26/07/2011 8:33:00', '26/07/2011 8:34:00', '26/07/2011 8:35:00', '26/07/2011 8:36:00', '26/07/2011 8:37:00', '26/07/2011 8:38:00', '26/07/2011 8:39:00', '26/07/2011 8:40:00', '26/07/2011 8:41:00', '26/07/2011 8:42:00', '26/07/2011 8:43:00', '26/07/2011 8:44:00', '26/07/2011 8:45:00', '26/07/2011 8:46:00', '26/07/2011 8:47:00', '26/07/2011 8:48:00', '26/07/2011 8:49:00', '26/07/2011 8:50:00', '26/07/2011 8:51:00', '26/07/2011 8:52:00', '26/07/2011 8:53:00', '26/07/2011 8:54:00', '26/07/2011 8:55:00', '26/07/2011 8:56:00', '26/07/2011 8:57:00', '26/07/2011 8:58:00', '26/07/2011 8:59:00', '26/07/2011 9:00:00', '26/07/2011 9:01:00', '26/07/2011 9:02:00', '26/07/2011 9:03:00', '26/07/2011 9:04:00', '26/07/2011 9:05:00', '26/07/2011 9:06:00', '26/07/2011 9:07:00', '26/07/2011 9:08:00', '26/07/2011 9:09:00', '26/07/2011 9:10:00', '26/07/2011 9:11:00', '26/07/2011 9:12:00', '26/07/2011 9:13:00', '26/07/2011 9:14:00', '26/07/2011 9:15:00', '26/07/2011 9:16:00', '26/07/2011 9:17:00', '26/07/2011 9:18:00', '26/07/2011 9:19:00', '26/07/2011 9:20:00', '26/07/2011 9:21:00', '26/07/2011 9:22:00', '26/07/2011 9:23:00', '26/07/2011 9:24:00', '26/07/2011 9:25:00', '26/07/2011 9:26:00', '26/07/2011 9:27:00', '26/07/2011 9:28:00', '26/07/2011 9:29:00', '26/07/2011 9:30:00', '26/07/2011 9:31:00', '26/07/2011 9:32:00', '26/07/2011 9:33:00', '26/07/2011 9:34:00', '26/07/2011 9:35:00', '26/07/2011 9:36:00', '26/07/2011 9:37:00', '26/07/2011 9:38:00', '26/07/2011 9:39:00', '26/07/2011 9:40:00', '26/07/2011 9:41:00', '26/07/2011 9:42:00', '26/07/2011 9:43:00', '26/07/2011 9:44:00', '26/07/2011 9:45:00', '26/07/2011 9:46:00', '26/07/2011 9:47:00', '26/07/2011 9:48:00', '26/07/2011 9:49:00', '26/07/2011 9:50:00', '26/07/2011 9:51:00', '26/07/2011 9:52:00', '26/07/2011 9:53:00', '26/07/2011 9:54:00', '26/07/2011 9:55:00', '26/07/2011 
9:56:00', '26/07/2011 9:57:00', '26/07/2011 9:58:00', '26/07/2011 9:59:00', '26/07/2011 10:00:00', '26/07/2011 10:01:00', '26/07/2011 10:02:00', '26/07/2011 10:03:00', '26/07/2011 10:04:00', '26/07/2011 10:05:00', '26/07/2011 10:06:00', '26/07/2011 10:07:00', '26/07/2011 10:08:00', '26/07/2011 10:09:00', '26/07/2011 10:10:00', '26/07/2011 10:11:00', '26/07/2011 10:12:00', '26/07/2011 10:13:00', '26/07/2011 10:14:00', '26/07/2011 10:15:00', '26/07/2011 10:16:00', '26/07/2011 10:17:00', '26/07/2011 10:18:00', '26/07/2011 10:19:00', '26/07/2011 10:20:00', '26/07/2011 10:21:00', '26/07/2011 10:22:00', '26/07/2011 10:23:00', '26/07/2011 10:24:00', '26/07/2011 10:25:00', '26/07/2011 10:26:00', '26/07/2011 10:27:00', '26/07/2011 10:28:00', '26/07/2011 10:29:00', '26/07/2011 10:30:00', '26/07/2011 10:31:00', '26/07/2011 10:32:00', '26/07/2011 10:33:00', '26/07/2011 10:34:00', '26/07/2011 10:35:00', '26/07/2011 10:36:00', '26/07/2011 10:37:00', '26/07/2011 10:38:00', '26/07/2011 10:39:00', '26/07/2011 10:40:00', '26/07/2011 10:41:00', '26/07/2011 10:42:00', '26/07/2011 10:43:00', '26/07/2011 10:44:00', '26/07/2011 10:45:00', '26/07/2011 10:46:00', '26/07/2011 10:47:00', '26/07/2011 10:48:00', '26/07/2011 10:49:00', '26/07/2011 10:50:00', '26/07/2011 10:51:00', '26/07/2011 10:52:00', '26/07/2011 10:53:00', '26/07/2011 10:54:00', '26/07/2011 10:55:00', '26/07/2011 10:56:00', '26/07/2011 10:57:00', '26/07/2011 10:58:00', '26/07/2011 10:59:00', '26/07/2011 11:00:00', '26/07/2011 11:01:00', '26/07/2011 11:02:00', '26/07/2011 11:03:00', '26/07/2011 11:04:00', '26/07/2011 11:05:00', '26/07/2011 11:06:00', '26/07/2011 11:07:00', '26/07/2011 11:08:00', '26/07/2011 11:09:00', '26/07/2011 11:10:00', '26/07/2011 11:11:00', '26/07/2011 11:12:00', '26/07/2011 11:13:00', '26/07/2011 11:14:00', '26/07/2011 11:15:00', '26/07/2011 11:16:00', '26/07/2011 11:17:00', '26/07/2011 11:18:00', '26/07/2011 11:19:00', '26/07/2011 11:20:00', '26/07/2011 11:21:00', '26/07/2011 11:22:00', '26/07/2011 
11:23:00', '26/07/2011 11:24:00', '26/07/2011 11:25:00', '26/07/2011 11:26:00', '26/07/2011 11:27:00', '26/07/2011 11:28:00', '26/07/2011 11:29:00', '26/07/2011 11:30:00', '26/07/2011 11:31:00', '26/07/2011 11:32:00', '26/07/2011 11:33:00', '26/07/2011 11:34:00', '26/07/2011 11:35:00', '26/07/2011 11:36:00', '26/07/2011 11:37:00', '26/07/2011 11:38:00', '26/07/2011 11:39:00', '26/07/2011 11:40:00', '26/07/2011 11:41:00', '26/07/2011 11:42:00', '26/07/2011 11:43:00', '26/07/2011 11:44:00', '26/07/2011 11:45:00', '26/07/2011 11:46:00', '26/07/2011 11:47:00', '26/07/2011 11:48:00', '26/07/2011 11:49:00', '26/07/2011 11:50:00', '26/07/2011 11:51:00', '26/07/2011 11:52:00', '26/07/2011 11:53:00', '26/07/2011 11:54:00', '26/07/2011 11:55:00', '26/07/2011 11:56:00', '26/07/2011 11:57:00', '26/07/2011 11:58:00', '26/07/2011 11:59:00', '26/07/2011 12:00:00', '26/07/2011 12:01:00', '26/07/2011 12:02:00', '26/07/2011 12:03:00', '26/07/2011 12:04:00', '26/07/2011 12:05:00', '26/07/2011 12:06:00', '26/07/2011 12:07:00', '26/07/2011 12:08:00', '26/07/2011 12:09:00', '26/07/2011 12:10:00', '26/07/2011 12:11:00', '26/07/2011 12:12:00', '26/07/2011 12:13:00', '26/07/2011 12:14:00', '26/07/2011 12:15:00', '26/07/2011 12:16:00', '26/07/2011 12:17:00', '26/07/2011 12:18:00', '26/07/2011 12:19:00', '26/07/2011 12:20:00', '26/07/2011 12:21:00', '26/07/2011 12:22:00', '26/07/2011 12:23:00', '26/07/2011 12:24:00', '26/07/2011 12:25:00', '26/07/2011 12:26:00', '26/07/2011 12:27:00', '26/07/2011 12:28:00', '26/07/2011 12:29:00', '26/07/2011 12:30:00', '26/07/2011 12:31:00', '26/07/2011 12:32:00', '26/07/2011 12:33:00', '26/07/2011 12:34:00', '26/07/2011 12:35:00', '26/07/2011 12:36:00', '26/07/2011 12:37:00', '26/07/2011 12:38:00', '26/07/2011 12:39:00', '26/07/2011 12:40:00', '26/07/2011 12:41:00', '26/07/2011 12:42:00', '26/07/2011 12:43:00', '26/07/2011 12:44:00', '26/07/2011 12:45:00', '26/07/2011 12:46:00', '26/07/2011 12:47:00', '26/07/2011 12:48:00', '26/07/2011 12:49:00', 
'26/07/2011 12:50:00', '26/07/2011 12:51:00', '26/07/2011 12:52:00', '26/07/2011 12:53:00', '26/07/2011 12:54:00', '26/07/2011 12:55:00', '26/07/2011 12:56:00', '26/07/2011 12:57:00', '26/07/2011 12:58:00', '26/07/2011 12:59:00', '26/07/2011 13:00:00', '26/07/2011 13:01:00', '26/07/2011 13:02:00', '26/07/2011 13:03:00', '26/07/2011 13:04:00', '26/07/2011 13:05:00', '26/07/2011 13:06:00', '26/07/2011 13:07:00', '26/07/2011 13:08:00', '26/07/2011 13:09:00', '26/07/2011 13:10:00', '26/07/2011 13:11:00', '26/07/2011 13:12:00', '26/07/2011 13:13:00', '26/07/2011 13:14:00', '26/07/2011 13:15:00', '26/07/2011 13:16:00', '26/07/2011 13:17:00', '26/07/2011 13:18:00', '26/07/2011 13:19:00', '26/07/2011 13:20:00', '26/07/2011 13:21:00', '26/07/2011 13:22:00', '26/07/2011 13:23:00', '26/07/2011 13:24:00', '26/07/2011 13:25:00', '26/07/2011 13:26:00', '26/07/2011 13:27:00', '26/07/2011 13:28:00', '26/07/2011 13:29:00', '26/07/2011 13:30:00', '26/07/2011 13:31:00', '26/07/2011 13:32:00', '26/07/2011 13:33:00', '26/07/2011 13:34:00', '26/07/2011 13:35:00', '26/07/2011 13:36:00', '26/07/2011 13:37:00', '26/07/2011 13:38:00', '26/07/2011 13:39:00', '26/07/2011 13:40:00', '26/07/2011 13:41:00', '26/07/2011 13:42:00', '26/07/2011 13:43:00', '26/07/2011 13:44:00', '26/07/2011 13:45:00', '26/07/2011 13:46:00', '26/07/2011 13:47:00', '26/07/2011 13:48:00', '26/07/2011 13:49:00', '26/07/2011 13:50:00', '26/07/2011 13:51:00', '26/07/2011 13:52:00', '26/07/2011 13:53:00', '26/07/2011 13:54:00', '26/07/2011 13:55:00', '26/07/2011 13:56:00', '26/07/2011 13:57:00', '26/07/2011 13:58:00', '26/07/2011 13:59:00', '26/07/2011 14:00:00', '26/07/2011 14:01:00', '26/07/2011 14:02:00', '26/07/2011 14:03:00', '26/07/2011 14:04:00', '26/07/2011 14:05:00', '26/07/2011 14:06:00', '26/07/2011 14:07:00', '26/07/2011 14:08:00', '26/07/2011 14:09:00', '26/07/2011 14:10:00', '26/07/2011 14:11:00', '26/07/2011 14:12:00', '26/07/2011 14:13:00', '26/07/2011 14:14:00', '26/07/2011 14:15:00', '26/07/2011 
14:16:00', '26/07/2011 14:17:00', '26/07/2011 14:18:00', '26/07/2011 14:19:00', '26/07/2011 14:20:00', '26/07/2011 14:21:00', '26/07/2011 14:22:00', '26/07/2011 14:23:00', '26/07/2011 14:24:00', '26/07/2011 14:25:00', '26/07/2011 14:26:00', '26/07/2011 14:27:00', '26/07/2011 14:28:00', '26/07/2011 14:29:00', '26/07/2011 14:30:00', '26/07/2011 14:31:00', '26/07/2011 14:32:00', '26/07/2011 14:33:00', '26/07/2011 14:34:00', '26/07/2011 14:35:00', '26/07/2011 14:36:00', '26/07/2011 14:37:00', '26/07/2011 14:38:00', '26/07/2011 14:39:00', '26/07/2011 14:40:00', '26/07/2011 14:41:00', '26/07/2011 14:42:00', '26/07/2011 14:43:00', '26/07/2011 14:44:00', '26/07/2011 14:45:00', '26/07/2011 14:46:00', '26/07/2011 14:47:00', '26/07/2011 14:48:00', '26/07/2011 14:49:00', '26/07/2011 14:50:00', '26/07/2011 14:51:00', '26/07/2011 14:52:00', '26/07/2011 14:53:00', '26/07/2011 14:54:00', '26/07/2011 14:55:00', '26/07/2011 14:56:00', '26/07/2011 14:57:00', '26/07/2011 14:58:00', '26/07/2011 14:59:00', '26/07/2011 15:00:00', '26/07/2011 15:01:00', '26/07/2011 15:02:00', '26/07/2011 15:03:00', '26/07/2011 15:04:00', '26/07/2011 15:05:00', '26/07/2011 15:06:00', '26/07/2011 15:07:00', '26/07/2011 15:08:00', '26/07/2011 15:09:00', '26/07/2011 15:10:00', '26/07/2011 15:11:00', '26/07/2011 15:12:00', '26/07/2011 15:13:00', '26/07/2011 15:14:00', '26/07/2011 15:15:00', '26/07/2011 15:16:00', '26/07/2011 15:17:00', '26/07/2011 15:18:00', '26/07/2011 15:19:00', '26/07/2011 15:20:00', '26/07/2011 15:21:00', '26/07/2011 15:22:00', '26/07/2011 15:23:00', '26/07/2011 15:24:00', '26/07/2011 15:25:00', '26/07/2011 15:26:00', '26/07/2011 15:27:00', '26/07/2011 15:28:00', '26/07/2011 15:29:00', '26/07/2011 15:30:00', '26/07/2011 15:31:00', '26/07/2011 15:32:00', '26/07/2011 15:33:00', '26/07/2011 15:34:00', '26/07/2011 15:35:00', '26/07/2011 15:36:00', '26/07/2011 15:37:00', '26/07/2011 15:38:00', '26/07/2011 15:39:00', '26/07/2011 15:40:00', '26/07/2011 15:41:00', '26/07/2011 15:42:00', 
'26/07/2011 15:43:00', '26/07/2011 15:44:00', '26/07/2011 15:45:00', '26/07/2011 15:46:00', '26/07/2011 15:47:00', '26/07/2011 15:48:00', '26/07/2011 15:49:00', '26/07/2011 15:50:00', '26/07/2011 15:51:00', '26/07/2011 15:52:00', '26/07/2011 15:53:00', '26/07/2011 15:54:00', '26/07/2011 15:55:00', '26/07/2011 15:56:00', '26/07/2011 15:57:00', '26/07/2011 15:58:00', '26/07/2011 15:59:00', '26/07/2011 16:00:00', '26/07/2011 16:01:00', '26/07/2011 16:02:00', '26/07/2011 16:03:00', '26/07/2011 16:04:00', '26/07/2011 16:05:00', '26/07/2011 16:06:00', '26/07/2011 16:07:00', '26/07/2011 16:08:00', '26/07/2011 16:09:00', '26/07/2011 16:10:00', '26/07/2011 16:11:00', '26/07/2011 16:12:00', '26/07/2011 16:13:00', '26/07/2011 16:14:00', '26/07/2011 16:15:00', '26/07/2011 16:16:00', '26/07/2011 16:17:00', '26/07/2011 16:18:00', '26/07/2011 16:19:00', '26/07/2011 16:20:00', '26/07/2011 16:21:00', '26/07/2011 16:22:00', '26/07/2011 16:23:00', '26/07/2011 16:24:00', '26/07/2011 16:25:00', '26/07/2011 16:26:00', '26/07/2011 16:27:00', '26/07/2011 16:28:00', '26/07/2011 16:29:00', '26/07/2011 16:30:00', '26/07/2011 16:31:00', '26/07/2011 16:32:00', '26/07/2011 16:33:00', '26/07/2011 16:34:00', '26/07/2011 16:35:00', '26/07/2011 16:36:00', '26/07/2011 16:37:00', '26/07/2011 16:38:00', '26/07/2011 16:39:00', '26/07/2011 16:40:00', '26/07/2011 16:41:00', '26/07/2011 16:42:00', '26/07/2011 16:43:00', '26/07/2011 16:44:00', '26/07/2011 16:45:00', '26/07/2011 16:46:00', '26/07/2011 16:47:00', '26/07/2011 16:48:00', '26/07/2011 16:49:00', '26/07/2011 16:50:00', '26/07/2011 16:51:00', '26/07/2011 16:52:00', '26/07/2011 16:53:00', '26/07/2011 16:54:00', '26/07/2011 16:55:00', '26/07/2011 16:56:00', '26/07/2011 16:57:00', '26/07/2011 16:58:00', '26/07/2011 16:59:00', '26/07/2011 17:00:00', '26/07/2011 17:01:00', '26/07/2011 17:02:00', '26/07/2011 17:03:00', '26/07/2011 17:04:00', '26/07/2011 17:05:00', '26/07/2011 17:06:00', '26/07/2011 17:07:00', '26/07/2011 17:08:00', '26/07/2011 
17:09:00', '26/07/2011 17:10:00', '26/07/2011 17:11:00', '26/07/2011 17:12:00', '26/07/2011 17:13:00', '26/07/2011 17:14:00', '26/07/2011 17:15:00', '26/07/2011 17:16:00', '26/07/2011 17:17:00', '26/07/2011 17:18:00', '26/07/2011 17:19:00', '26/07/2011 17:20:00', '26/07/2011 17:21:00', '26/07/2011 17:22:00', '26/07/2011 17:23:00', '26/07/2011 17:24:00', '26/07/2011 17:25:00', '26/07/2011 17:26:00', '26/07/2011 17:27:00', '26/07/2011 17:28:00', '26/07/2011 17:29:00', '26/07/2011 17:30:00', '26/07/2011 17:31:00', '26/07/2011 17:32:00', '26/07/2011 17:33:00', '26/07/2011 17:34:00', '26/07/2011 17:35:00', '26/07/2011 17:36:00', '26/07/2011 17:37:00', '26/07/2011 17:38:00', '26/07/2011 17:39:00', '26/07/2011 17:40:00', '26/07/2011 17:41:00', '26/07/2011 17:42:00', '26/07/2011 17:43:00', '26/07/2011 17:44:00', '26/07/2011 17:45:00', '26/07/2011 17:46:00', '26/07/2011 17:47:00', '26/07/2011 17:48:00', '26/07/2011 17:49:00', '26/07/2011 17:50:00', '26/07/2011 17:51:00', '26/07/2011 17:52:00', '26/07/2011 17:53:00', '26/07/2011 17:54:00', '26/07/2011 17:55:00', '26/07/2011 17:56:00', '26/07/2011 17:57:00', '26/07/2011 17:58:00', '26/07/2011 17:59:00', '26/07/2011 18:00:00', '26/07/2011 18:01:00', '26/07/2011 18:02:00', '26/07/2011 18:03:00', '26/07/2011 18:04:00', '26/07/2011 18:05:00', '26/07/2011 18:06:00', '26/07/2011 18:07:00', '26/07/2011 18:08:00', '26/07/2011 18:09:00', '26/07/2011 18:10:00', '26/07/2011 18:11:00', '26/07/2011 18:12:00', '26/07/2011 18:13:00', '26/07/2011 18:14:00', '26/07/2011 18:15:00', '26/07/2011 18:16:00', '26/07/2011 18:17:00', '26/07/2011 18:18:00', '26/07/2011 18:19:00', '26/07/2011 18:20:00', '26/07/2011 18:21:00', '26/07/2011 18:22:00', '26/07/2011 18:23:00', '26/07/2011 18:24:00', '26/07/2011 18:25:00', '26/07/2011 18:26:00', '26/07/2011 18:27:00', '26/07/2011 18:28:00', '26/07/2011 18:29:00', '26/07/2011 18:30:00', '26/07/2011 18:31:00', '26/07/2011 18:32:00', '26/07/2011 18:33:00', '26/07/2011 18:34:00', '26/07/2011 18:35:00', 
'26/07/2011 18:36:00', '26/07/2011 18:37:00', '26/07/2011 18:38:00', '26/07/2011 18:39:00', '26/07/2011 18:40:00', '26/07/2011 18:41:00', '26/07/2011 18:42:00', '26/07/2011 18:43:00', '26/07/2011 18:44:00', '26/07/2011 18:45:00', '26/07/2011 18:46:00', '26/07/2011 18:47:00', '26/07/2011 18:48:00', '26/07/2011 18:49:00', '26/07/2011 18:50:00', '26/07/2011 18:51:00', '26/07/2011 18:52:00', '26/07/2011 18:53:00', '26/07/2011 18:54:00', '26/07/2011 18:55:00', '26/07/2011 18:56:00', '26/07/2011 18:57:00', '26/07/2011 18:58:00', '26/07/2011 18:59:00', '26/07/2011 19:00:00', '26/07/2011 19:01:00', '26/07/2011 19:02:00', '26/07/2011 19:03:00', '26/07/2011 19:04:00', '26/07/2011 19:05:00', '26/07/2011 19:06:00', '26/07/2011 19:07:00', '26/07/2011 19:08:00', '26/07/2011 19:09:00', '26/07/2011 19:10:00', '26/07/2011 19:11:00', '26/07/2011 19:12:00', '26/07/2011 19:13:00', '26/07/2011 19:14:00', '26/07/2011 19:15:00', '26/07/2011 19:16:00', '26/07/2011 19:17:00', '26/07/2011 19:18:00', '26/07/2011 19:19:00', '26/07/2011 19:20:00', '26/07/2011 19:21:00', '26/07/2011 19:22:00', '26/07/2011 19:23:00', '26/07/2011 19:24:00', '26/07/2011 19:25:00', '26/07/2011 19:26:00', '26/07/2011 19:27:00', '26/07/2011 19:28:00', '26/07/2011 19:29:00', '26/07/2011 19:30:00', '26/07/2011 19:31:00', '26/07/2011 19:32:00', '26/07/2011 19:33:00', '26/07/2011 19:34:00', '26/07/2011 19:35:00', '26/07/2011 19:36:00', '26/07/2011 19:37:00', '26/07/2011 19:38:00', '26/07/2011 19:39:00', '26/07/2011 19:40:00', '26/07/2011 19:41:00', '26/07/2011 19:42:00', '26/07/2011 19:43:00', '26/07/2011 19:44:00', '26/07/2011 19:45:00', '26/07/2011 19:46:00', '26/07/2011 19:47:00', '26/07/2011 19:48:00', '26/07/2011 19:49:00', '26/07/2011 19:50:00', '26/07/2011 19:51:00', '26/07/2011 19:52:00', '26/07/2011 19:53:00', '26/07/2011 19:54:00', '26/07/2011 19:55:00', '26/07/2011 19:56:00', '26/07/2011 19:57:00', '26/07/2011 19:58:00', '26/07/2011 19:59:00', '26/07/2011 20:00:00', '26/07/2011 20:01:00', '26/07/2011 
20:02:00', '26/07/2011 20:03:00', '26/07/2011 20:04:00', '26/07/2011 20:05:00', '26/07/2011 20:06:00', '26/07/2011 20:07:00', '26/07/2011 20:08:00', '26/07/2011 20:09:00', '26/07/2011 20:10:00', '26/07/2011 20:11:00', '26/07/2011 20:12:00', '26/07/2011 20:13:00', '26/07/2011 20:14:00', '26/07/2011 20:15:00', '26/07/2011 20:16:00', '26/07/2011 20:17:00', '26/07/2011 20:18:00', '26/07/2011 20:19:00', '26/07/2011 20:20:00', '26/07/2011 20:21:00', '26/07/2011 20:22:00', '26/07/2011 20:23:00', '26/07/2011 20:24:00', '26/07/2011 20:25:00', '26/07/2011 20:26:00', '26/07/2011 20:27:00', '26/07/2011 20:28:00', '26/07/2011 20:29:00', '26/07/2011 20:30:00', '26/07/2011 20:31:00', '26/07/2011 20:32:00', '26/07/2011 20:33:00', '26/07/2011 20:34:00', '26/07/2011 20:35:00', '26/07/2011 20:36:00', '26/07/2011 20:37:00', '26/07/2011 20:38:00', '26/07/2011 20:39:00', '26/07/2011 20:40:00', '26/07/2011 20:41:00', '26/07/2011 20:42:00', '26/07/2011 20:43:00', '26/07/2011 20:44:00', '26/07/2011 20:45:00', '26/07/2011 20:46:00', '26/07/2011 20:47:00', '26/07/2011 20:48:00', '26/07/2011 20:49:00', '26/07/2011 20:50:00', '26/07/2011 20:51:00', '26/07/2011 20:52:00', '26/07/2011 20:53:00', '26/07/2011 20:54:00', '26/07/2011 20:55:00', '26/07/2011 20:56:00', '26/07/2011 20:57:00', '26/07/2011 20:58:00', '26/07/2011 20:59:00', '26/07/2011 21:00:00', '26/07/2011 21:01:00', '26/07/2011 21:02:00', '26/07/2011 21:03:00', '26/07/2011 21:04:00', '26/07/2011 21:05:00', '26/07/2011 21:06:00', '26/07/2011 21:07:00', '26/07/2011 21:08:00', '26/07/2011 21:09:00', '26/07/2011 21:10:00', '26/07/2011 21:11:00', '26/07/2011 21:12:00', '26/07/2011 21:13:00', '26/07/2011 21:14:00', '26/07/2011 21:15:00', '26/07/2011 21:16:00', '26/07/2011 21:17:00', '26/07/2011 21:18:00', '26/07/2011 21:19:00', '26/07/2011 21:20:00', '26/07/2011 21:21:00', '26/07/2011 21:22:00', '26/07/2011 21:23:00', '26/07/2011 21:24:00', '26/07/2011 21:25:00', '26/07/2011 21:26:00', '26/07/2011 21:27:00', '26/07/2011 21:28:00', 
'26/07/2011 21:29:00', '26/07/2011 21:30:00', '26/07/2011 21:31:00', '26/07/2011 21:32:00', '26/07/2011 21:33:00', '26/07/2011 21:34:00', '26/07/2011 21:35:00', '26/07/2011 21:36:00', '26/07/2011 21:37:00', '26/07/2011 21:38:00', '26/07/2011 21:39:00', '26/07/2011 21:40:00', '26/07/2011 21:41:00', '26/07/2011 21:42:00', '26/07/2011 21:43:00', '26/07/2011 21:44:00', '26/07/2011 21:45:00', '26/07/2011 21:46:00', '26/07/2011 21:47:00', '26/07/2011 21:48:00', '26/07/2011 21:49:00', '26/07/2011 21:50:00', '26/07/2011 21:51:00', '26/07/2011 21:52:00', '26/07/2011 21:53:00', '26/07/2011 21:54:00', '26/07/2011 21:55:00', '26/07/2011 21:56:00', '26/07/2011 21:57:00', '26/07/2011 21:58:00', '26/07/2011 21:59:00', '26/07/2011 22:00:00', '26/07/2011 22:01:00', '26/07/2011 22:02:00', '26/07/2011 22:03:00', '26/07/2011 22:04:00', '26/07/2011 22:05:00', '26/07/2011 22:06:00', '26/07/2011 22:07:00', '26/07/2011 22:08:00', '26/07/2011 22:09:00', '26/07/2011 22:10:00', '26/07/2011 22:11:00', '26/07/2011 22:12:00', '26/07/2011 22:13:00', '26/07/2011 22:14:00', '26/07/2011 22:15:00', '26/07/2011 22:16:00', '26/07/2011 22:17:00', '26/07/2011 22:18:00', '26/07/2011 22:19:00', '26/07/2011 22:20:00', '26/07/2011 22:21:00', '26/07/2011 22:22:00', '26/07/2011 22:23:00', '26/07/2011 22:24:00', '26/07/2011 22:25:00', '26/07/2011 22:26:00', '26/07/2011 22:27:00', '26/07/2011 22:28:00', '26/07/2011 22:29:00', '26/07/2011 22:30:00', '26/07/2011 22:31:00', '26/07/2011 22:32:00', '26/07/2011 22:33:00', '26/07/2011 22:34:00', '26/07/2011 22:35:00', '26/07/2011 22:36:00', '26/07/2011 22:37:00', '26/07/2011 22:38:00', '26/07/2011 22:39:00', '26/07/2011 22:40:00', '26/07/2011 22:41:00', '26/07/2011 22:42:00', '26/07/2011 22:43:00', '26/07/2011 22:44:00', '26/07/2011 22:45:00', '26/07/2011 22:46:00', '26/07/2011 22:47:00', '26/07/2011 22:48:00', '26/07/2011 22:49:00', '26/07/2011 22:50:00', '26/07/2011 22:51:00', '26/07/2011 22:52:00', '26/07/2011 22:53:00', '26/07/2011 22:54:00', '26/07/2011 
22:55:00', '26/07/2011 22:56:00', '26/07/2011 22:57:00', '26/07/2011 22:58:00', '26/07/2011 22:59:00', '26/07/2011 23:00:00', '26/07/2011 23:01:00', '26/07/2011 23:02:00', '26/07/2011 23:03:00', '26/07/2011 23:04:00', '26/07/2011 23:05:00', '26/07/2011 23:06:00', '26/07/2011 23:07:00', '26/07/2011 23:08:00', '26/07/2011 23:09:00', '26/07/2011 23:10:00', '26/07/2011 23:11:00', '26/07/2011 23:12:00', '26/07/2011 23:13:00', '26/07/2011 23:14:00', '26/07/2011 23:15:00', '26/07/2011 23:16:00', '26/07/2011 23:17:00', '26/07/2011 23:18:00', '26/07/2011 23:19:00', '26/07/2011 23:20:00', '26/07/2011 23:21:00', '26/07/2011 23:22:00', '26/07/2011 23:23:00', '26/07/2011 23:24:00', '26/07/2011 23:25:00', '26/07/2011 23:26:00', '26/07/2011 23:27:00', '26/07/2011 23:28:00', '26/07/2011 23:29:00', '26/07/2011 23:30:00', '26/07/2011 23:31:00', '26/07/2011 23:32:00', '26/07/2011 23:33:00', '26/07/2011 23:34:00', '26/07/2011 23:35:00', '26/07/2011 23:36:00', '26/07/2011 23:37:00', '26/07/2011 23:38:00', '26/07/2011 23:39:00', '26/07/2011 23:40:00', '26/07/2011 23:41:00', '26/07/2011 23:42:00', '26/07/2011 23:43:00', '26/07/2011 23:44:00', '26/07/2011 23:45:00', '26/07/2011 23:46:00', '26/07/2011 23:47:00', '26/07/2011 23:48:00', '26/07/2011 23:49:00', '26/07/2011 23:50:00', '26/07/2011 23:51:00', '26/07/2011 23:52:00', '26/07/2011 23:53:00', '26/07/2011 23:54:00', '26/07/2011 23:55:00', '26/07/2011 23:56:00', '26/07/2011 23:57:00', '26/07/2011 23:58:00', '26/07/2011 23:59:00', '27/07/2011 0:00:00', '27/07/2011 0:01:00', '27/07/2011 0:02:00', '27/07/2011 0:03:00', '27/07/2011 0:04:00', '27/07/2011 0:05:00', '27/07/2011 0:06:00', '27/07/2011 0:07:00', '27/07/2011 0:08:00', '27/07/2011 0:09:00', '27/07/2011 0:10:00', '27/07/2011 0:11:00', '27/07/2011 0:12:00', '27/07/2011 0:13:00', '27/07/2011 0:14:00', '27/07/2011 0:15:00', '27/07/2011 0:16:00', '27/07/2011 0:17:00', '27/07/2011 0:18:00', '27/07/2011 0:19:00', '27/07/2011 0:20:00', '27/07/2011 0:21:00', '27/07/2011 0:22:00', 
'27/07/2011 0:23:00', '27/07/2011 0:24:00', '27/07/2011 0:25:00', '27/07/2011 0:26:00', '27/07/2011 0:27:00', '27/07/2011 0:28:00', '27/07/2011 0:29:00', '27/07/2011 0:30:00', '27/07/2011 0:31:00', '27/07/2011 0:32:00', '27/07/2011 0:33:00', '27/07/2011 0:34:00', '27/07/2011 0:35:00', '27/07/2011 0:36:00', '27/07/2011 0:37:00', '27/07/2011 0:38:00', '27/07/2011 0:39:00', '27/07/2011 0:40:00', '27/07/2011 0:41:00', '27/07/2011 0:42:00', '27/07/2011 0:43:00', '27/07/2011 0:44:00', '27/07/2011 0:45:00', '27/07/2011 0:46:00', '27/07/2011 0:47:00', '27/07/2011 0:48:00', '27/07/2011 0:49:00', '27/07/2011 0:50:00', '27/07/2011 0:51:00', '27/07/2011 0:52:00', '27/07/2011 0:53:00', '27/07/2011 0:54:00', '27/07/2011 0:55:00', '27/07/2011 0:56:00', '27/07/2011 0:57:00', '27/07/2011 0:58:00', '27/07/2011 0:59:00', '27/07/2011 1:00:00', '27/07/2011 1:01:00', '27/07/2011 1:02:00', '27/07/2011 1:03:00', '27/07/2011 1:04:00', '27/07/2011 1:05:00', '27/07/2011 1:06:00', '27/07/2011 1:07:00', '27/07/2011 1:08:00', '27/07/2011 1:09:00', '27/07/2011 1:10:00', '27/07/2011 1:11:00', '27/07/2011 1:12:00', '27/07/2011 1:13:00', '27/07/2011 1:14:00', '27/07/2011 1:15:00', '27/07/2011 1:16:00', '27/07/2011 1:17:00', '27/07/2011 1:18:00', '27/07/2011 1:19:00', '27/07/2011 1:20:00', '27/07/2011 1:21:00', '27/07/2011 1:22:00', '27/07/2011 1:23:00', '27/07/2011 1:24:00', '27/07/2011 1:25:00', '27/07/2011 1:26:00', '27/07/2011 1:27:00', '27/07/2011 1:28:00', '27/07/2011 1:29:00', '27/07/2011 1:30:00', '27/07/2011 1:31:00', '27/07/2011 1:32:00', '27/07/2011 1:33:00', '27/07/2011 1:34:00', '27/07/2011 1:35:00', '27/07/2011 1:36:00', '27/07/2011 1:37:00', '27/07/2011 1:38:00', '27/07/2011 1:39:00', '27/07/2011 1:40:00', '27/07/2011 1:41:00', '27/07/2011 1:42:00', '27/07/2011 1:43:00', '27/07/2011 1:44:00', '27/07/2011 1:45:00', '27/07/2011 1:46:00', '27/07/2011 1:47:00', '27/07/2011 1:48:00', '27/07/2011 1:49:00', '27/07/2011 1:50:00', '27/07/2011 1:51:00', '27/07/2011 1:52:00', '27/07/2011 
1:53:00', '27/07/2011 1:54:00', '27/07/2011 1:55:00', '27/07/2011 1:56:00', '27/07/2011 1:57:00', '27/07/2011 1:58:00', '27/07/2011 1:59:00', '27/07/2011 2:00:00', '27/07/2011 2:01:00', '27/07/2011 2:02:00', '27/07/2011 2:03:00', '27/07/2011 2:04:00', '27/07/2011 2:05:00', '27/07/2011 2:06:00', '27/07/2011 2:07:00', '27/07/2011 2:08:00', '27/07/2011 2:09:00', '27/07/2011 2:10:00', '27/07/2011 2:11:00', '27/07/2011 2:12:00', '27/07/2011 2:13:00', '27/07/2011 2:14:00', '27/07/2011 2:15:00', '27/07/2011 2:16:00', '27/07/2011 2:17:00', '27/07/2011 2:18:00', '27/07/2011 2:19:00', '27/07/2011 2:20:00', '27/07/2011 2:21:00', '27/07/2011 2:22:00', '27/07/2011 2:23:00', '27/07/2011 2:24:00', '27/07/2011 2:25:00', '27/07/2011 2:26:00', '27/07/2011 2:27:00', '27/07/2011 2:28:00', '27/07/2011 2:29:00', '27/07/2011 2:30:00', '27/07/2011 2:31:00', '27/07/2011 2:32:00', '27/07/2011 2:33:00', '27/07/2011 2:34:00', '27/07/2011 2:35:00', '27/07/2011 2:36:00', '27/07/2011 2:37:00', '27/07/2011 2:38:00', '27/07/2011 2:39:00', '27/07/2011 2:40:00', '27/07/2011 2:41:00', '27/07/2011 2:42:00', '27/07/2011 2:43:00', '27/07/2011 2:44:00', '27/07/2011 2:45:00', '27/07/2011 2:46:00', '27/07/2011 2:47:00', '27/07/2011 2:48:00', '27/07/2011 2:49:00', '27/07/2011 2:50:00', '27/07/2011 2:51:00', '27/07/2011 2:52:00', '27/07/2011 2:53:00', '27/07/2011 2:54:00', '27/07/2011 2:55:00', '27/07/2011 2:56:00', '27/07/2011 2:57:00', '27/07/2011 2:58:00', '27/07/2011 2:59:00', '27/07/2011 3:00:00', '27/07/2011 3:01:00', '27/07/2011 3:02:00', '27/07/2011 3:03:00', '27/07/2011 3:04:00', '27/07/2011 3:05:00', '27/07/2011 3:06:00', '27/07/2011 3:07:00', '27/07/2011 3:08:00', '27/07/2011 3:09:00', '27/07/2011 3:10:00', '27/07/2011 3:11:00', '27/07/2011 3:12:00', '27/07/2011 3:13:00', '27/07/2011 3:14:00', '27/07/2011 3:15:00', '27/07/2011 3:16:00', '27/07/2011 3:17:00', '27/07/2011 3:18:00', '27/07/2011 3:19:00', '27/07/2011 3:20:00', '27/07/2011 3:21:00', '27/07/2011 3:22:00', '27/07/2011 3:23:00', 
'27/07/2011 3:24:00', '27/07/2011 3:25:00', '27/07/2011 3:26:00', '27/07/2011 3:27:00', '27/07/2011 3:28:00', '27/07/2011 3:29:00', '27/07/2011 3:30:00', '27/07/2011 3:31:00', '27/07/2011 3:32:00', '27/07/2011 3:33:00', '27/07/2011 3:34:00', '27/07/2011 3:35:00', '27/07/2011 3:36:00', '27/07/2011 3:37:00', '27/07/2011 3:38:00', '27/07/2011 3:39:00', '27/07/2011 3:40:00', '27/07/2011 3:41:00', '27/07/2011 3:42:00', '27/07/2011 3:43:00', '27/07/2011 3:44:00', '27/07/2011 3:45:00', '27/07/2011 3:46:00', '27/07/2011 3:47:00', '27/07/2011 3:48:00', '27/07/2011 3:49:00', '27/07/2011 3:50:00', '27/07/2011 3:51:00', '27/07/2011 3:52:00', '27/07/2011 3:53:00', '27/07/2011 3:54:00', '27/07/2011 3:55:00', '27/07/2011 3:56:00', '27/07/2011 3:57:00', '27/07/2011 3:58:00', '27/07/2011 3:59:00', '27/07/2011 4:00:00', '27/07/2011 4:01:00', '27/07/2011 4:02:00', '27/07/2011 4:03:00', '27/07/2011 4:04:00', '27/07/2011 4:05:00', '27/07/2011 4:06:00', '27/07/2011 4:07:00', '27/07/2011 4:08:00', '27/07/2011 4:09:00', '27/07/2011 4:10:00', '27/07/2011 4:11:00', '27/07/2011 4:12:00', '27/07/2011 4:13:00', '27/07/2011 4:14:00', '27/07/2011 4:15:00', '27/07/2011 4:16:00', '27/07/2011 4:17:00', '27/07/2011 4:18:00', '27/07/2011 4:19:00', '27/07/2011 4:20:00', '27/07/2011 4:21:00', '27/07/2011 4:22:00', '27/07/2011 4:23:00', '27/07/2011 4:24:00', '27/07/2011 4:25:00', '27/07/2011 4:26:00', '27/07/2011 4:27:00', '27/07/2011 4:28:00', '27/07/2011 4:29:00', '27/07/2011 4:30:00', '27/07/2011 4:31:00', '27/07/2011 4:32:00', '27/07/2011 4:33:00', '27/07/2011 4:34:00', '27/07/2011 4:35:00', '27/07/2011 4:36:00', '27/07/2011 4:37:00', '27/07/2011 4:38:00', '27/07/2011 4:39:00', '27/07/2011 4:40:00', '27/07/2011 4:41:00', '27/07/2011 4:42:00', '27/07/2011 4:43:00', '27/07/2011 4:44:00', '27/07/2011 4:45:00', '27/07/2011 4:46:00', '27/07/2011 4:47:00', '27/07/2011 4:48:00', '27/07/2011 4:49:00', '27/07/2011 4:50:00', '27/07/2011 4:51:00', '27/07/2011 4:52:00', '27/07/2011 4:53:00', '27/07/2011 
4:54:00', '27/07/2011 4:55:00', '27/07/2011 4:56:00', '27/07/2011 4:57:00', '27/07/2011 4:58:00', '27/07/2011 4:59:00', '27/07/2011 5:00:00', '27/07/2011 5:01:00', '27/07/2011 5:02:00', '27/07/2011 5:03:00', '27/07/2011 5:04:00', '27/07/2011 5:05:00', '27/07/2011 5:06:00', '27/07/2011 5:07:00', '27/07/2011 5:08:00', '27/07/2011 5:09:00', '27/07/2011 5:10:00', '27/07/2011 5:11:00', '27/07/2011 5:12:00', '27/07/2011 5:13:00', '27/07/2011 5:14:00', '27/07/2011 5:15:00', '27/07/2011 5:16:00', '27/07/2011 5:17:00', '27/07/2011 5:18:00', '27/07/2011 5:19:00', '27/07/2011 5:20:00', '27/07/2011 5:21:00', '27/07/2011 5:22:00', '27/07/2011 5:23:00', '27/07/2011 5:24:00', '27/07/2011 5:25:00', '27/07/2011 5:26:00', '27/07/2011 5:27:00', '27/07/2011 5:28:00', '27/07/2011 5:29:00', '27/07/2011 5:30:00', '27/07/2011 5:31:00', '27/07/2011 5:32:00', '27/07/2011 5:33:00', '27/07/2011 5:34:00', '27/07/2011 5:35:00', '27/07/2011 5:36:00', '27/07/2011 5:37:00', '27/07/2011 5:38:00', '27/07/2011 5:39:00', '27/07/2011 5:40:00', '27/07/2011 5:41:00', '27/07/2011 5:42:00', '27/07/2011 5:43:00', '27/07/2011 5:44:00', '27/07/2011 5:45:00', '27/07/2011 5:46:00', '27/07/2011 5:47:00', '27/07/2011 5:48:00', '27/07/2011 5:49:00', '27/07/2011 5:50:00', '27/07/2011 5:51:00', '27/07/2011 5:52:00', '27/07/2011 5:53:00', '27/07/2011 5:54:00', '27/07/2011 5:55:00', '27/07/2011 5:56:00', '27/07/2011 5:57:00', '27/07/2011 5:58:00', '27/07/2011 5:59:00', '27/07/2011 6:00:00', '27/07/2011 6:01:00', '27/07/2011 6:02:00', '27/07/2011 6:03:00', '27/07/2011 6:04:00', '27/07/2011 6:05:00', '27/07/2011 6:06:00', '27/07/2011 6:07:00', '27/07/2011 6:08:00', '27/07/2011 6:09:00', '27/07/2011 6:10:00', '27/07/2011 6:11:00', '27/07/2011 6:12:00', '27/07/2011 6:13:00', '27/07/2011 6:14:00', '27/07/2011 6:15:00', '27/07/2011 6:16:00', '27/07/2011 6:17:00', '27/07/2011 6:18:00', '27/07/2011 6:19:00', '27/07/2011 6:20:00', '27/07/2011 6:21:00', '27/07/2011 6:22:00', '27/07/2011 6:23:00', '27/07/2011 6:24:00', 
'27/07/2011 6:25:00', '27/07/2011 6:26:00', '27/07/2011 6:27:00', '27/07/2011 6:28:00', '27/07/2011 6:29:00', '27/07/2011 6:30:00', '27/07/2011 6:31:00', '27/07/2011 6:32:00', '27/07/2011 6:33:00', '27/07/2011 6:34:00', '27/07/2011 6:35:00', '27/07/2011 6:36:00', '27/07/2011 6:37:00', '27/07/2011 6:38:00', '27/07/2011 6:39:00', '27/07/2011 6:40:00', '27/07/2011 6:41:00', '27/07/2011 6:42:00', '27/07/2011 6:43:00', '27/07/2011 6:44:00', '27/07/2011 6:45:00', '27/07/2011 6:46:00', '27/07/2011 6:47:00', '27/07/2011 6:48:00', '27/07/2011 6:49:00', '27/07/2011 6:50:00', '27/07/2011 6:51:00', '27/07/2011 6:52:00', '27/07/2011 6:53:00', '27/07/2011 6:54:00', '27/07/2011 6:55:00', '27/07/2011 6:56:00', '27/07/2011 6:57:00', '27/07/2011 6:58:00', '27/07/2011 6:59:00', '27/07/2011 7:00:00', '27/07/2011 7:01:00', '27/07/2011 7:02:00', '27/07/2011 7:03:00', '27/07/2011 7:04:00', '27/07/2011 7:05:00', '27/07/2011 7:06:00', '27/07/2011 7:07:00', '27/07/2011 7:08:00', '27/07/2011 7:09:00', '27/07/2011 7:10:00', '27/07/2011 7:11:00', '27/07/2011 7:12:00', '27/07/2011 7:13:00', '27/07/2011 7:14:00', '27/07/2011 7:15:00', '27/07/2011 7:16:00', '27/07/2011 7:17:00', '27/07/2011 7:18:00', '27/07/2011 7:19:00', '27/07/2011 7:20:00', '27/07/2011 7:21:00', '27/07/2011 7:22:00', '27/07/2011 7:23:00', '27/07/2011 7:24:00', '27/07/2011 7:25:00', '27/07/2011 7:26:00', '27/07/2011 7:27:00', '27/07/2011 7:28:00', '27/07/2011 7:29:00', '27/07/2011 7:30:00', '27/07/2011 7:31:00', '27/07/2011 7:32:00', '27/07/2011 7:33:00', '27/07/2011 7:34:00', '27/07/2011 7:35:00', '27/07/2011 7:36:00', '27/07/2011 7:37:00', '27/07/2011 7:38:00', '27/07/2011 7:39:00', '27/07/2011 7:40:00', '27/07/2011 7:41:00', '27/07/2011 7:42:00', '27/07/2011 7:43:00', '27/07/2011 7:44:00', '27/07/2011 7:45:00', '27/07/2011 7:46:00', '27/07/2011 7:47:00', '27/07/2011 7:48:00', '27/07/2011 7:49:00', '27/07/2011 7:50:00', '27/07/2011 7:51:00', '27/07/2011 7:52:00', '27/07/2011 7:53:00', '27/07/2011 7:54:00', '27/07/2011 
7:55:00', '27/07/2011 7:56:00', '27/07/2011 7:57:00', '27/07/2011 7:58:00', '27/07/2011 7:59:00', '27/07/2011 8:00:00', '27/07/2011 8:01:00', '27/07/2011 8:02:00', '27/07/2011 8:03:00', '27/07/2011 8:04:00', '27/07/2011 8:05:00', '27/07/2011 8:06:00', '27/07/2011 8:07:00', '27/07/2011 8:08:00', '27/07/2011 8:09:00', '27/07/2011 8:10:00', '27/07/2011 8:11:00', '27/07/2011 8:12:00', '27/07/2011 8:13:00', '27/07/2011 8:14:00', '27/07/2011 8:15:00', '27/07/2011 8:16:00', '27/07/2011 8:17:00', '27/07/2011 8:18:00', '27/07/2011 8:19:00', '27/07/2011 8:20:00', '27/07/2011 8:21:00', '27/07/2011 8:22:00', '27/07/2011 8:23:00', '27/07/2011 8:24:00', '27/07/2011 8:25:00', '27/07/2011 8:26:00', '27/07/2011 8:27:00', '27/07/2011 8:28:00', '27/07/2011 8:29:00', '27/07/2011 8:30:00', '27/07/2011 8:31:00', '27/07/2011 8:32:00', '27/07/2011 8:33:00', '27/07/2011 8:34:00', '27/07/2011 8:35:00', '27/07/2011 8:36:00', '27/07/2011 8:37:00', '27/07/2011 8:38:00', '27/07/2011 8:39:00', '27/07/2011 8:40:00', '27/07/2011 8:41:00', '27/07/2011 8:42:00', '27/07/2011 8:43:00', '27/07/2011 8:44:00', '27/07/2011 8:45:00', '27/07/2011 8:46:00', '27/07/2011 8:47:00', '27/07/2011 8:48:00', '27/07/2011 8:49:00', '27/07/2011 8:50:00', '27/07/2011 8:51:00', '27/07/2011 8:52:00', '27/07/2011 8:53:00', '27/07/2011 8:54:00', '27/07/2011 8:55:00', '27/07/2011 8:56:00', '27/07/2011 8:57:00', '27/07/2011 8:58:00', '27/07/2011 8:59:00', '27/07/2011 9:00:00', '27/07/2011 9:01:00', '27/07/2011 9:02:00', '27/07/2011 9:03:00', '27/07/2011 9:04:00', '27/07/2011 9:05:00', '27/07/2011 9:06:00', '27/07/2011 9:07:00', '27/07/2011 9:08:00', '27/07/2011 9:09:00', '27/07/2011 9:10:00', '27/07/2011 9:11:00', '27/07/2011 9:12:00', '27/07/2011 9:13:00', '27/07/2011 9:14:00', '27/07/2011 9:15:00', '27/07/2011 9:16:00', '27/07/2011 9:17:00', '27/07/2011 9:18:00', '27/07/2011 9:19:00', '27/07/2011 9:20:00', '27/07/2011 9:21:00', '27/07/2011 9:22:00', '27/07/2011 9:23:00', '27/07/2011 9:24:00', '27/07/2011 9:25:00', 
'27/07/2011 9:26:00', '27/07/2011 9:27:00', '27/07/2011 9:28:00', '27/07/2011 9:29:00', '27/07/2011 9:30:00', '27/07/2011 9:31:00', '27/07/2011 9:32:00', '27/07/2011 9:33:00', '27/07/2011 9:34:00', '27/07/2011 9:35:00', '27/07/2011 9:36:00', '27/07/2011 9:37:00', '27/07/2011 9:38:00', '27/07/2011 9:39:00', '27/07/2011 9:40:00', '27/07/2011 9:41:00', '27/07/2011 9:42:00', '27/07/2011 9:43:00', '27/07/2011 9:44:00', '27/07/2011 9:45:00', '27/07/2011 9:46:00', '27/07/2011 9:47:00', '27/07/2011 9:48:00', '27/07/2011 9:49:00', '27/07/2011 9:50:00', '27/07/2011 9:51:00', '27/07/2011 9:52:00', '27/07/2011 9:53:00', '27/07/2011 9:54:00', '27/07/2011 9:55:00', '27/07/2011 9:56:00', '27/07/2011 9:57:00', '27/07/2011 9:58:00', '27/07/2011 9:59:00', '27/07/2011 10:00:00', '27/07/2011 10:01:00', '27/07/2011 10:02:00', '27/07/2011 10:03:00', '27/07/2011 10:04:00', '27/07/2011 10:05:00', '27/07/2011 10:06:00', '27/07/2011 10:07:00', '27/07/2011 10:08:00', '27/07/2011 10:09:00', '27/07/2011 10:10:00', '27/07/2011 10:11:00', '27/07/2011 10:12:00', '27/07/2011 10:13:00', '27/07/2011 10:14:00', '27/07/2011 10:15:00', '27/07/2011 10:16:00', '27/07/2011 10:17:00', '27/07/2011 10:18:00', '27/07/2011 10:19:00', '27/07/2011 10:20:00', '27/07/2011 10:21:00', '27/07/2011 10:22:00', '27/07/2011 10:23:00', '27/07/2011 10:24:00', '27/07/2011 10:25:00', '27/07/2011 10:26:00', '27/07/2011 10:27:00', '27/07/2011 10:28:00', '27/07/2011 10:29:00', '27/07/2011 10:30:00', '27/07/2011 10:31:00', '27/07/2011 10:32:00', '27/07/2011 10:33:00', '27/07/2011 10:34:00', '27/07/2011 10:35:00', '27/07/2011 10:36:00', '27/07/2011 10:37:00', '27/07/2011 10:38:00', '27/07/2011 10:39:00', '27/07/2011 10:40:00', '27/07/2011 10:41:00', '27/07/2011 10:42:00', '27/07/2011 10:43:00', '27/07/2011 10:44:00', '27/07/2011 10:45:00', '27/07/2011 10:46:00', '27/07/2011 10:47:00', '27/07/2011 10:48:00', '27/07/2011 10:49:00', '27/07/2011 10:50:00', '27/07/2011 10:51:00', '27/07/2011 10:52:00', '27/07/2011 10:53:00', 
'27/07/2011 10:54:00', '27/07/2011 10:55:00', '27/07/2011 10:56:00', '27/07/2011 10:57:00', '27/07/2011 10:58:00', '27/07/2011 10:59:00', '27/07/2011 11:00:00', '27/07/2011 11:01:00', '27/07/2011 11:02:00', '27/07/2011 11:03:00', '27/07/2011 11:04:00', '27/07/2011 11:05:00', '27/07/2011 11:06:00', '27/07/2011 11:07:00', '27/07/2011 11:08:00', '27/07/2011 11:09:00', '27/07/2011 11:10:00', '27/07/2011 11:11:00', '27/07/2011 11:12:00', '27/07/2011 11:13:00', '27/07/2011 11:14:00', '27/07/2011 11:15:00', '27/07/2011 11:16:00', '27/07/2011 11:17:00', '27/07/2011 11:18:00', '27/07/2011 11:19:00', '27/07/2011 11:20:00', '27/07/2011 11:21:00', '27/07/2011 11:22:00', '27/07/2011 11:23:00', '27/07/2011 11:24:00', '27/07/2011 11:25:00', '27/07/2011 11:26:00', '27/07/2011 11:27:00', '27/07/2011 11:28:00', '27/07/2011 11:29:00', '27/07/2011 11:30:00', '27/07/2011 11:31:00', '27/07/2011 11:32:00', '27/07/2011 11:33:00', '27/07/2011 11:34:00', '27/07/2011 11:35:00', '27/07/2011 11:36:00', '27/07/2011 11:37:00', '27/07/2011 11:38:00', '27/07/2011 11:39:00', '27/07/2011 11:40:00', '27/07/2011 11:41:00', '27/07/2011 11:42:00', '27/07/2011 11:43:00', '27/07/2011 11:44:00', '27/07/2011 11:45:00', '27/07/2011 11:46:00', '27/07/2011 11:47:00', '27/07/2011 11:48:00', '27/07/2011 11:49:00', '27/07/2011 11:50:00', '27/07/2011 11:51:00', '27/07/2011 11:52:00', '27/07/2011 11:53:00', '27/07/2011 11:54:00', '27/07/2011 11:55:00', '27/07/2011 11:56:00', '27/07/2011 11:57:00', '27/07/2011 11:58:00', '27/07/2011 11:59:00', '27/07/2011 12:00:00', '27/07/2011 12:01:00', '27/07/2011 12:02:00', '27/07/2011 12:03:00', '27/07/2011 12:04:00', '27/07/2011 12:05:00', '27/07/2011 12:06:00', '27/07/2011 12:07:00', '27/07/2011 12:08:00', '27/07/2011 12:09:00', '27/07/2011 12:10:00', '27/07/2011 12:11:00', '27/07/2011 12:12:00', '27/07/2011 12:13:00', '27/07/2011 12:14:00', '27/07/2011 12:15:00', '27/07/2011 12:16:00', '27/07/2011 12:17:00', '27/07/2011 12:18:00', '27/07/2011 12:19:00', '27/07/2011 
12:20:00', '27/07/2011 12:21:00', '27/07/2011 12:22:00', '27/07/2011 12:23:00', '27/07/2011 12:24:00', '27/07/2011 12:25:00', '27/07/2011 12:26:00', '27/07/2011 12:27:00', '27/07/2011 12:28:00', '27/07/2011 12:29:00', '27/07/2011 12:30:00', '27/07/2011 12:31:00', '27/07/2011 12:32:00', '27/07/2011 12:33:00', '27/07/2011 12:34:00', '27/07/2011 12:35:00', '27/07/2011 12:36:00', '27/07/2011 12:37:00', '27/07/2011 12:38:00', '27/07/2011 12:39:00', '27/07/2011 12:40:00', '27/07/2011 12:41:00', '27/07/2011 12:42:00', '27/07/2011 12:43:00', '27/07/2011 12:44:00', '27/07/2011 12:45:00', '27/07/2011 12:46:00', '27/07/2011 12:47:00', '27/07/2011 12:48:00', '27/07/2011 12:49:00', '27/07/2011 12:50:00', '27/07/2011 12:51:00', '27/07/2011 12:52:00', '27/07/2011 12:53:00', '27/07/2011 12:54:00', '27/07/2011 12:55:00', '27/07/2011 12:56:00', '27/07/2011 12:57:00', '27/07/2011 12:58:00', '27/07/2011 12:59:00', '27/07/2011 13:00:00', '27/07/2011 13:01:00', '27/07/2011 13:02:00', '27/07/2011 13:03:00', '27/07/2011 13:04:00', '27/07/2011 13:05:00', '27/07/2011 13:06:00', '27/07/2011 13:07:00', '27/07/2011 13:08:00', '27/07/2011 13:09:00', '27/07/2011 13:10:00', '27/07/2011 13:11:00', '27/07/2011 13:12:00', '27/07/2011 13:13:00', '27/07/2011 13:14:00', '27/07/2011 13:15:00', '27/07/2011 13:16:00', '27/07/2011 13:17:00', '27/07/2011 13:18:00', '27/07/2011 13:19:00', '27/07/2011 13:20:00', '27/07/2011 13:21:00', '27/07/2011 13:22:00', '27/07/2011 13:23:00', '27/07/2011 13:24:00', '27/07/2011 13:25:00', '27/07/2011 13:26:00', '27/07/2011 13:27:00', '27/07/2011 13:28:00', '27/07/2011 13:29:00', '27/07/2011 13:30:00', '27/07/2011 13:31:00', '27/07/2011 13:32:00', '27/07/2011 13:33:00', '27/07/2011 13:34:00', '27/07/2011 13:35:00', '27/07/2011 13:36:00', '27/07/2011 13:37:00', '27/07/2011 13:38:00', '27/07/2011 13:39:00', '27/07/2011 13:40:00', '27/07/2011 13:41:00', '27/07/2011 13:42:00', '27/07/2011 13:43:00', '27/07/2011 13:44:00', '27/07/2011 13:45:00', '27/07/2011 13:46:00', 
'27/07/2011 13:47:00', '27/07/2011 13:48:00', '27/07/2011 13:49:00', '27/07/2011 13:50:00', '27/07/2011 13:51:00', '27/07/2011 13:52:00', '27/07/2011 13:53:00', '27/07/2011 13:54:00', '27/07/2011 13:55:00', '27/07/2011 13:56:00', '27/07/2011 13:57:00', '27/07/2011 13:58:00', '27/07/2011 13:59:00', '27/07/2011 14:00:00', '27/07/2011 14:01:00', '27/07/2011 14:02:00', '27/07/2011 14:03:00', '27/07/2011 14:04:00', '27/07/2011 14:05:00', '27/07/2011 14:06:00', '27/07/2011 14:07:00', '27/07/2011 14:08:00', '27/07/2011 14:09:00', '27/07/2011 14:10:00', '27/07/2011 14:11:00', '27/07/2011 14:12:00', '27/07/2011 14:13:00', '27/07/2011 14:14:00', '27/07/2011 14:15:00', '27/07/2011 14:16:00', '27/07/2011 14:17:00', '27/07/2011 14:18:00', '27/07/2011 14:19:00', '27/07/2011 14:20:00', '27/07/2011 14:21:00', '27/07/2011 14:22:00', '27/07/2011 14:23:00', '27/07/2011 14:24:00', '27/07/2011 14:25:00', '27/07/2011 14:26:00', '27/07/2011 14:27:00', '27/07/2011 14:28:00', '27/07/2011 14:29:00', '27/07/2011 14:30:00', '27/07/2011 14:31:00', '27/07/2011 14:32:00', '27/07/2011 14:33:00', '27/07/2011 14:34:00', '27/07/2011 14:35:00', '27/07/2011 14:36:00', '27/07/2011 14:37:00', '27/07/2011 14:38:00', '27/07/2011 14:39:00', '27/07/2011 14:40:00', '27/07/2011 14:41:00', '27/07/2011 14:42:00', '27/07/2011 14:43:00', '27/07/2011 14:44:00', '27/07/2011 14:45:00', '27/07/2011 14:46:00', '27/07/2011 14:47:00', '27/07/2011 14:48:00', '27/07/2011 14:49:00', '27/07/2011 14:50:00', '27/07/2011 14:51:00', '27/07/2011 14:52:00', '27/07/2011 14:53:00', '27/07/2011 14:54:00', '27/07/2011 14:55:00', '27/07/2011 14:56:00', '27/07/2011 14:57:00', '27/07/2011 14:58:00', '27/07/2011 14:59:00', '27/07/2011 15:00:00', '27/07/2011 15:01:00', '27/07/2011 15:02:00', '27/07/2011 15:03:00', '27/07/2011 15:04:00', '27/07/2011 15:05:00', '27/07/2011 15:06:00', '27/07/2011 15:07:00', '27/07/2011 15:08:00', '27/07/2011 15:09:00', '27/07/2011 15:10:00', '27/07/2011 15:11:00', '27/07/2011 15:12:00', '27/07/2011 
15:13:00', '27/07/2011 15:14:00', '27/07/2011 15:15:00', '27/07/2011 15:16:00', '27/07/2011 15:17:00', '27/07/2011 15:18:00', '27/07/2011 15:19:00', '27/07/2011 15:20:00', '27/07/2011 15:21:00', '27/07/2011 15:22:00', '27/07/2011 15:23:00', '27/07/2011 15:24:00', '27/07/2011 15:25:00', '27/07/2011 15:26:00', '27/07/2011 15:27:00', '27/07/2011 15:28:00', '27/07/2011 15:29:00', '27/07/2011 15:30:00', '27/07/2011 15:31:00', '27/07/2011 15:32:00', '27/07/2011 15:33:00', '27/07/2011 15:34:00', '27/07/2011 15:35:00', '27/07/2011 15:36:00', '27/07/2011 15:37:00', '27/07/2011 15:38:00', '27/07/2011 15:39:00', '27/07/2011 15:40:00', '27/07/2011 15:41:00', '27/07/2011 15:42:00', '27/07/2011 15:43:00', '27/07/2011 15:44:00', '27/07/2011 15:45:00', '27/07/2011 15:46:00', '27/07/2011 15:47:00', '27/07/2011 15:48:00', '27/07/2011 15:49:00', '27/07/2011 15:50:00', '27/07/2011 15:51:00', '27/07/2011 15:52:00', '27/07/2011 15:53:00', '27/07/2011 15:54:00', '27/07/2011 15:55:00', '27/07/2011 15:56:00', '27/07/2011 15:57:00', '27/07/2011 15:58:00', '27/07/2011 15:59:00', '27/07/2011 16:00:00', '27/07/2011 16:01:00', '27/07/2011 16:02:00', '27/07/2011 16:03:00', '27/07/2011 16:04:00', '27/07/2011 16:05:00', '27/07/2011 16:06:00', '27/07/2011 16:07:00', '27/07/2011 16:08:00', '27/07/2011 16:09:00', '27/07/2011 16:10:00', '27/07/2011 16:11:00', '27/07/2011 16:12:00', '27/07/2011 16:13:00', '27/07/2011 16:14:00', '27/07/2011 16:15:00', '27/07/2011 16:16:00', '27/07/2011 16:17:00', '27/07/2011 16:18:00', '27/07/2011 16:19:00', '27/07/2011 16:20:00', '27/07/2011 16:21:00', '27/07/2011 16:22:00', '27/07/2011 16:23:00', '27/07/2011 16:24:00', '27/07/2011 16:25:00', '27/07/2011 16:26:00', '27/07/2011 16:27:00', '27/07/2011 16:28:00', '27/07/2011 16:29:00', '27/07/2011 16:30:00', '27/07/2011 16:31:00', '27/07/2011 16:32:00', '27/07/2011 16:33:00', '27/07/2011 16:34:00', '27/07/2011 16:35:00', '27/07/2011 16:36:00', '27/07/2011 16:37:00', '27/07/2011 16:38:00', '27/07/2011 16:39:00', 
'27/07/2011 16:40:00', '27/07/2011 16:41:00', '27/07/2011 16:42:00', '27/07/2011 16:43:00', '27/07/2011 16:44:00', '27/07/2011 16:45:00', '27/07/2011 16:46:00', '27/07/2011 16:47:00', '27/07/2011 16:48:00', '27/07/2011 16:49:00', '27/07/2011 16:50:00', '27/07/2011 16:51:00', '27/07/2011 16:52:00', '27/07/2011 16:53:00', '27/07/2011 16:54:00', '27/07/2011 16:55:00', '27/07/2011 16:56:00', '27/07/2011 16:57:00', '27/07/2011 16:58:00', '27/07/2011 16:59:00', '27/07/2011 17:00:00', '27/07/2011 17:01:00', '27/07/2011 17:02:00', '27/07/2011 17:03:00', '27/07/2011 17:04:00', '27/07/2011 17:05:00', '27/07/2011 17:06:00', '27/07/2011 17:07:00', '27/07/2011 17:08:00', '27/07/2011 17:09:00', '27/07/2011 17:10:00', '27/07/2011 17:11:00', '27/07/2011 17:12:00', '27/07/2011 17:13:00', '27/07/2011 17:14:00', '27/07/2011 17:15:00', '27/07/2011 17:16:00', '27/07/2011 17:17:00', '27/07/2011 17:18:00', '27/07/2011 17:19:00', '27/07/2011 17:20:00', '27/07/2011 17:21:00', '27/07/2011 17:22:00', '27/07/2011 17:23:00', '27/07/2011 17:24:00', '27/07/2011 17:25:00', '27/07/2011 17:26:00', '27/07/2011 17:27:00', '27/07/2011 17:28:00', '27/07/2011 17:29:00', '27/07/2011 17:30:00', '27/07/2011 17:31:00', '27/07/2011 17:32:00', '27/07/2011 17:33:00', '27/07/2011 17:34:00', '27/07/2011 17:35:00', '27/07/2011 17:36:00', '27/07/2011 17:37:00', '27/07/2011 17:38:00', '27/07/2011 17:39:00', '27/07/2011 17:40:00', '27/07/2011 17:41:00', '27/07/2011 17:42:00', '27/07/2011 17:43:00', '27/07/2011 17:44:00', '27/07/2011 17:45:00', '27/07/2011 17:46:00', '27/07/2011 17:47:00', '27/07/2011 17:48:00', '27/07/2011 17:49:00', '27/07/2011 17:50:00', '27/07/2011 17:51:00', '27/07/2011 17:52:00', '27/07/2011 17:53:00', '27/07/2011 17:54:00', '27/07/2011 17:55:00', '27/07/2011 17:56:00', '27/07/2011 17:57:00', '27/07/2011 17:58:00', '27/07/2011 17:59:00', '27/07/2011 18:00:00', '27/07/2011 18:01:00', '27/07/2011 18:02:00', '27/07/2011 18:03:00', '27/07/2011 18:04:00', '27/07/2011 18:05:00', '27/07/2011 
18:06:00', '27/07/2011 18:07:00', '27/07/2011 18:08:00', '27/07/2011 18:09:00', '27/07/2011 18:10:00', '27/07/2011 18:11:00', '27/07/2011 18:12:00', '27/07/2011 18:13:00', '27/07/2011 18:14:00', '27/07/2011 18:15:00', '27/07/2011 18:16:00', '27/07/2011 18:17:00', '27/07/2011 18:18:00', '27/07/2011 18:19:00', '27/07/2011 18:20:00', '27/07/2011 18:21:00', '27/07/2011 18:22:00', '27/07/2011 18:23:00', '27/07/2011 18:24:00', '27/07/2011 18:25:00', '27/07/2011 18:26:00', '27/07/2011 18:27:00', '27/07/2011 18:28:00', '27/07/2011 18:29:00', '27/07/2011 18:30:00', '27/07/2011 18:31:00', '27/07/2011 18:32:00', '27/07/2011 18:33:00', '27/07/2011 18:34:00', '27/07/2011 18:35:00', '27/07/2011 18:36:00', '27/07/2011 18:37:00', '27/07/2011 18:38:00', '27/07/2011 18:39:00', '27/07/2011 18:40:00', '27/07/2011 18:41:00', '27/07/2011 18:42:00', '27/07/2011 18:43:00', '27/07/2011 18:44:00', '27/07/2011 18:45:00', '27/07/2011 18:46:00', '27/07/2011 18:47:00', '27/07/2011 18:48:00', '27/07/2011 18:49:00', '27/07/2011 18:50:00', '27/07/2011 18:51:00', '27/07/2011 18:52:00', '27/07/2011 18:53:00', '27/07/2011 18:54:00', '27/07/2011 18:55:00', '27/07/2011 18:56:00', '27/07/2011 18:57:00', '27/07/2011 18:58:00', '27/07/2011 18:59:00', '27/07/2011 19:00:00', '27/07/2011 19:01:00', '27/07/2011 19:02:00', '27/07/2011 19:03:00', '27/07/2011 19:04:00', '27/07/2011 19:05:00', '27/07/2011 19:06:00', '27/07/2011 19:07:00', '27/07/2011 19:08:00', '27/07/2011 19:09:00', '27/07/2011 19:10:00', '27/07/2011 19:11:00', '27/07/2011 19:12:00', '27/07/2011 19:13:00', '27/07/2011 19:14:00', '27/07/2011 19:15:00', '27/07/2011 19:16:00', '27/07/2011 19:17:00', '27/07/2011 19:18:00', '27/07/2011 19:19:00', '27/07/2011 19:20:00', '27/07/2011 19:21:00', '27/07/2011 19:22:00', '27/07/2011 19:23:00', '27/07/2011 19:24:00', '27/07/2011 19:25:00', '27/07/2011 19:26:00', '27/07/2011 19:27:00', '27/07/2011 19:28:00', '27/07/2011 19:29:00', '27/07/2011 19:30:00', '27/07/2011 19:31:00', '27/07/2011 19:32:00', 
'27/07/2011 19:33:00', '27/07/2011 19:34:00', '27/07/2011 19:35:00', '27/07/2011 19:36:00', '27/07/2011 19:37:00', '27/07/2011 19:38:00', '27/07/2011 19:39:00', '27/07/2011 19:40:00', '27/07/2011 19:41:00', '27/07/2011 19:42:00', '27/07/2011 19:43:00', '27/07/2011 19:44:00', '27/07/2011 19:45:00', '27/07/2011 19:46:00', '27/07/2011 19:47:00', '27/07/2011 19:48:00', '27/07/2011 19:49:00', '27/07/2011 19:50:00', '27/07/2011 19:51:00', '27/07/2011 19:52:00', '27/07/2011 19:53:00', '27/07/2011 19:54:00', '27/07/2011 19:55:00', '27/07/2011 19:56:00', '27/07/2011 19:57:00', '27/07/2011 19:58:00', '27/07/2011 19:59:00', '27/07/2011 20:00:00', '27/07/2011 20:01:00', '27/07/2011 20:02:00', '27/07/2011 20:03:00', '27/07/2011 20:04:00', '27/07/2011 20:05:00', '27/07/2011 20:06:00', '27/07/2011 20:07:00', '27/07/2011 20:08:00', '27/07/2011 20:09:00', '27/07/2011 20:10:00', '27/07/2011 20:11:00', '27/07/2011 20:12:00', '27/07/2011 20:13:00', '27/07/2011 20:14:00', '27/07/2011 20:15:00', '27/07/2011 20:16:00', '27/07/2011 20:17:00', '27/07/2011 20:18:00', '27/07/2011 20:19:00', '27/07/2011 20:20:00', '27/07/2011 20:21:00', '27/07/2011 20:22:00', '27/07/2011 20:23:00', '27/07/2011 20:24:00', '27/07/2011 20:25:00', '27/07/2011 20:26:00', '27/07/2011 20:27:00', '27/07/2011 20:28:00', '27/07/2011 20:29:00', '27/07/2011 20:30:00', '27/07/2011 20:31:00', '27/07/2011 20:32:00', '27/07/2011 20:33:00', '27/07/2011 20:34:00', '27/07/2011 20:35:00', '27/07/2011 20:36:00', '27/07/2011 20:37:00', '27/07/2011 20:38:00', '27/07/2011 20:39:00', '27/07/2011 20:40:00', '27/07/2011 20:41:00', '27/07/2011 20:42:00', '27/07/2011 20:43:00', '27/07/2011 20:44:00', '27/07/2011 20:45:00', '27/07/2011 20:46:00', '27/07/2011 20:47:00', '27/07/2011 20:48:00', '27/07/2011 20:49:00', '27/07/2011 20:50:00', '27/07/2011 20:51:00', '27/07/2011 20:52:00', '27/07/2011 20:53:00', '27/07/2011 20:54:00', '27/07/2011 20:55:00', '27/07/2011 20:56:00', '27/07/2011 20:57:00', '27/07/2011 20:58:00', '27/07/2011 
20:59:00', '27/07/2011 21:00:00', '27/07/2011 21:01:00', '27/07/2011 21:02:00', '27/07/2011 21:03:00', '27/07/2011 21:04:00', '27/07/2011 21:05:00', '27/07/2011 21:06:00', '27/07/2011 21:07:00', '27/07/2011 21:08:00', '27/07/2011 21:09:00', '27/07/2011 21:10:00', '27/07/2011 21:11:00', '27/07/2011 21:12:00', '27/07/2011 21:13:00', '27/07/2011 21:14:00', '27/07/2011 21:15:00', '27/07/2011 21:16:00', '27/07/2011 21:17:00', '27/07/2011 21:18:00', '27/07/2011 21:19:00', '27/07/2011 21:20:00', '27/07/2011 21:21:00', '27/07/2011 21:22:00', '27/07/2011 21:23:00', '27/07/2011 21:24:00', '27/07/2011 21:25:00', '27/07/2011 21:26:00', '27/07/2011 21:27:00', '27/07/2011 21:28:00', '27/07/2011 21:29:00', '27/07/2011 21:30:00', '27/07/2011 21:31:00', '27/07/2011 21:32:00', '27/07/2011 21:33:00', '27/07/2011 21:34:00', '27/07/2011 21:35:00', '27/07/2011 21:36:00', '27/07/2011 21:37:00', '27/07/2011 21:38:00', '27/07/2011 21:39:00', '27/07/2011 21:40:00', '27/07/2011 21:41:00', '27/07/2011 21:42:00', '27/07/2011 21:43:00', '27/07/2011 21:44:00', '27/07/2011 21:45:00', '27/07/2011 21:46:00', '27/07/2011 21:47:00', '27/07/2011 21:48:00', '27/07/2011 21:49:00', '27/07/2011 21:50:00', '27/07/2011 21:51:00', '27/07/2011 21:52:00', '27/07/2011 21:53:00', '27/07/2011 21:54:00', '27/07/2011 21:55:00', '27/07/2011 21:56:00', '27/07/2011 21:57:00', '27/07/2011 21:58:00', '27/07/2011 21:59:00', '27/07/2011 22:00:00', '27/07/2011 22:01:00', '27/07/2011 22:02:00', '27/07/2011 22:03:00', '27/07/2011 22:04:00', '27/07/2011 22:05:00', '27/07/2011 22:06:00', '27/07/2011 22:07:00', '27/07/2011 22:08:00', '27/07/2011 22:09:00', '27/07/2011 22:10:00', '27/07/2011 22:11:00', '27/07/2011 22:12:00', '27/07/2011 22:13:00', '27/07/2011 22:14:00', '27/07/2011 22:15:00', '27/07/2011 22:16:00', '27/07/2011 22:17:00', '27/07/2011 22:18:00', '27/07/2011 22:19:00', '27/07/2011 22:20:00', '27/07/2011 22:21:00', '27/07/2011 22:22:00', '27/07/2011 22:23:00', '27/07/2011 22:24:00', '27/07/2011 22:25:00', 
'27/07/2011 22:26:00', '27/07/2011 22:27:00', '27/07/2011 22:28:00', '27/07/2011 22:29:00', '27/07/2011 22:30:00', '27/07/2011 22:31:00', '27/07/2011 22:32:00', '27/07/2011 22:33:00', '27/07/2011 22:34:00', '27/07/2011 22:35:00', '27/07/2011 22:36:00', '27/07/2011 22:37:00', '27/07/2011 22:38:00', '27/07/2011 22:39:00', '27/07/2011 22:40:00', '27/07/2011 22:41:00', '27/07/2011 22:42:00', '27/07/2011 22:43:00', '27/07/2011 22:44:00', '27/07/2011 22:45:00', '27/07/2011 22:46:00', '27/07/2011 22:47:00', '27/07/2011 22:48:00', '27/07/2011 22:49:00', '27/07/2011 22:50:00', '27/07/2011 22:51:00', '27/07/2011 22:52:00', '27/07/2011 22:53:00', '27/07/2011 22:54:00', '27/07/2011 22:55:00', '27/07/2011 22:56:00', '27/07/2011 22:57:00', '27/07/2011 22:58:00', '27/07/2011 22:59:00', '27/07/2011 23:00:00', '27/07/2011 23:01:00', '27/07/2011 23:02:00', '27/07/2011 23:03:00', '27/07/2011 23:04:00', '27/07/2011 23:05:00', '27/07/2011 23:06:00', '27/07/2011 23:07:00', '27/07/2011 23:08:00', '27/07/2011 23:09:00', '27/07/2011 23:10:00', '27/07/2011 23:11:00', '27/07/2011 23:12:00', '27/07/2011 23:13:00', '27/07/2011 23:14:00', '27/07/2011 23:15:00', '27/07/2011 23:16:00', '27/07/2011 23:17:00', '27/07/2011 23:18:00', '27/07/2011 23:19:00', '27/07/2011 23:20:00', '27/07/2011 23:21:00', '27/07/2011 23:22:00', '27/07/2011 23:23:00', '27/07/2011 23:24:00', '27/07/2011 23:25:00', '27/07/2011 23:26:00', '27/07/2011 23:27:00', '27/07/2011 23:28:00', '27/07/2011 23:29:00', '27/07/2011 23:30:00', '27/07/2011 23:31:00', '27/07/2011 23:32:00', '27/07/2011 23:33:00', '27/07/2011 23:34:00', '27/07/2011 23:35:00', '27/07/2011 23:36:00', '27/07/2011 23:37:00', '27/07/2011 23:38:00', '27/07/2011 23:39:00', '27/07/2011 23:40:00', '27/07/2011 23:41:00', '27/07/2011 23:42:00', '27/07/2011 23:43:00', '27/07/2011 23:44:00', '27/07/2011 23:45:00', '27/07/2011 23:46:00', '27/07/2011 23:47:00', '27/07/2011 23:48:00', '27/07/2011 23:49:00', '27/07/2011 23:50:00', '27/07/2011 23:51:00', '27/07/2011 
23:52:00', '27/07/2011 23:53:00', '27/07/2011 23:54:00', '27/07/2011 23:55:00', '27/07/2011 23:56:00', '27/07/2011 23:57:00', '27/07/2011 23:58:00', '27/07/2011 23:59:00', '28/07/2011 0:00:00', '28/07/2011 0:01:00', '28/07/2011 0:02:00', '28/07/2011 0:03:00', '28/07/2011 0:04:00', '28/07/2011 0:05:00', '28/07/2011 0:06:00', '28/07/2011 0:07:00', '28/07/2011 0:08:00', '28/07/2011 0:09:00', '28/07/2011 0:10:00', '28/07/2011 0:11:00', '28/07/2011 0:12:00', '28/07/2011 0:13:00', '28/07/2011 0:14:00', '28/07/2011 0:15:00', '28/07/2011 0:16:00', '28/07/2011 0:17:00', '28/07/2011 0:18:00', '28/07/2011 0:19:00', '28/07/2011 0:20:00', '28/07/2011 0:21:00', '28/07/2011 0:22:00', '28/07/2011 0:23:00', '28/07/2011 0:24:00', '28/07/2011 0:25:00', '28/07/2011 0:26:00', '28/07/2011 0:27:00', '28/07/2011 0:28:00', '28/07/2011 0:29:00', '28/07/2011 0:30:00', '28/07/2011 0:31:00', '28/07/2011 0:32:00', '28/07/2011 0:33:00', '28/07/2011 0:34:00', '28/07/2011 0:35:00', '28/07/2011 0:36:00', '28/07/2011 0:37:00', '28/07/2011 0:38:00', '28/07/2011 0:39:00', '28/07/2011 0:40:00', '28/07/2011 0:41:00', '28/07/2011 0:42:00', '28/07/2011 0:43:00', '28/07/2011 0:44:00', '28/07/2011 0:45:00', '28/07/2011 0:46:00', '28/07/2011 0:47:00', '28/07/2011 0:48:00', '28/07/2011 0:49:00', '28/07/2011 0:50:00', '28/07/2011 0:51:00', '28/07/2011 0:52:00', '28/07/2011 0:53:00', '28/07/2011 0:54:00', '28/07/2011 0:55:00', '28/07/2011 0:56:00', '28/07/2011 0:57:00', '28/07/2011 0:58:00', '28/07/2011 0:59:00', '28/07/2011 1:00:00', '28/07/2011 1:01:00', '28/07/2011 1:02:00', '28/07/2011 1:03:00', '28/07/2011 1:04:00', '28/07/2011 1:05:00', '28/07/2011 1:06:00', '28/07/2011 1:07:00', '28/07/2011 1:08:00', '28/07/2011 1:09:00', '28/07/2011 1:10:00', '28/07/2011 1:11:00', '28/07/2011 1:12:00', '28/07/2011 1:13:00', '28/07/2011 1:14:00', '28/07/2011 1:15:00', '28/07/2011 1:16:00', '28/07/2011 1:17:00', '28/07/2011 1:18:00', '28/07/2011 1:19:00', '28/07/2011 1:20:00', '28/07/2011 1:21:00', '28/07/2011 1:22:00', 
'28/07/2011 1:23:00', '28/07/2011 1:24:00', '28/07/2011 1:25:00', '28/07/2011 1:26:00', '28/07/2011 1:27:00', '28/07/2011 1:28:00', '28/07/2011 1:29:00', '28/07/2011 1:30:00', '28/07/2011 1:31:00', '28/07/2011 1:32:00', '28/07/2011 1:33:00', '28/07/2011 1:34:00', '28/07/2011 1:35:00', '28/07/2011 1:36:00', '28/07/2011 1:37:00', '28/07/2011 1:38:00', '28/07/2011 1:39:00', '28/07/2011 1:40:00', '28/07/2011 1:41:00', '28/07/2011 1:42:00', '28/07/2011 1:43:00', '28/07/2011 1:44:00', '28/07/2011 1:45:00', '28/07/2011 1:46:00', '28/07/2011 1:47:00', '28/07/2011 1:48:00', '28/07/2011 1:49:00', '28/07/2011 1:50:00', '28/07/2011 1:51:00', '28/07/2011 1:52:00', '28/07/2011 1:53:00', '28/07/2011 1:54:00', '28/07/2011 1:55:00', '28/07/2011 1:56:00', '28/07/2011 1:57:00', '28/07/2011 1:58:00', '28/07/2011 1:59:00', '28/07/2011 2:00:00', '28/07/2011 2:01:00', '28/07/2011 2:02:00', '28/07/2011 2:03:00', '28/07/2011 2:04:00', '28/07/2011 2:05:00', '28/07/2011 2:06:00', '28/07/2011 2:07:00', '28/07/2011 2:08:00', '28/07/2011 2:09:00', '28/07/2011 2:10:00', '28/07/2011 2:11:00', '28/07/2011 2:12:00', '28/07/2011 2:13:00', '28/07/2011 2:14:00', '28/07/2011 2:15:00', '28/07/2011 2:16:00', '28/07/2011 2:17:00', '28/07/2011 2:18:00', '28/07/2011 2:19:00', '28/07/2011 2:20:00', '28/07/2011 2:21:00', '28/07/2011 2:22:00', '28/07/2011 2:23:00', '28/07/2011 2:24:00', '28/07/2011 2:25:00', '28/07/2011 2:26:00', '28/07/2011 2:27:00', '28/07/2011 2:28:00', '28/07/2011 2:29:00', '28/07/2011 2:30:00', '28/07/2011 2:31:00', '28/07/2011 2:32:00', '28/07/2011 2:33:00', '28/07/2011 2:34:00', '28/07/2011 2:35:00', '28/07/2011 2:36:00', '28/07/2011 2:37:00', '28/07/2011 2:38:00', '28/07/2011 2:39:00', '28/07/2011 2:40:00', '28/07/2011 2:41:00', '28/07/2011 2:42:00', '28/07/2011 2:43:00', '28/07/2011 2:44:00', '28/07/2011 2:45:00', '28/07/2011 2:46:00', '28/07/2011 2:47:00', '28/07/2011 2:48:00', '28/07/2011 2:49:00', '28/07/2011 2:50:00', '28/07/2011 2:51:00', '28/07/2011 2:52:00', '28/07/2011 
2:53:00', '28/07/2011 2:54:00', '28/07/2011 2:55:00', '28/07/2011 2:56:00', '28/07/2011 2:57:00', '28/07/2011 2:58:00', '28/07/2011 2:59:00', '28/07/2011 3:00:00', '28/07/2011 3:01:00', '28/07/2011 3:02:00', '28/07/2011 3:03:00', '28/07/2011 3:04:00', '28/07/2011 3:05:00', '28/07/2011 3:06:00', '28/07/2011 3:07:00', '28/07/2011 3:08:00', '28/07/2011 3:09:00', '28/07/2011 3:10:00', '28/07/2011 3:11:00', '28/07/2011 3:12:00', '28/07/2011 3:13:00', '28/07/2011 3:14:00', '28/07/2011 3:15:00', '28/07/2011 3:16:00', '28/07/2011 3:17:00', '28/07/2011 3:18:00', '28/07/2011 3:19:00', '28/07/2011 3:20:00', '28/07/2011 3:21:00', '28/07/2011 3:22:00', '28/07/2011 3:23:00', '28/07/2011 3:24:00', '28/07/2011 3:25:00', '28/07/2011 3:26:00', '28/07/2011 3:27:00', '28/07/2011 3:28:00', '28/07/2011 3:29:00', '28/07/2011 3:30:00', '28/07/2011 3:31:00', '28/07/2011 3:32:00', '28/07/2011 3:33:00', '28/07/2011 3:34:00', '28/07/2011 3:35:00', '28/07/2011 3:36:00', '28/07/2011 3:37:00', '28/07/2011 3:38:00', '28/07/2011 3:39:00', '28/07/2011 3:40:00', '28/07/2011 3:41:00', '28/07/2011 3:42:00', '28/07/2011 3:43:00', '28/07/2011 3:44:00', '28/07/2011 3:45:00', '28/07/2011 3:46:00', '28/07/2011 3:47:00', '28/07/2011 3:48:00', '28/07/2011 3:49:00', '28/07/2011 3:50:00', '28/07/2011 3:51:00', '28/07/2011 3:52:00', '28/07/2011 3:53:00', '28/07/2011 3:54:00', '28/07/2011 3:55:00', '28/07/2011 3:56:00', '28/07/2011 3:57:00', '28/07/2011 3:58:00', '28/07/2011 3:59:00', '28/07/2011 4:00:00', '28/07/2011 4:01:00', '28/07/2011 4:02:00', '28/07/2011 4:03:00', '28/07/2011 4:04:00', '28/07/2011 4:05:00', '28/07/2011 4:06:00', '28/07/2011 4:07:00', '28/07/2011 4:08:00', '28/07/2011 4:09:00', '28/07/2011 4:10:00', '28/07/2011 4:11:00', '28/07/2011 4:12:00', '28/07/2011 4:13:00', '28/07/2011 4:14:00', '28/07/2011 4:15:00', '28/07/2011 4:16:00', '28/07/2011 4:17:00', '28/07/2011 4:18:00', '28/07/2011 4:19:00', '28/07/2011 4:20:00', '28/07/2011 4:21:00', '28/07/2011 4:22:00', '28/07/2011 4:23:00', 
'28/07/2011 4:24:00', '28/07/2011 4:25:00', '28/07/2011 4:26:00', '28/07/2011 4:27:00', '28/07/2011 4:28:00', '28/07/2011 4:29:00', '28/07/2011 4:30:00', '28/07/2011 4:31:00', '28/07/2011 4:32:00', '28/07/2011 4:33:00', '28/07/2011 4:34:00', '28/07/2011 4:35:00', '28/07/2011 4:36:00', '28/07/2011 4:37:00', '28/07/2011 4:38:00', '28/07/2011 4:39:00', '28/07/2011 4:40:00', '28/07/2011 4:41:00', '28/07/2011 4:42:00', '28/07/2011 4:43:00', '28/07/2011 4:44:00', '28/07/2011 4:45:00', '28/07/2011 4:46:00', '28/07/2011 4:47:00', '28/07/2011 4:48:00', '28/07/2011 4:49:00', '28/07/2011 4:50:00', '28/07/2011 4:51:00', '28/07/2011 4:52:00', '28/07/2011 4:53:00', '28/07/2011 4:54:00', '28/07/2011 4:55:00', '28/07/2011 4:56:00', '28/07/2011 4:57:00', '28/07/2011 4:58:00', '28/07/2011 4:59:00', '28/07/2011 5:00:00', '28/07/2011 5:01:00', '28/07/2011 5:02:00', '28/07/2011 5:03:00', '28/07/2011 5:04:00', '28/07/2011 5:05:00', '28/07/2011 5:06:00', '28/07/2011 5:07:00', '28/07/2011 5:08:00', '28/07/2011 5:09:00', '28/07/2011 5:10:00', '28/07/2011 5:11:00', '28/07/2011 5:12:00', '28/07/2011 5:13:00', '28/07/2011 5:14:00', '28/07/2011 5:15:00', '28/07/2011 5:16:00', '28/07/2011 5:17:00', '28/07/2011 5:18:00', '28/07/2011 5:19:00', '28/07/2011 5:20:00', '28/07/2011 5:21:00', '28/07/2011 5:22:00', '28/07/2011 5:23:00', '28/07/2011 5:24:00', '28/07/2011 5:25:00', '28/07/2011 5:26:00', '28/07/2011 5:27:00', '28/07/2011 5:28:00', '28/07/2011 5:29:00', '28/07/2011 5:30:00', '28/07/2011 5:31:00', '28/07/2011 5:32:00', '28/07/2011 5:33:00', '28/07/2011 5:34:00', '28/07/2011 5:35:00', '28/07/2011 5:36:00', '28/07/2011 5:37:00', '28/07/2011 5:38:00', '28/07/2011 5:39:00', '28/07/2011 5:40:00', '28/07/2011 5:41:00', '28/07/2011 5:42:00', '28/07/2011 5:43:00', '28/07/2011 5:44:00', '28/07/2011 5:45:00', '28/07/2011 5:46:00', '28/07/2011 5:47:00', '28/07/2011 5:48:00', '28/07/2011 5:49:00', '28/07/2011 5:50:00', '28/07/2011 5:51:00', '28/07/2011 5:52:00', '28/07/2011 5:53:00', '28/07/2011 
5:54:00', '28/07/2011 5:55:00', '28/07/2011 5:56:00', '28/07/2011 5:57:00', '28/07/2011 5:58:00', '28/07/2011 5:59:00', '28/07/2011 6:00:00', '28/07/2011 6:01:00', '28/07/2011 6:02:00', '28/07/2011 6:03:00', '28/07/2011 6:04:00', '28/07/2011 6:05:00', '28/07/2011 6:06:00', '28/07/2011 6:07:00', '28/07/2011 6:08:00', '28/07/2011 6:09:00', '28/07/2011 6:10:00', '28/07/2011 6:11:00', '28/07/2011 6:12:00', '28/07/2011 6:13:00', '28/07/2011 6:14:00', '28/07/2011 6:15:00', '28/07/2011 6:16:00', '28/07/2011 6:17:00', '28/07/2011 6:18:00', '28/07/2011 6:19:00', '28/07/2011 6:20:00', '28/07/2011 6:21:00', '28/07/2011 6:22:00', '28/07/2011 6:23:00', '28/07/2011 6:24:00', '28/07/2011 6:25:00', '28/07/2011 6:26:00', '28/07/2011 6:27:00', '28/07/2011 6:28:00', '28/07/2011 6:29:00', '28/07/2011 6:30:00', '28/07/2011 6:31:00', '28/07/2011 6:32:00', '28/07/2011 6:33:00', '28/07/2011 6:34:00', '28/07/2011 6:35:00', '28/07/2011 6:36:00', '28/07/2011 6:37:00', '28/07/2011 6:38:00', '28/07/2011 6:39:00', '28/07/2011 6:40:00', '28/07/2011 6:41:00', '28/07/2011 6:42:00', '28/07/2011 6:43:00', '28/07/2011 6:44:00', '28/07/2011 6:45:00', '28/07/2011 6:46:00', '28/07/2011 6:47:00', '28/07/2011 6:48:00', '28/07/2011 6:49:00', '28/07/2011 6:50:00', '28/07/2011 6:51:00', '28/07/2011 6:52:00', '28/07/2011 6:53:00', '28/07/2011 6:54:00', '28/07/2011 6:55:00', '28/07/2011 6:56:00', '28/07/2011 6:57:00', '28/07/2011 6:58:00', '28/07/2011 6:59:00', '28/07/2011 7:00:00', '28/07/2011 7:01:00', '28/07/2011 7:02:00', '28/07/2011 7:03:00', '28/07/2011 7:04:00', '28/07/2011 7:05:00', '28/07/2011 7:06:00', '28/07/2011 7:07:00', '28/07/2011 7:08:00', '28/07/2011 7:09:00', '28/07/2011 7:10:00', '28/07/2011 7:11:00', '28/07/2011 7:12:00', '28/07/2011 7:13:00', '28/07/2011 7:14:00', '28/07/2011 7:15:00', '28/07/2011 7:16:00', '28/07/2011 7:17:00', '28/07/2011 7:18:00', '28/07/2011 7:19:00', '28/07/2011 7:20:00', '28/07/2011 7:21:00', '28/07/2011 7:22:00', '28/07/2011 7:23:00', '28/07/2011 7:24:00', 
'28/07/2011 7:25:00', '28/07/2011 7:26:00', '28/07/2011 7:27:00', '28/07/2011 7:28:00', '28/07/2011 7:29:00', '28/07/2011 7:30:00', '28/07/2011 7:31:00', '28/07/2011 7:32:00', '28/07/2011 7:33:00', '28/07/2011 7:34:00', '28/07/2011 7:35:00', '28/07/2011 7:36:00', '28/07/2011 7:37:00', '28/07/2011 7:38:00', '28/07/2011 7:39:00', '28/07/2011 7:40:00', '28/07/2011 7:41:00', '28/07/2011 7:42:00', '28/07/2011 7:43:00', '28/07/2011 7:44:00', '28/07/2011 7:45:00', '28/07/2011 7:46:00', '28/07/2011 7:47:00', '28/07/2011 7:48:00', '28/07/2011 7:49:00', '28/07/2011 7:50:00', '28/07/2011 7:51:00', '28/07/2011 7:52:00', '28/07/2011 7:53:00', '28/07/2011 7:54:00', '28/07/2011 7:55:00', '28/07/2011 7:56:00', '28/07/2011 7:57:00', '28/07/2011 7:58:00', '28/07/2011 7:59:00', '28/07/2011 8:00:00', '28/07/2011 8:01:00', '28/07/2011 8:02:00', '28/07/2011 8:03:00', '28/07/2011 8:04:00', '28/07/2011 8:05:00', '28/07/2011 8:06:00', '28/07/2011 8:07:00', '28/07/2011 8:08:00', '28/07/2011 8:09:00', '28/07/2011 8:10:00', '28/07/2011 8:11:00', '28/07/2011 8:12:00', '28/07/2011 8:13:00', '28/07/2011 8:14:00', '28/07/2011 8:15:00', '28/07/2011 8:16:00', '28/07/2011 8:17:00', '28/07/2011 8:18:00', '28/07/2011 8:19:00', '28/07/2011 8:20:00', '28/07/2011 8:21:00', '28/07/2011 8:22:00', '28/07/2011 8:23:00', '28/07/2011 8:24:00', '28/07/2011 8:25:00', '28/07/2011 8:26:00', '28/07/2011 8:27:00', '28/07/2011 8:28:00', '28/07/2011 8:29:00', '28/07/2011 8:30:00', '28/07/2011 8:31:00', '28/07/2011 8:32:00', '28/07/2011 8:33:00', '28/07/2011 8:34:00', '28/07/2011 8:35:00', '28/07/2011 8:36:00', '28/07/2011 8:37:00', '28/07/2011 8:38:00', '28/07/2011 8:39:00', '28/07/2011 8:40:00', '28/07/2011 8:41:00', '28/07/2011 8:42:00', '28/07/2011 8:43:00', '28/07/2011 8:44:00', '28/07/2011 8:45:00', '28/07/2011 8:46:00', '28/07/2011 8:47:00', '28/07/2011 8:48:00', '28/07/2011 8:49:00', '28/07/2011 8:50:00', '28/07/2011 8:51:00', '28/07/2011 8:52:00', '28/07/2011 8:53:00', '28/07/2011 8:54:00', '28/07/2011 
8:55:00', '28/07/2011 8:56:00', '28/07/2011 8:57:00', '28/07/2011 8:58:00', '28/07/2011 8:59:00', '28/07/2011 9:00:00', '28/07/2011 9:01:00', '28/07/2011 9:02:00', '28/07/2011 9:03:00', '28/07/2011 9:04:00', '28/07/2011 9:05:00', '28/07/2011 9:06:00', '28/07/2011 9:07:00', '28/07/2011 9:08:00', '28/07/2011 9:09:00', '28/07/2011 9:10:00', '28/07/2011 9:11:00', '28/07/2011 9:12:00', '28/07/2011 9:13:00', '28/07/2011 9:14:00', '28/07/2011 9:15:00', '28/07/2011 9:16:00', '28/07/2011 9:17:00', '28/07/2011 9:18:00', '28/07/2011 9:19:00', '28/07/2011 9:20:00', '28/07/2011 9:21:00', '28/07/2011 9:22:00', '28/07/2011 9:23:00', '28/07/2011 9:24:00', '28/07/2011 9:25:00', '28/07/2011 9:26:00', '28/07/2011 9:27:00', '28/07/2011 9:28:00', '28/07/2011 9:29:00', '28/07/2011 9:30:00', '28/07/2011 9:31:00', '28/07/2011 9:32:00', '28/07/2011 9:33:00', '28/07/2011 9:34:00', '28/07/2011 9:35:00', '28/07/2011 9:36:00', '28/07/2011 9:37:00', '28/07/2011 9:38:00', '28/07/2011 9:39:00', '28/07/2011 9:40:00', '28/07/2011 9:41:00', '28/07/2011 9:42:00', '28/07/2011 9:43:00', '28/07/2011 9:44:00', '28/07/2011 9:45:00', '28/07/2011 9:46:00', '28/07/2011 9:47:00', '28/07/2011 9:48:00', '28/07/2011 9:49:00', '28/07/2011 9:50:00', '28/07/2011 9:51:00', '28/07/2011 9:52:00', '28/07/2011 9:53:00', '28/07/2011 9:54:00', '28/07/2011 9:55:00', '28/07/2011 9:56:00', '28/07/2011 9:57:00', '28/07/2011 9:58:00', '28/07/2011 9:59:00', '28/07/2011 10:00:00', '28/07/2011 10:01:00', '28/07/2011 10:02:00', '28/07/2011 10:03:00', '28/07/2011 10:04:00', '28/07/2011 10:05:00', '28/07/2011 10:06:00', '28/07/2011 10:07:00', '28/07/2011 10:08:00', '28/07/2011 10:09:00', '28/07/2011 10:10:00', '28/07/2011 10:11:00', '28/07/2011 10:12:00', '28/07/2011 10:13:00', '28/07/2011 10:14:00', '28/07/2011 10:15:00', '28/07/2011 10:16:00', '28/07/2011 10:17:00', '28/07/2011 10:18:00', '28/07/2011 10:19:00', '28/07/2011 10:20:00', '28/07/2011 10:21:00', '28/07/2011 10:22:00', '28/07/2011 10:23:00', '28/07/2011 10:24:00', 
'28/07/2011 10:25:00', '28/07/2011 10:26:00', '28/07/2011 10:27:00', '28/07/2011 10:28:00', '28/07/2011 10:29:00', '28/07/2011 10:30:00', '28/07/2011 10:31:00', '28/07/2011 10:32:00', '28/07/2011 10:33:00', '28/07/2011 10:34:00', '28/07/2011 10:35:00', '28/07/2011 10:36:00', '28/07/2011 10:37:00', '28/07/2011 10:38:00', '28/07/2011 10:39:00', '28/07/2011 10:40:00', '28/07/2011 10:41:00', '28/07/2011 10:42:00', '28/07/2011 10:43:00', '28/07/2011 10:44:00', '28/07/2011 10:45:00', '28/07/2011 10:46:00', '28/07/2011 10:47:00', '28/07/2011 10:48:00', '28/07/2011 10:49:00', '28/07/2011 10:50:00', '28/07/2011 10:51:00', '28/07/2011 10:52:00', '28/07/2011 10:53:00', '28/07/2011 10:54:00', '28/07/2011 10:55:00', '28/07/2011 10:56:00', '28/07/2011 10:57:00', '28/07/2011 10:58:00', '28/07/2011 10:59:00', '28/07/2011 11:00:00', '28/07/2011 11:01:00', '28/07/2011 11:02:00', '28/07/2011 11:03:00', '28/07/2011 11:04:00', '28/07/2011 11:05:00', '28/07/2011 11:06:00', '28/07/2011 11:07:00', '28/07/2011 11:08:00', '28/07/2011 11:09:00', '28/07/2011 11:10:00', '28/07/2011 11:11:00', '28/07/2011 11:12:00', '28/07/2011 11:13:00', '28/07/2011 11:14:00', '28/07/2011 11:15:00', '28/07/2011 11:16:00', '28/07/2011 11:17:00', '28/07/2011 11:18:00', '28/07/2011 11:19:00', '28/07/2011 11:20:00', '28/07/2011 11:21:00', '28/07/2011 11:22:00', '28/07/2011 11:23:00', '28/07/2011 11:24:00', '28/07/2011 11:25:00', '28/07/2011 11:26:00', '28/07/2011 11:27:00', '28/07/2011 11:28:00', '28/07/2011 11:29:00', '28/07/2011 11:30:00', '28/07/2011 11:31:00', '28/07/2011 11:32:00', '28/07/2011 11:33:00', '28/07/2011 11:34:00', '28/07/2011 11:35:00', '28/07/2011 11:36:00', '28/07/2011 11:37:00', '28/07/2011 11:38:00', '28/07/2011 11:39:00', '28/07/2011 11:40:00', '28/07/2011 11:41:00', '28/07/2011 11:42:00', '28/07/2011 11:43:00', '28/07/2011 11:44:00', '28/07/2011 11:45:00', '28/07/2011 11:46:00', '28/07/2011 11:47:00', '28/07/2011 11:48:00', '28/07/2011 11:49:00', '28/07/2011 11:50:00', '28/07/2011 
11:51:00', '28/07/2011 11:52:00', '28/07/2011 11:53:00', '28/07/2011 11:54:00', '28/07/2011 11:55:00', '28/07/2011 11:56:00', '28/07/2011 11:57:00', '28/07/2011 11:58:00', '28/07/2011 11:59:00', '28/07/2011 12:00:00', '28/07/2011 12:01:00', '28/07/2011 12:02:00', '28/07/2011 12:03:00', '28/07/2011 12:04:00', '28/07/2011 12:05:00', '28/07/2011 12:06:00', '28/07/2011 12:07:00', '28/07/2011 12:08:00', '28/07/2011 12:09:00', '28/07/2011 12:10:00', '28/07/2011 12:11:00', '28/07/2011 12:12:00', '28/07/2011 12:13:00', '28/07/2011 12:14:00', '28/07/2011 12:15:00', '28/07/2011 12:16:00', '28/07/2011 12:17:00', '28/07/2011 12:18:00', '28/07/2011 12:19:00', '28/07/2011 12:20:00', '28/07/2011 12:21:00', '28/07/2011 12:22:00', '28/07/2011 12:23:00', '28/07/2011 12:24:00', '28/07/2011 12:25:00', '28/07/2011 12:26:00', '28/07/2011 12:27:00', '28/07/2011 12:28:00', '28/07/2011 12:29:00', '28/07/2011 12:30:00', '28/07/2011 12:31:00', '28/07/2011 12:32:00', '28/07/2011 12:33:00', '28/07/2011 12:34:00', '28/07/2011 12:35:00', '28/07/2011 12:36:00', '28/07/2011 12:37:00', '28/07/2011 12:38:00', '28/07/2011 12:39:00', '28/07/2011 12:40:00', '28/07/2011 12:41:00', '28/07/2011 12:42:00', '28/07/2011 12:43:00', '28/07/2011 12:44:00', '28/07/2011 12:45:00', '28/07/2011 12:46:00', '28/07/2011 12:47:00', '28/07/2011 12:48:00', '28/07/2011 12:49:00', '28/07/2011 12:50:00', '28/07/2011 12:51:00', '28/07/2011 12:52:00', '28/07/2011 12:53:00', '28/07/2011 12:54:00', '28/07/2011 12:55:00', '28/07/2011 12:56:00', '28/07/2011 12:57:00', '28/07/2011 12:58:00', '28/07/2011 12:59:00', '28/07/2011 13:00:00', '28/07/2011 13:01:00', '28/07/2011 13:02:00', '28/07/2011 13:03:00', '28/07/2011 13:04:00', '28/07/2011 13:05:00', '28/07/2011 13:06:00', '28/07/2011 13:07:00', '28/07/2011 13:08:00', '28/07/2011 13:09:00', '28/07/2011 13:10:00', '28/07/2011 13:11:00', '28/07/2011 13:12:00', '28/07/2011 13:13:00', '28/07/2011 13:14:00', '28/07/2011 13:15:00', '28/07/2011 13:16:00', '28/07/2011 13:17:00', 
'28/07/2011 13:18:00', '28/07/2011 13:19:00', '28/07/2011 13:20:00', '28/07/2011 13:21:00', '28/07/2011 13:22:00', '28/07/2011 13:23:00', '28/07/2011 13:24:00', '28/07/2011 13:25:00', '28/07/2011 13:26:00', '28/07/2011 13:27:00', '28/07/2011 13:28:00', '28/07/2011 13:29:00', '28/07/2011 13:30:00', '28/07/2011 13:31:00', '28/07/2011 13:32:00', '28/07/2011 13:33:00', '28/07/2011 13:34:00', '28/07/2011 13:35:00', '28/07/2011 13:36:00', '28/07/2011 13:37:00', '28/07/2011 13:38:00', '28/07/2011 13:39:00', '28/07/2011 13:40:00', '28/07/2011 13:41:00', '28/07/2011 13:42:00', '28/07/2011 13:43:00', '28/07/2011 13:44:00', '28/07/2011 13:45:00', '28/07/2011 13:46:00', '28/07/2011 13:47:00', '28/07/2011 13:48:00', '28/07/2011 13:49:00', '28/07/2011 13:50:00', '28/07/2011 13:51:00', '28/07/2011 13:52:00', '28/07/2011 13:53:00', '28/07/2011 13:54:00', '28/07/2011 13:55:00', '28/07/2011 13:56:00', '28/07/2011 13:57:00', '28/07/2011 13:58:00', '28/07/2011 13:59:00', '28/07/2011 14:00:00', '28/07/2011 14:01:00', '28/07/2011 14:02:00', '28/07/2011 14:03:00', '28/07/2011 14:04:00', '28/07/2011 14:05:00', '28/07/2011 14:06:00', '28/07/2011 14:07:00', '28/07/2011 14:08:00', '28/07/2011 14:09:00', '28/07/2011 14:10:00', '28/07/2011 14:11:00', '28/07/2011 14:12:00', '28/07/2011 14:13:00', '28/07/2011 14:14:00', '28/07/2011 14:15:00', '28/07/2011 14:16:00', '28/07/2011 14:17:00', '28/07/2011 14:18:00', '28/07/2011 14:19:00', '28/07/2011 14:20:00', '28/07/2011 14:21:00', '28/07/2011 14:22:00', '28/07/2011 14:23:00', '28/07/2011 14:24:00', '28/07/2011 14:25:00', '28/07/2011 14:26:00', '28/07/2011 14:27:00', '28/07/2011 14:28:00', '28/07/2011 14:29:00', '28/07/2011 14:30:00', '28/07/2011 14:31:00', '28/07/2011 14:32:00', '28/07/2011 14:33:00', '28/07/2011 14:34:00', '28/07/2011 14:35:00', '28/07/2011 14:36:00', '28/07/2011 14:37:00', '28/07/2011 14:38:00', '28/07/2011 14:39:00', '28/07/2011 14:40:00', '28/07/2011 14:41:00', '28/07/2011 14:42:00', '28/07/2011 14:43:00', '28/07/2011 
14:44:00', '28/07/2011 14:45:00', '28/07/2011 14:46:00', '28/07/2011 14:47:00', '28/07/2011 14:48:00', '28/07/2011 14:49:00', '28/07/2011 14:50:00', '28/07/2011 14:51:00', '28/07/2011 14:52:00', '28/07/2011 14:53:00', '28/07/2011 14:54:00', '28/07/2011 14:55:00', '28/07/2011 14:56:00', '28/07/2011 14:57:00', '28/07/2011 14:58:00', '28/07/2011 14:59:00', '28/07/2011 15:00:00', '28/07/2011 15:01:00', '28/07/2011 15:02:00', '28/07/2011 15:03:00', '28/07/2011 15:04:00', '28/07/2011 15:05:00', '28/07/2011 15:06:00', '28/07/2011 15:07:00', '28/07/2011 15:08:00', '28/07/2011 15:09:00', '28/07/2011 15:10:00', '28/07/2011 15:11:00', '28/07/2011 15:12:00', '28/07/2011 15:13:00', '28/07/2011 15:14:00', '28/07/2011 15:15:00', '28/07/2011 15:16:00', '28/07/2011 15:17:00', '28/07/2011 15:18:00', '28/07/2011 15:19:00', '28/07/2011 15:20:00', '28/07/2011 15:21:00', '28/07/2011 15:22:00', '28/07/2011 15:23:00', '28/07/2011 15:24:00', '28/07/2011 15:25:00', '28/07/2011 15:26:00', '28/07/2011 15:27:00', '28/07/2011 15:28:00', '28/07/2011 15:29:00', '28/07/2011 15:30:00', '28/07/2011 15:31:00', '28/07/2011 15:32:00', '28/07/2011 15:33:00', '28/07/2011 15:34:00', '28/07/2011 15:35:00', '28/07/2011 15:36:00', '28/07/2011 15:37:00', '28/07/2011 15:38:00', '28/07/2011 15:39:00', '28/07/2011 15:40:00', '28/07/2011 15:41:00', '28/07/2011 15:42:00', '28/07/2011 15:43:00', '28/07/2011 15:44:00', '28/07/2011 15:45:00', '28/07/2011 15:46:00', '28/07/2011 15:47:00', '28/07/2011 15:48:00', '28/07/2011 15:49:00', '28/07/2011 15:50:00', '28/07/2011 15:51:00', '28/07/2011 15:52:00', '28/07/2011 15:53:00', '28/07/2011 15:54:00', '28/07/2011 15:55:00', '28/07/2011 15:56:00', '28/07/2011 15:57:00', '28/07/2011 15:58:00', '28/07/2011 15:59:00', '28/07/2011 16:00:00', '28/07/2011 16:01:00', '28/07/2011 16:02:00', '28/07/2011 16:03:00', '28/07/2011 16:04:00', '28/07/2011 16:05:00', '28/07/2011 16:06:00', '28/07/2011 16:07:00', '28/07/2011 16:08:00', '28/07/2011 16:09:00', '28/07/2011 16:10:00', 
'28/07/2011 16:11:00', '28/07/2011 16:12:00', '28/07/2011 16:13:00', '28/07/2011 16:14:00', '28/07/2011 16:15:00', '28/07/2011 16:16:00', '28/07/2011 16:17:00', '28/07/2011 16:18:00', '28/07/2011 16:19:00', '28/07/2011 16:20:00', '28/07/2011 16:21:00', '28/07/2011 16:22:00', '28/07/2011 16:23:00', '28/07/2011 16:24:00', '28/07/2011 16:25:00', '28/07/2011 16:26:00', '28/07/2011 16:27:00', '28/07/2011 16:28:00', '28/07/2011 16:29:00', '28/07/2011 16:30:00', '28/07/2011 16:31:00', '28/07/2011 16:32:00', '28/07/2011 16:33:00', '28/07/2011 16:34:00', '28/07/2011 16:35:00', '28/07/2011 16:36:00', '28/07/2011 16:37:00', '28/07/2011 16:38:00', '28/07/2011 16:39:00', '28/07/2011 16:40:00', '28/07/2011 16:41:00', '28/07/2011 16:42:00', '28/07/2011 16:43:00', '28/07/2011 16:44:00', '28/07/2011 16:45:00', '28/07/2011 16:46:00', '28/07/2011 16:47:00', '28/07/2011 16:48:00', '28/07/2011 16:49:00', '28/07/2011 16:50:00', '28/07/2011 16:51:00', '28/07/2011 16:52:00', '28/07/2011 16:53:00', '28/07/2011 16:54:00', '28/07/2011 16:55:00', '28/07/2011 16:56:00', '28/07/2011 16:57:00', '28/07/2011 16:58:00', '28/07/2011 16:59:00', '28/07/2011 17:00:00', '28/07/2011 17:01:00', '28/07/2011 17:02:00', '28/07/2011 17:03:00', '28/07/2011 17:04:00', '28/07/2011 17:05:00', '28/07/2011 17:06:00', '28/07/2011 17:07:00', '28/07/2011 17:08:00', '28/07/2011 17:09:00', '28/07/2011 17:10:00', '28/07/2011 17:11:00', '28/07/2011 17:12:00', '28/07/2011 17:13:00', '28/07/2011 17:14:00', '28/07/2011 17:15:00', '28/07/2011 17:16:00', '28/07/2011 17:17:00', '28/07/2011 17:18:00', '28/07/2011 17:19:00', '28/07/2011 17:20:00', '28/07/2011 17:21:00', '28/07/2011 17:22:00', '28/07/2011 17:23:00', '28/07/2011 17:24:00', '28/07/2011 17:25:00', '28/07/2011 17:26:00', '28/07/2011 17:27:00', '28/07/2011 17:28:00', '28/07/2011 17:29:00', '28/07/2011 17:30:00', '28/07/2011 17:31:00', '28/07/2011 17:32:00', '28/07/2011 17:33:00', '28/07/2011 17:34:00', '28/07/2011 17:35:00', '28/07/2011 17:36:00', '28/07/2011 
17:37:00', '28/07/2011 17:38:00', '28/07/2011 17:39:00', '28/07/2011 17:40:00', '28/07/2011 17:41:00', '28/07/2011 17:42:00', '28/07/2011 17:43:00', '28/07/2011 17:44:00', '28/07/2011 17:45:00', '28/07/2011 17:46:00', '28/07/2011 17:47:00', '28/07/2011 17:48:00', '28/07/2011 17:49:00', '28/07/2011 17:50:00', '28/07/2011 17:51:00', '28/07/2011 17:52:00', '28/07/2011 17:53:00', '28/07/2011 17:54:00', '28/07/2011 17:55:00', '28/07/2011 17:56:00', '28/07/2011 17:57:00', '28/07/2011 17:58:00', '28/07/2011 17:59:00', '28/07/2011 18:00:00', '28/07/2011 18:01:00', '28/07/2011 18:02:00', '28/07/2011 18:03:00', '28/07/2011 18:04:00', '28/07/2011 18:05:00', '28/07/2011 18:06:00', '28/07/2011 18:07:00', '28/07/2011 18:08:00', '28/07/2011 18:09:00', '28/07/2011 18:10:00', '28/07/2011 18:11:00', '28/07/2011 18:12:00', '28/07/2011 18:13:00', '28/07/2011 18:14:00', '28/07/2011 18:15:00', '28/07/2011 18:16:00', '28/07/2011 18:17:00', '28/07/2011 18:18:00', '28/07/2011 18:19:00', '28/07/2011 18:20:00', '28/07/2011 18:21:00', '28/07/2011 18:22:00', '28/07/2011 18:23:00', '28/07/2011 18:24:00', '28/07/2011 18:25:00', '28/07/2011 18:26:00', '28/07/2011 18:27:00', '28/07/2011 18:28:00', '28/07/2011 18:29:00', '28/07/2011 18:30:00', '28/07/2011 18:31:00', '28/07/2011 18:32:00', '28/07/2011 18:33:00', '28/07/2011 18:34:00', '28/07/2011 18:35:00', '28/07/2011 18:36:00', '28/07/2011 18:37:00', '28/07/2011 18:38:00', '28/07/2011 18:39:00', '28/07/2011 18:40:00', '28/07/2011 18:41:00', '28/07/2011 18:42:00', '28/07/2011 18:43:00', '28/07/2011 18:44:00', '28/07/2011 18:45:00', '28/07/2011 18:46:00', '28/07/2011 18:47:00', '28/07/2011 18:48:00', '28/07/2011 18:49:00', '28/07/2011 18:50:00', '28/07/2011 18:51:00', '28/07/2011 18:52:00', '28/07/2011 18:53:00', '28/07/2011 18:54:00', '28/07/2011 18:55:00', '28/07/2011 18:56:00', '28/07/2011 18:57:00', '28/07/2011 18:58:00', '28/07/2011 18:59:00', '28/07/2011 19:00:00', '28/07/2011 19:01:00', '28/07/2011 19:02:00', '28/07/2011 19:03:00', 
'28/07/2011 19:04:00', '28/07/2011 19:05:00', '28/07/2011 19:06:00', '28/07/2011 19:07:00', '28/07/2011 19:08:00', '28/07/2011 19:09:00', '28/07/2011 19:10:00', '28/07/2011 19:11:00', '28/07/2011 19:12:00', '28/07/2011 19:13:00', '28/07/2011 19:14:00', '28/07/2011 19:15:00', '28/07/2011 19:16:00', '28/07/2011 19:17:00', '28/07/2011 19:18:00', '28/07/2011 19:19:00', '28/07/2011 19:20:00', '28/07/2011 19:21:00', '28/07/2011 19:22:00', '28/07/2011 19:23:00', '28/07/2011 19:24:00', '28/07/2011 19:25:00', '28/07/2011 19:26:00', '28/07/2011 19:27:00', '28/07/2011 19:28:00', '28/07/2011 19:29:00', '28/07/2011 19:30:00', '28/07/2011 19:31:00', '28/07/2011 19:32:00', '28/07/2011 19:33:00', '28/07/2011 19:34:00', '28/07/2011 19:35:00', '28/07/2011 19:36:00', '28/07/2011 19:37:00', '28/07/2011 19:38:00', '28/07/2011 19:39:00', '28/07/2011 19:40:00', '28/07/2011 19:41:00', '28/07/2011 19:42:00', '28/07/2011 19:43:00', '28/07/2011 19:44:00', '28/07/2011 19:45:00', '28/07/2011 19:46:00', '28/07/2011 19:47:00', '28/07/2011 19:48:00', '28/07/2011 19:49:00', '28/07/2011 19:50:00', '28/07/2011 19:51:00', '28/07/2011 19:52:00', '28/07/2011 19:53:00', '28/07/2011 19:54:00', '28/07/2011 19:55:00', '28/07/2011 19:56:00', '28/07/2011 19:57:00', '28/07/2011 19:58:00', '28/07/2011 19:59:00', '28/07/2011 20:00:00', '28/07/2011 20:01:00', '28/07/2011 20:02:00', '28/07/2011 20:03:00', '28/07/2011 20:04:00', '28/07/2011 20:05:00', '28/07/2011 20:06:00', '28/07/2011 20:07:00', '28/07/2011 20:08:00', '28/07/2011 20:09:00', '28/07/2011 20:10:00', '28/07/2011 20:11:00', '28/07/2011 20:12:00', '28/07/2011 20:13:00', '28/07/2011 20:14:00', '28/07/2011 20:15:00', '28/07/2011 20:16:00', '28/07/2011 20:17:00', '28/07/2011 20:18:00', '28/07/2011 20:19:00', '28/07/2011 20:20:00', '28/07/2011 20:21:00', '28/07/2011 20:22:00', '28/07/2011 20:23:00', '28/07/2011 20:24:00', '28/07/2011 20:25:00', '28/07/2011 20:26:00', '28/07/2011 20:27:00', '28/07/2011 20:28:00', '28/07/2011 20:29:00', '28/07/2011 
20:30:00', '28/07/2011 20:31:00', '28/07/2011 20:32:00', '28/07/2011 20:33:00', '28/07/2011 20:34:00', '28/07/2011 20:35:00', '28/07/2011 20:36:00', '28/07/2011 20:37:00', '28/07/2011 20:38:00', '28/07/2011 20:39:00', '28/07/2011 20:40:00', '28/07/2011 20:41:00', '28/07/2011 20:42:00', '28/07/2011 20:43:00', '28/07/2011 20:44:00', '28/07/2011 20:45:00', '28/07/2011 20:46:00', '28/07/2011 20:47:00', '28/07/2011 20:48:00', '28/07/2011 20:49:00', '28/07/2011 20:50:00', '28/07/2011 20:51:00', '28/07/2011 20:52:00', '28/07/2011 20:53:00', '28/07/2011 20:54:00', '28/07/2011 20:55:00', '28/07/2011 20:56:00', '28/07/2011 20:57:00', '28/07/2011 20:58:00', '28/07/2011 20:59:00', '28/07/2011 21:00:00', '28/07/2011 21:01:00', '28/07/2011 21:02:00', '28/07/2011 21:03:00', '28/07/2011 21:04:00', '28/07/2011 21:05:00', '28/07/2011 21:06:00', '28/07/2011 21:07:00', '28/07/2011 21:08:00', '28/07/2011 21:09:00', '28/07/2011 21:10:00', '28/07/2011 21:11:00', '28/07/2011 21:12:00', '28/07/2011 21:13:00', '28/07/2011 21:14:00', '28/07/2011 21:15:00', '28/07/2011 21:16:00', '28/07/2011 21:17:00', '28/07/2011 21:18:00', '28/07/2011 21:19:00', '28/07/2011 21:20:00', '28/07/2011 21:21:00', '28/07/2011 21:22:00', '28/07/2011 21:23:00', '28/07/2011 21:24:00', '28/07/2011 21:25:00', '28/07/2011 21:26:00', '28/07/2011 21:27:00', '28/07/2011 21:28:00', '28/07/2011 21:29:00', '28/07/2011 21:30:00', '28/07/2011 21:31:00', '28/07/2011 21:32:00', '28/07/2011 21:33:00', '28/07/2011 21:34:00', '28/07/2011 21:35:00', '28/07/2011 21:36:00', '28/07/2011 21:37:00', '28/07/2011 21:38:00', '28/07/2011 21:39:00', '28/07/2011 21:40:00', '28/07/2011 21:41:00', '28/07/2011 21:42:00', '28/07/2011 21:43:00', '28/07/2011 21:44:00', '28/07/2011 21:45:00', '28/07/2011 21:46:00', '28/07/2011 21:47:00', '28/07/2011 21:48:00', '28/07/2011 21:49:00', '28/07/2011 21:50:00', '28/07/2011 21:51:00', '28/07/2011 21:52:00', '28/07/2011 21:53:00', '28/07/2011 21:54:00', '28/07/2011 21:55:00', '28/07/2011 21:56:00', 
'28/07/2011 21:57:00', '28/07/2011 21:58:00', '28/07/2011 21:59:00', '28/07/2011 22:00:00', '28/07/2011 22:01:00', '28/07/2011 22:02:00', '28/07/2011 22:03:00', '28/07/2011 22:04:00', '28/07/2011 22:05:00', '28/07/2011 22:06:00', '28/07/2011 22:07:00', '28/07/2011 22:08:00', '28/07/2011 22:09:00', '28/07/2011 22:10:00', '28/07/2011 22:11:00', '28/07/2011 22:12:00', '28/07/2011 22:13:00', '28/07/2011 22:14:00', '28/07/2011 22:15:00', '28/07/2011 22:16:00', '28/07/2011 22:17:00', '28/07/2011 22:18:00', '28/07/2011 22:19:00', '28/07/2011 22:20:00', '28/07/2011 22:21:00', '28/07/2011 22:22:00', '28/07/2011 22:23:00', '28/07/2011 22:24:00', '28/07/2011 22:25:00', '28/07/2011 22:26:00', '28/07/2011 22:27:00', '28/07/2011 22:28:00', '28/07/2011 22:29:00', '28/07/2011 22:30:00', '28/07/2011 22:31:00', '28/07/2011 22:32:00', '28/07/2011 22:33:00', '28/07/2011 22:34:00', '28/07/2011 22:35:00', '28/07/2011 22:36:00', '28/07/2011 22:37:00', '28/07/2011 22:38:00', '28/07/2011 22:39:00', '28/07/2011 22:40:00', '28/07/2011 22:41:00', '28/07/2011 22:42:00', '28/07/2011 22:43:00', '28/07/2011 22:44:00', '28/07/2011 22:45:00', '28/07/2011 22:46:00', '28/07/2011 22:47:00', '28/07/2011 22:48:00', '28/07/2011 22:49:00', '28/07/2011 22:50:00', '28/07/2011 22:51:00', '28/07/2011 22:52:00', '28/07/2011 22:53:00', '28/07/2011 22:54:00', '28/07/2011 22:55:00', '28/07/2011 22:56:00', '28/07/2011 22:57:00', '28/07/2011 22:58:00', '28/07/2011 22:59:00', '28/07/2011 23:00:00', '28/07/2011 23:01:00', '28/07/2011 23:02:00', '28/07/2011 23:03:00', '28/07/2011 23:04:00', '28/07/2011 23:05:00', '28/07/2011 23:06:00', '28/07/2011 23:07:00', '28/07/2011 23:08:00', '28/07/2011 23:09:00', '28/07/2011 23:10:00', '28/07/2011 23:11:00', '28/07/2011 23:12:00', '28/07/2011 23:13:00', '28/07/2011 23:14:00', '28/07/2011 23:15:00', '28/07/2011 23:16:00', '28/07/2011 23:17:00', '28/07/2011 23:18:00', '28/07/2011 23:19:00', '28/07/2011 23:20:00', '28/07/2011 23:21:00', '28/07/2011 23:22:00', '28/07/2011 
23:23:00', '28/07/2011 23:24:00', '28/07/2011 23:25:00', '28/07/2011 23:26:00', '28/07/2011 23:27:00', '28/07/2011 23:28:00', '28/07/2011 23:29:00', '28/07/2011 23:30:00', '28/07/2011 23:31:00', '28/07/2011 23:32:00', '28/07/2011 23:33:00', '28/07/2011 23:34:00', '28/07/2011 23:35:00', '28/07/2011 23:36:00', '28/07/2011 23:37:00', '28/07/2011 23:38:00', '28/07/2011 23:39:00', '28/07/2011 23:40:00', '28/07/2011 23:41:00', '28/07/2011 23:42:00', '28/07/2011 23:43:00', '28/07/2011 23:44:00', '28/07/2011 23:45:00', '28/07/2011 23:46:00', '28/07/2011 23:47:00', '28/07/2011 23:48:00', '28/07/2011 23:49:00', '28/07/2011 23:50:00', '28/07/2011 23:51:00', '28/07/2011 23:52:00', '28/07/2011 23:53:00', '28/07/2011 23:54:00', '28/07/2011 23:55:00', '28/07/2011 23:56:00', '28/07/2011 23:57:00', '28/07/2011 23:58:00', '28/07/2011 23:59:00', '29/07/2011 0:00:00', '29/07/2011 0:01:00', '29/07/2011 0:02:00', '29/07/2011 0:03:00', '29/07/2011 0:04:00', '29/07/2011 0:05:00', '29/07/2011 0:06:00', '29/07/2011 0:07:00', '29/07/2011 0:08:00', '29/07/2011 0:09:00', '29/07/2011 0:10:00', '29/07/2011 0:11:00', '29/07/2011 0:12:00', '29/07/2011 0:13:00', '29/07/2011 0:14:00', '29/07/2011 0:15:00', '29/07/2011 0:16:00', '29/07/2011 0:17:00', '29/07/2011 0:18:00', '29/07/2011 0:19:00', '29/07/2011 0:20:00', '29/07/2011 0:21:00', '29/07/2011 0:22:00', '29/07/2011 0:23:00', '29/07/2011 0:24:00', '29/07/2011 0:25:00', '29/07/2011 0:26:00', '29/07/2011 0:27:00', '29/07/2011 0:28:00', '29/07/2011 0:29:00', '29/07/2011 0:30:00', '29/07/2011 0:31:00', '29/07/2011 0:32:00', '29/07/2011 0:33:00', '29/07/2011 0:34:00', '29/07/2011 0:35:00', '29/07/2011 0:36:00', '29/07/2011 0:37:00', '29/07/2011 0:38:00', '29/07/2011 0:39:00', '29/07/2011 0:40:00', '29/07/2011 0:41:00', '29/07/2011 0:42:00', '29/07/2011 0:43:00', '29/07/2011 0:44:00', '29/07/2011 0:45:00', '29/07/2011 0:46:00', '29/07/2011 0:47:00', '29/07/2011 0:48:00', '29/07/2011 0:49:00', '29/07/2011 0:50:00', '29/07/2011 0:51:00', '29/07/2011 
0:52:00', '29/07/2011 0:53:00', '29/07/2011 0:54:00', '29/07/2011 0:55:00', '29/07/2011 0:56:00', '29/07/2011 0:57:00', '29/07/2011 0:58:00', '29/07/2011 0:59:00', '29/07/2011 1:00:00', '29/07/2011 1:01:00', '29/07/2011 1:02:00', '29/07/2011 1:03:00', '29/07/2011 1:04:00', '29/07/2011 1:05:00', '29/07/2011 1:06:00', '29/07/2011 1:07:00', '29/07/2011 1:08:00', '29/07/2011 1:09:00', '29/07/2011 1:10:00', '29/07/2011 1:11:00', '29/07/2011 1:12:00', '29/07/2011 1:13:00', '29/07/2011 1:14:00', '29/07/2011 1:15:00', '29/07/2011 1:16:00', '29/07/2011 1:17:00', '29/07/2011 1:18:00', '29/07/2011 1:19:00', '29/07/2011 1:20:00', '29/07/2011 1:21:00', '29/07/2011 1:22:00', '29/07/2011 1:23:00', '29/07/2011 1:24:00', '29/07/2011 1:25:00', '29/07/2011 1:26:00', '29/07/2011 1:27:00', '29/07/2011 1:28:00', '29/07/2011 1:29:00', '29/07/2011 1:30:00', '29/07/2011 1:31:00', '29/07/2011 1:32:00', '29/07/2011 1:33:00', '29/07/2011 1:34:00', '29/07/2011 1:35:00', '29/07/2011 1:36:00', '29/07/2011 1:37:00', '29/07/2011 1:38:00', '29/07/2011 1:39:00', '29/07/2011 1:40:00', '29/07/2011 1:41:00', '29/07/2011 1:42:00', '29/07/2011 1:43:00', '29/07/2011 1:44:00', '29/07/2011 1:45:00', '29/07/2011 1:46:00', '29/07/2011 1:47:00', '29/07/2011 1:48:00', '29/07/2011 1:49:00', '29/07/2011 1:50:00', '29/07/2011 1:51:00', '29/07/2011 1:52:00', '29/07/2011 1:53:00', '29/07/2011 1:54:00', '29/07/2011 1:55:00', '29/07/2011 1:56:00', '29/07/2011 1:57:00', '29/07/2011 1:58:00', '29/07/2011 1:59:00', '29/07/2011 2:00:00', '29/07/2011 2:01:00', '29/07/2011 2:02:00', '29/07/2011 2:03:00', '29/07/2011 2:04:00', '29/07/2011 2:05:00', '29/07/2011 2:06:00', '29/07/2011 2:07:00', '29/07/2011 2:08:00', '29/07/2011 2:09:00', '29/07/2011 2:10:00', '29/07/2011 2:11:00', '29/07/2011 2:12:00', '29/07/2011 2:13:00', '29/07/2011 2:14:00', '29/07/2011 2:15:00', '29/07/2011 2:16:00', '29/07/2011 2:17:00', '29/07/2011 2:18:00', '29/07/2011 2:19:00', '29/07/2011 2:20:00', '29/07/2011 2:21:00', '29/07/2011 2:22:00', 
'29/07/2011 2:23:00', '29/07/2011 2:24:00', '29/07/2011 2:25:00', '29/07/2011 2:26:00', '29/07/2011 2:27:00', '29/07/2011 2:28:00', '29/07/2011 2:29:00', '29/07/2011 2:30:00', '29/07/2011 2:31:00', '29/07/2011 2:32:00', '29/07/2011 2:33:00', '29/07/2011 2:34:00', '29/07/2011 2:35:00', '29/07/2011 2:36:00', '29/07/2011 2:37:00', '29/07/2011 2:38:00', '29/07/2011 2:39:00', '29/07/2011 2:40:00', '29/07/2011 2:41:00', '29/07/2011 2:42:00', '29/07/2011 2:43:00', '29/07/2011 2:44:00', '29/07/2011 2:45:00', '29/07/2011 2:46:00', '29/07/2011 2:47:00', '29/07/2011 2:48:00', '29/07/2011 2:49:00', '29/07/2011 2:50:00', '29/07/2011 2:51:00', '29/07/2011 2:52:00', '29/07/2011 2:53:00', '29/07/2011 2:54:00', '29/07/2011 2:55:00', '29/07/2011 2:56:00', '29/07/2011 2:57:00', '29/07/2011 2:58:00', '29/07/2011 2:59:00', '29/07/2011 3:00:00', '29/07/2011 3:01:00', '29/07/2011 3:02:00', '29/07/2011 3:03:00', '29/07/2011 3:04:00', '29/07/2011 3:05:00', '29/07/2011 3:06:00', '29/07/2011 3:07:00', '29/07/2011 3:08:00', '29/07/2011 3:09:00', '29/07/2011 3:10:00', '29/07/2011 3:11:00', '29/07/2011 3:12:00', '29/07/2011 3:13:00', '29/07/2011 3:14:00', '29/07/2011 3:15:00', '29/07/2011 3:16:00', '29/07/2011 3:17:00', '29/07/2011 3:18:00', '29/07/2011 3:19:00', '29/07/2011 3:20:00', '29/07/2011 3:21:00', '29/07/2011 3:22:00', '29/07/2011 3:23:00', '29/07/2011 3:24:00', '29/07/2011 3:25:00', '29/07/2011 3:26:00', '29/07/2011 3:27:00', '29/07/2011 3:28:00', '29/07/2011 3:29:00', '29/07/2011 3:30:00', '29/07/2011 3:31:00', '29/07/2011 3:32:00', '29/07/2011 3:33:00', '29/07/2011 3:34:00', '29/07/2011 3:35:00', '29/07/2011 3:36:00', '29/07/2011 3:37:00', '29/07/2011 3:38:00', '29/07/2011 3:39:00', '29/07/2011 3:40:00', '29/07/2011 3:41:00', '29/07/2011 3:42:00', '29/07/2011 3:43:00', '29/07/2011 3:44:00', '29/07/2011 3:45:00', '29/07/2011 3:46:00', '29/07/2011 3:47:00', '29/07/2011 3:48:00', '29/07/2011 3:49:00', '29/07/2011 3:50:00', '29/07/2011 3:51:00', '29/07/2011 3:52:00', '29/07/2011 
3:53:00', '29/07/2011 3:54:00', '29/07/2011 3:55:00', '29/07/2011 3:56:00', '29/07/2011 3:57:00', '29/07/2011 3:58:00', '29/07/2011 3:59:00', '29/07/2011 4:00:00', '29/07/2011 4:01:00', '29/07/2011 4:02:00', '29/07/2011 4:03:00', '29/07/2011 4:04:00', '29/07/2011 4:05:00', '29/07/2011 4:06:00', '29/07/2011 4:07:00', '29/07/2011 4:08:00', '29/07/2011 4:09:00', '29/07/2011 4:10:00', '29/07/2011 4:11:00', '29/07/2011 4:12:00', '29/07/2011 4:13:00', '29/07/2011 4:14:00', '29/07/2011 4:15:00', '29/07/2011 4:16:00', '29/07/2011 4:17:00', '29/07/2011 4:18:00', '29/07/2011 4:19:00', '29/07/2011 4:20:00', '29/07/2011 4:21:00', '29/07/2011 4:22:00', '29/07/2011 4:23:00', '29/07/2011 4:24:00', '29/07/2011 4:25:00', '29/07/2011 4:26:00', '29/07/2011 4:27:00', '29/07/2011 4:28:00', '29/07/2011 4:29:00', '29/07/2011 4:30:00', '29/07/2011 4:31:00', '29/07/2011 4:32:00', '29/07/2011 4:33:00', '29/07/2011 4:34:00', '29/07/2011 4:35:00', '29/07/2011 4:36:00', '29/07/2011 4:37:00', '29/07/2011 4:38:00', '29/07/2011 4:39:00', '29/07/2011 4:40:00', '29/07/2011 4:41:00', '29/07/2011 4:42:00', '29/07/2011 4:43:00', '29/07/2011 4:44:00', '29/07/2011 4:45:00', '29/07/2011 4:46:00', '29/07/2011 4:47:00', '29/07/2011 4:48:00', '29/07/2011 4:49:00', '29/07/2011 4:50:00', '29/07/2011 4:51:00', '29/07/2011 4:52:00', '29/07/2011 4:53:00', '29/07/2011 4:54:00', '29/07/2011 4:55:00', '29/07/2011 4:56:00', '29/07/2011 4:57:00', '29/07/2011 4:58:00', '29/07/2011 4:59:00', '29/07/2011 5:00:00', '29/07/2011 5:01:00', '29/07/2011 5:02:00', '29/07/2011 5:03:00', '29/07/2011 5:04:00', '29/07/2011 5:05:00', '29/07/2011 5:06:00', '29/07/2011 5:07:00', '29/07/2011 5:08:00', '29/07/2011 5:09:00', '29/07/2011 5:10:00', '29/07/2011 5:11:00', '29/07/2011 5:12:00', '29/07/2011 5:13:00', '29/07/2011 5:14:00', '29/07/2011 5:15:00', '29/07/2011 5:16:00', '29/07/2011 5:17:00', '29/07/2011 5:18:00', '29/07/2011 5:19:00', '29/07/2011 5:20:00', '29/07/2011 5:21:00', '29/07/2011 5:22:00', '29/07/2011 5:23:00', 
'29/07/2011 5:24:00', '29/07/2011 5:25:00', '29/07/2011 5:26:00', '29/07/2011 5:27:00', '29/07/2011 5:28:00', '29/07/2011 5:29:00', '29/07/2011 5:30:00', '29/07/2011 5:31:00', '29/07/2011 5:32:00', '29/07/2011 5:33:00', '29/07/2011 5:34:00', '29/07/2011 5:35:00', '29/07/2011 5:36:00', '29/07/2011 5:37:00', '29/07/2011 5:38:00', '29/07/2011 5:39:00', '29/07/2011 5:40:00', '29/07/2011 5:41:00', '29/07/2011 5:42:00', '29/07/2011 5:43:00', '29/07/2011 5:44:00', '29/07/2011 5:45:00', '29/07/2011 5:46:00', '29/07/2011 5:47:00', '29/07/2011 5:48:00', '29/07/2011 5:49:00', '29/07/2011 5:50:00', '29/07/2011 5:51:00', '29/07/2011 5:52:00', '29/07/2011 5:53:00', '29/07/2011 5:54:00', '29/07/2011 5:55:00', '29/07/2011 5:56:00', '29/07/2011 5:57:00', '29/07/2011 5:58:00', '29/07/2011 5:59:00', '29/07/2011 6:00:00', '29/07/2011 6:01:00', '29/07/2011 6:02:00', '29/07/2011 6:03:00', '29/07/2011 6:04:00', '29/07/2011 6:05:00', '29/07/2011 6:06:00', '29/07/2011 6:07:00', '29/07/2011 6:08:00', '29/07/2011 6:09:00', '29/07/2011 6:10:00', '29/07/2011 6:11:00', '29/07/2011 6:12:00', '29/07/2011 6:13:00', '29/07/2011 6:14:00', '29/07/2011 6:15:00', '29/07/2011 6:16:00', '29/07/2011 6:17:00', '29/07/2011 6:18:00', '29/07/2011 6:19:00', '29/07/2011 6:20:00', '29/07/2011 6:21:00', '29/07/2011 6:22:00', '29/07/2011 6:23:00', '29/07/2011 6:24:00', '29/07/2011 6:25:00', '29/07/2011 6:26:00', '29/07/2011 6:27:00', '29/07/2011 6:28:00', '29/07/2011 6:29:00', '29/07/2011 6:30:00', '29/07/2011 6:31:00', '29/07/2011 6:32:00', '29/07/2011 6:33:00', '29/07/2011 6:34:00', '29/07/2011 6:35:00', '29/07/2011 6:36:00', '29/07/2011 6:37:00', '29/07/2011 6:38:00', '29/07/2011 6:39:00', '29/07/2011 6:40:00', '29/07/2011 6:41:00', '29/07/2011 6:42:00', '29/07/2011 6:43:00', '29/07/2011 6:44:00', '29/07/2011 6:45:00', '29/07/2011 6:46:00', '29/07/2011 6:47:00', '29/07/2011 6:48:00', '29/07/2011 6:49:00', '29/07/2011 6:50:00', '29/07/2011 6:51:00', '29/07/2011 6:52:00', '29/07/2011 6:53:00', '29/07/2011 
6:54:00', '29/07/2011 6:55:00', '29/07/2011 6:56:00', '29/07/2011 6:57:00', '29/07/2011 6:58:00', '29/07/2011 6:59:00', '29/07/2011 7:00:00', '29/07/2011 7:01:00', '29/07/2011 7:02:00', '29/07/2011 7:03:00', '29/07/2011 7:04:00', '29/07/2011 7:05:00', '29/07/2011 7:06:00', '29/07/2011 7:07:00', '29/07/2011 7:08:00', '29/07/2011 7:09:00', '29/07/2011 7:10:00', '29/07/2011 7:11:00', '29/07/2011 7:12:00', '29/07/2011 7:13:00', '29/07/2011 7:14:00', '29/07/2011 7:15:00', '29/07/2011 7:16:00', '29/07/2011 7:17:00', '29/07/2011 7:18:00', '29/07/2011 7:19:00', '29/07/2011 7:20:00', '29/07/2011 7:21:00', '29/07/2011 7:22:00', '29/07/2011 7:23:00', '29/07/2011 7:24:00', '29/07/2011 7:25:00', '29/07/2011 7:26:00', '29/07/2011 7:27:00', '29/07/2011 7:28:00', '29/07/2011 7:29:00', '29/07/2011 7:30:00', '29/07/2011 7:31:00', '29/07/2011 7:32:00', '29/07/2011 7:33:00', '29/07/2011 7:34:00', '29/07/2011 7:35:00', '29/07/2011 7:36:00', '29/07/2011 7:37:00', '29/07/2011 7:38:00', '29/07/2011 7:39:00', '29/07/2011 7:40:00', '29/07/2011 7:41:00', '29/07/2011 7:42:00', '29/07/2011 7:43:00', '29/07/2011 7:44:00', '29/07/2011 7:45:00', '29/07/2011 7:46:00', '29/07/2011 7:47:00', '29/07/2011 7:48:00', '29/07/2011 7:49:00', '29/07/2011 7:50:00', '29/07/2011 7:51:00', '29/07/2011 7:52:00', '29/07/2011 7:53:00', '29/07/2011 7:54:00', '29/07/2011 7:55:00', '29/07/2011 7:56:00', '29/07/2011 7:57:00', '29/07/2011 7:58:00', '29/07/2011 7:59:00', '29/07/2011 8:00:00', '29/07/2011 8:01:00', '29/07/2011 8:02:00', '29/07/2011 8:03:00', '29/07/2011 8:04:00', '29/07/2011 8:05:00', '29/07/2011 8:06:00', '29/07/2011 8:07:00', '29/07/2011 8:08:00', '29/07/2011 8:09:00', '29/07/2011 8:10:00', '29/07/2011 8:11:00', '29/07/2011 8:12:00', '29/07/2011 8:13:00', '29/07/2011 8:14:00', '29/07/2011 8:15:00', '29/07/2011 8:16:00', '29/07/2011 8:17:00', '29/07/2011 8:18:00', '29/07/2011 8:19:00', '29/07/2011 8:20:00', '29/07/2011 8:21:00', '29/07/2011 8:22:00', '29/07/2011 8:23:00', '29/07/2011 8:24:00', 
'29/07/2011 8:25:00', '29/07/2011 8:26:00', '29/07/2011 8:27:00', '29/07/2011 8:28:00', '29/07/2011 8:29:00', '29/07/2011 8:30:00', '29/07/2011 8:31:00', '29/07/2011 8:32:00', '29/07/2011 8:33:00', '29/07/2011 8:34:00', '29/07/2011 8:35:00', '29/07/2011 8:36:00', '29/07/2011 8:37:00', '29/07/2011 8:38:00', '29/07/2011 8:39:00', '29/07/2011 8:40:00', '29/07/2011 8:41:00', '29/07/2011 8:42:00', '29/07/2011 8:43:00', '29/07/2011 8:44:00', '29/07/2011 8:45:00', '29/07/2011 8:46:00', '29/07/2011 8:47:00', '29/07/2011 8:48:00', '29/07/2011 8:49:00', '29/07/2011 8:50:00', '29/07/2011 8:51:00', '29/07/2011 8:52:00', '29/07/2011 8:53:00', '29/07/2011 8:54:00', '29/07/2011 8:55:00', '29/07/2011 8:56:00', '29/07/2011 8:57:00', '29/07/2011 8:58:00', '29/07/2011 8:59:00', '29/07/2011 9:00:00', '29/07/2011 9:01:00', '29/07/2011 9:02:00', '29/07/2011 9:03:00', '29/07/2011 9:04:00', '29/07/2011 9:05:00', '29/07/2011 9:06:00', '29/07/2011 9:07:00', '29/07/2011 9:08:00', '29/07/2011 9:09:00', '29/07/2011 9:10:00', '29/07/2011 9:11:00', '29/07/2011 9:12:00', '29/07/2011 9:13:00', '29/07/2011 9:14:00', '29/07/2011 9:15:00', '29/07/2011 9:16:00', '29/07/2011 9:17:00', '29/07/2011 9:18:00', '29/07/2011 9:19:00', '29/07/2011 9:20:00', '29/07/2011 9:21:00', '29/07/2011 9:22:00', '29/07/2011 9:23:00', '29/07/2011 9:24:00', '29/07/2011 9:25:00', '29/07/2011 9:26:00', '29/07/2011 9:27:00', '29/07/2011 9:28:00', '29/07/2011 9:29:00', '29/07/2011 9:30:00', '29/07/2011 9:31:00', '29/07/2011 9:32:00', '29/07/2011 9:33:00', '29/07/2011 9:34:00', '29/07/2011 9:35:00', '29/07/2011 9:36:00', '29/07/2011 9:37:00', '29/07/2011 9:38:00', '29/07/2011 9:39:00', '29/07/2011 9:40:00', '29/07/2011 9:41:00', '29/07/2011 9:42:00', '29/07/2011 9:43:00', '29/07/2011 9:44:00', '29/07/2011 9:45:00', '29/07/2011 9:46:00', '29/07/2011 9:47:00', '29/07/2011 9:48:00', '29/07/2011 9:49:00', '29/07/2011 9:50:00', '29/07/2011 9:51:00', '29/07/2011 9:52:00', '29/07/2011 9:53:00', '29/07/2011 9:54:00', '29/07/2011 
9:55:00', '29/07/2011 9:56:00', '29/07/2011 9:57:00', '29/07/2011 9:58:00', '29/07/2011 9:59:00', '29/07/2011 10:00:00', '29/07/2011 10:01:00', '29/07/2011 10:02:00', '29/07/2011 10:03:00', '29/07/2011 10:04:00', '29/07/2011 10:05:00', '29/07/2011 10:06:00', '29/07/2011 10:07:00', '29/07/2011 10:08:00', '29/07/2011 10:09:00', '29/07/2011 10:10:00', '29/07/2011 10:11:00', '29/07/2011 10:12:00', '29/07/2011 10:13:00', '29/07/2011 10:14:00', '29/07/2011 10:15:00', '29/07/2011 10:16:00', '29/07/2011 10:17:00', '29/07/2011 10:18:00', '29/07/2011 10:19:00', '29/07/2011 10:20:00', '29/07/2011 10:21:00', '29/07/2011 10:22:00', '29/07/2011 10:23:00', '29/07/2011 10:24:00', '29/07/2011 10:25:00', '29/07/2011 10:26:00', '29/07/2011 10:27:00', '29/07/2011 10:28:00', '29/07/2011 10:29:00', '29/07/2011 10:30:00', '29/07/2011 10:31:00', '29/07/2011 10:32:00', '29/07/2011 10:33:00', '29/07/2011 10:34:00', '29/07/2011 10:35:00', '29/07/2011 10:36:00', '29/07/2011 10:37:00', '29/07/2011 10:38:00', '29/07/2011 10:39:00', '29/07/2011 10:40:00', '29/07/2011 10:41:00', '29/07/2011 10:42:00', '29/07/2011 10:43:00', '29/07/2011 10:44:00', '29/07/2011 10:45:00', '29/07/2011 10:46:00', '29/07/2011 10:47:00', '29/07/2011 10:48:00', '29/07/2011 10:49:00', '29/07/2011 10:50:00', '29/07/2011 10:51:00', '29/07/2011 10:52:00', '29/07/2011 10:53:00', '29/07/2011 10:54:00', '29/07/2011 10:55:00', '29/07/2011 10:56:00', '29/07/2011 10:57:00', '29/07/2011 10:58:00', '29/07/2011 10:59:00', '29/07/2011 11:00:00', '29/07/2011 11:01:00', '29/07/2011 11:02:00', '29/07/2011 11:03:00', '29/07/2011 11:04:00', '29/07/2011 11:05:00', '29/07/2011 11:06:00', '29/07/2011 11:07:00', '29/07/2011 11:08:00', '29/07/2011 11:09:00', '29/07/2011 11:10:00', '29/07/2011 11:11:00', '29/07/2011 11:12:00', '29/07/2011 11:13:00', '29/07/2011 11:14:00', '29/07/2011 11:15:00', '29/07/2011 11:16:00', '29/07/2011 11:17:00', '29/07/2011 11:18:00', '29/07/2011 11:19:00', '29/07/2011 11:20:00', '29/07/2011 11:21:00', '29/07/2011 
11:22:00', '29/07/2011 11:23:00', '29/07/2011 11:24:00', '29/07/2011 11:25:00', '29/07/2011 11:26:00', '29/07/2011 11:27:00', '29/07/2011 11:28:00', '29/07/2011 11:29:00', '29/07/2011 11:30:00', '29/07/2011 11:31:00', '29/07/2011 11:32:00', '29/07/2011 11:33:00', '29/07/2011 11:34:00', '29/07/2011 11:35:00', '29/07/2011 11:36:00', '29/07/2011 11:37:00', '29/07/2011 11:38:00', '29/07/2011 11:39:00', '29/07/2011 11:40:00', '29/07/2011 11:41:00', '29/07/2011 11:42:00', '29/07/2011 11:43:00', '29/07/2011 11:44:00', '29/07/2011 11:45:00', '29/07/2011 11:46:00', '29/07/2011 11:47:00', '29/07/2011 11:48:00', '29/07/2011 11:49:00', '29/07/2011 11:50:00']
print(times)
fig, ax = plt.subplots(1)
fig.autofmt_xdate()
# Parse the date strings first; plotting the raw strings gives matplotlib
# nothing date-like for DateFormatter to format on the x-axis.
from datetime import datetime
times = [datetime.strptime(t, '%d/%m/%Y %H:%M:%S') for t in times]
plt.plot(times, range(len(times)))
xfmt = mdates.DateFormatter('%d/%m/%y %H:%M:%S')
ax.xaxis.set_major_formatter(xfmt)
plt.show()
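As an aside (a sketch, not part of the original script): a minute-resolution timestamp list like the hard-coded one above can be generated with the standard library. Note that `strftime('%H')` zero-pads the hour (`00:00:00`), whereas the original literal list uses unpadded hours (`0:00:00`).

```python
from datetime import datetime, timedelta

# Endpoints taken from the first and last entries of the hard-coded list.
start = datetime(2011, 7, 28, 10, 25)
end = datetime(2011, 7, 29, 11, 50)
step = timedelta(minutes=1)

times = []
t = start
while t <= end:
    times.append(t.strftime('%d/%m/%Y %H:%M:%S'))
    t += step
```

This yields one string per minute, endpoints inclusive.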
| agpl-3.0 |
kexinrong/macrobase | tools/py_analysis/plot_outlier_histograms.py | 2 | 2598 | import argparse
import itertools
import json
import matplotlib.pyplot as plt
import numpy as np
import os
import pandas as pd
from common import add_db_args
from common import add_plot_limit_args
from common import set_db_connection
from common import set_plot_limits
from matplotlib.colors import LogNorm
from plot_estimator import _format_datum
from plot_estimator import _extract_data
def parse_args(*argument_list):
parser = argparse.ArgumentParser()
parser.add_argument('infiles', type=argparse.FileType('r'), nargs='+',
help='File(s) with inliers & outliers with their scores '
'outputted by macrobase')
parser.add_argument('--histogram-bins', default=100, type=int)
parser.add_argument('--restrict-to', choices=['inliers', 'outliers'],
help='Plots 2d histogram of outliers or inliers')
parser.add_argument('--columns', nargs=1, default=['metrics.*'],
help='Data to include in the plot')
parser.add_argument('--legend-loc', default='best')
parser.add_argument('--no-scores', action='store_false', default=True, dest='plot_scores')
parser.add_argument('--savefig')
add_plot_limit_args(parser)
add_db_args(parser)
args = parser.parse_args(*argument_list)
return args
def _format_data(infile, args):
print 'formatting data from file %s' % infile.name
raw_data = json.load(infile)
dimensions = len(_format_datum(raw_data['inliers'][0], args.columns)) - 1
assert dimensions == 1
outliers = _extract_data(raw_data, 'outliers', args.columns, args.x_limits, None)
    return os.path.basename(infile.name).rsplit('.', 1)[0], list(outliers)
def plot_histograms(args):
classifiers = {}
data, labels = [], []
for _file in args.infiles:
label, content = _format_data(_file, args)
labels.append(label)
X, _ = zip(*content)
data.append(X)
plt.hist(data, args.histogram_bins, histtype='bar', stacked=False, label=labels)
plt.legend(loc=args.legend_loc)
set_plot_limits(plt, args)
if args.savefig is not None:
filename = args.savefig
modifiers = []
if args.x_limits:
modifiers.append('X=%d,%d' % tuple(args.x_limits))
if args.y_limits:
modifiers.append('Y=%d,%d' % tuple(args.y_limits))
        name, ext = filename.rsplit('.', 1)
new_filename = '{old_name}-{modifiers}.{ext}'.format(old_name=name, modifiers='-'.join(modifiers), ext=ext)
print 'saving figure to - ', new_filename
plt.savefig(new_filename, dpi=320)
plt.clf()
else:
plt.show()
if __name__ == '__main__':
args = parse_args()
plot_histograms(args)
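The save path above encodes the axis limits into the output filename. A standalone sketch of that naming scheme (the filename and limit values are illustrative):

```python
filename = 'histogram.png'
x_limits, y_limits = (0, 10), (0, 5)
modifiers = []
if x_limits:
    modifiers.append('X=%d,%d' % tuple(x_limits))
if y_limits:
    modifiers.append('Y=%d,%d' % tuple(y_limits))
# Split only on the last dot so multi-dot names survive
name, ext = filename.rsplit('.', 1)
new_filename = '{old_name}-{modifiers}.{ext}'.format(
    old_name=name, modifiers='-'.join(modifiers), ext=ext)
print(new_filename)  # histogram-X=0,10-Y=0,5.png
```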
| apache-2.0 |
lfairchild/PmagPy | programs/strip_magic.py | 2 | 14657 | #!/usr/bin/env python
import sys
import matplotlib
if matplotlib.get_backend() != "TKAgg":
matplotlib.use("TKAgg")
import pmagpy.pmagplotlib as pmagplotlib
import pmagpy.pmag as pmag
def main():
"""
NAME
strip_magic.py
DESCRIPTION
plots various parameters versus depth or age
SYNTAX
        strip_magic.py [command line options]
OPTIONS
-h prints help message and quits
-DM NUM: specify data model num, options 2 (legacy) or 3 (default)
-f FILE: specify input magic format file from magic,default='pmag_results.txt'
supported types=[pmag_specimens, pmag_samples, pmag_sites, pmag_results, magic_web]
    -obj [sit,sam,all]: specify object as site, sample, or all for the pmag_results table; default is all
-fmt [svg,png,jpg], format for images - default is svg
-x [age,pos]: specify whether age or stratigraphic position
-y [dec,inc,int,chi,lat,lon,vdm,vadm]
(lat and lon are VGP lat and lon)
-Iex: plot the expected inc at lat - only available for results with lat info in file
-ts TS amin amax: plot the GPTS for the time interval between amin and amax (numbers in Ma)
TS: [ck95, gts04]
-mcd method_code, specify method code, default is first one encountered
-sav save plot and quit
NOTES
when x and/or y are not specified, a list of possibilities will be presented to the user for choosing
"""
if '-h' in sys.argv:
print(main.__doc__)
sys.exit()
xaxis, xplotind, yplotind = "", 0, 0 # (0 for strat pos)
yaxis, Xinc = "", ""
plot = 0
obj = 'all'
data_model_num = int(pmag.get_named_arg("-DM", 3))
# 2.5 keys
if data_model_num == 2:
supported = ['pmag_specimens', 'pmag_samples',
'pmag_sites', 'pmag_results', 'magic_web'] # available file types
Depth_keys = ['specimen_core_depth', 'specimen_height', 'specimen_elevation',
'specimen_composite_depth', 'sample_core_depth', 'sample_height',
'sample_elevation', 'sample_composite_depth', 'site_core_depth',
'site_height', 'site_elevation', 'site_composite_depth', 'average_height']
Age_keys = ['specimen_inferred_age', 'sample_inferred_age',
'site_inferred_age', 'average_age']
Unit_keys = {'specimen_inferred_age': 'specimen_inferred_age_unit',
'sample_inferred_age': 'sample_inferred_age_unit',
'site_inferred_age': 'site_inferred_age_unit', 'average_age': 'average_age_unit'}
Dec_keys = ['measurement_dec', 'specimen_dec',
'sample_dec', 'site_dec', 'average_dec']
Inc_keys = ['measurement_inc', 'specimen_inc',
'sample_inc', 'site_inc', 'average_inc']
Int_keys = ['measurement_magnitude', 'measurement_magn_moment', 'measurement_magn_volume',
'measurement_magn_mass', 'specimen_int', 'specimen_int_rel', 'sample_int',
'sample_int_rel', 'site_int', 'site_int_rel', 'average_int', 'average_int_rel']
Chi_keys = ['measurement_chi_volume', 'measurement_chi_mass']
Lat_keys = ['sample_lat', 'site_lat', 'average_lat']
VLat_keys = ['vgp_lat']
VLon_keys = ['vgp_lon']
Vdm_keys = ['vdm']
Vadm_keys = ['vadm']
method_col_name = "magic_method_codes"
else:
# 3.0 keys
supported = ["specimens", "samples", "sites", "locations"] # available file types
Depth_keys = [ "height", "core_depth", "elevation", "composite_depth" ]
Age_keys = [ "age" ]
Unit_keys = { "age": "age" }
Chi_keys = [ "susc_chi_volume", "susc_chi_mass" ]
Int_keys = [ "magn_moment", "magn_volume", "magn_mass", "int_abs", "int_rel" ]
Inc_keys = [ "dir_inc" ]
Dec_keys = [ "dir_dec" ]
        Lat_keys = [ "lat" ]
VLat_keys = [ "vgp_lat", "pole_lat" ]
VLon_keys = [ "vgp_lon", "pole_lon" ]
Vdm_keys = [ "vdm", "pdm" ]
Vadm_keys = [ "vadm", "padm" ]
method_col_name = "method_codes"
#
X_keys = [Age_keys, Depth_keys]
Y_keys = [Dec_keys, Inc_keys, Int_keys, Chi_keys,
VLat_keys, VLon_keys, Vdm_keys, Vadm_keys]
method, fmt = "", 'svg'
FIG = {'strat': 1}
plotexp, pTS = 0, 0
dir_path = pmag.get_named_arg("-WD", ".")
# default files
if data_model_num == 3:
res_file = pmag.get_named_arg("-f", "sites.txt")
else:
res_file = pmag.get_named_arg("-f", "pmag_results.txt")
res_file = pmag.resolve_file_name(res_file, dir_path)
if '-fmt' in sys.argv:
ind = sys.argv.index('-fmt')
fmt = sys.argv[ind+1]
if '-obj' in sys.argv:
ind = sys.argv.index('-obj')
obj = sys.argv[ind+1]
if '-x' in sys.argv:
ind = sys.argv.index('-x')
xaxis = sys.argv[ind+1]
if '-y' in sys.argv:
ind = sys.argv.index('-y')
yaxis = sys.argv[ind+1]
if yaxis == 'dec':
ykeys = Dec_keys
if yaxis == 'inc':
ykeys = Inc_keys
if yaxis == 'int':
ykeys = Int_keys
if yaxis == 'chi':
ykeys = Chi_keys
if yaxis == 'lat':
ykeys = VLat_keys
if yaxis == 'lon':
ykeys = VLon_keys
if yaxis == 'vdm':
ykeys = Vdm_keys
if yaxis == 'vadm':
ykeys = Vadm_keys
if '-mcd' in sys.argv:
ind = sys.argv.index('-mcd')
method = sys.argv[ind+1]
if '-ts' in sys.argv:
ind = sys.argv.index('-ts')
ts = sys.argv[ind+1]
amin = float(sys.argv[ind+2])
amax = float(sys.argv[ind+3])
pTS = 1
if '-Iex' in sys.argv:
plotexp = 1
if '-sav' in sys.argv:
plot = 1
#
#
# get data read in
Results, file_type = pmag.magic_read(res_file)
if file_type not in supported:
print("Unsupported file type ({}), try again".format(file_type))
sys.exit()
PltObjs = ['all']
if data_model_num == 2:
if file_type == 'pmag_results': # find out what to plot
for rec in Results:
resname = rec['pmag_result_name'].split()
if 'Sample' in resname and 'sam' not in PltObjs:
PltObjs.append('sam')
if 'Site' in resname and 'sit' not in PltObjs:
PltObjs.append('sit')
methcodes = []
# need to know all the measurement types from method_codes
if "magic_method_codes" in list(Results[0].keys()):
for rec in Results:
meths = rec["magic_method_codes"].split(":")
for meth in meths:
if meth.strip() not in methcodes and 'LP' in meth:
# look for the lab treatments
methcodes.append(meth.strip())
#
# initialize some variables
X_unit = "" # Unit for age or depth plotting (meters if depth)
Xplots, Yplots = [], []
Xunits = []
yplotind, xplotind = 0, 0
#
# step through possible plottable keys
#
if xaxis == "" or yaxis == "":
for key in list(Results[0].keys()):
for keys in X_keys:
for xkeys in keys:
if key in xkeys:
for ResRec in Results:
if ResRec[key] != "":
# only plot something if there is something to plot!
Xplots.append(key)
break
for keys in Y_keys:
for pkeys in keys:
if key in pkeys:
for ResRec in Results:
if ResRec[key] != "":
Yplots.append(key)
break
X, Y = [], []
for plt in Xplots:
if plt in Age_keys and 'age' not in X:
X.append('age')
if plt in Depth_keys and 'pos' not in X:
X.append('pos')
for plt in Yplots:
if plt in Dec_keys and 'dec' not in Y:
Y.append('dec')
if plt in Inc_keys and 'inc' not in Y:
Y.append('inc')
if plt in Int_keys and 'int' not in Y:
Y.append('int')
if plt in Chi_keys and 'chi' not in Y:
Y.append('chi')
if plt in VLat_keys and 'lat' not in Y:
Y.append('lat')
if plt in VLon_keys and 'lon' not in Y:
Y.append('lon')
if plt in Vadm_keys and 'vadm' not in Y:
Y.append('vadm')
if plt in Vdm_keys and 'vdm' not in Y:
Y.append('vdm')
if file_type == 'pmag_results':
print('available objects for plotting: ', PltObjs)
print('available X plots: ', X)
print('available Y plots: ', Y)
print('available method codes: ', methcodes)
f = open(dir_path+'/.striprc', 'w')
for x in X:
f.write('x:'+x+'\n')
for y in Y:
f.write('y:'+y+'\n')
for m in methcodes:
f.write('m:'+m+'\n')
for obj in PltObjs:
f.write('obj:'+obj+'\n')
sys.exit()
if plotexp == 1:
for lkey in Lat_keys:
for key in list(Results[0].keys()):
if key == lkey:
lat = float(Results[0][lkey])
Xinc = [pmag.pinc(lat), -pmag.pinc(lat)]
break
if Xinc == "":
print('can not plot expected inc for site - lat unknown')
if method != "" and method not in methcodes:
print('your method not available, but these are: ')
print(methcodes)
print('use ', methcodes[0], '? ^D to quit')
if xaxis == 'age':
for akey in Age_keys:
for key in list(Results[0].keys()):
if key == akey:
Xplots.append(key)
Xunits.append(Unit_keys[key])
if xaxis == 'pos':
for dkey in Depth_keys:
for key in list(Results[0].keys()):
if key == dkey:
Xplots.append(key)
if len(Xplots) == 0:
print('desired X axis information not found')
sys.exit()
if xaxis == 'age':
age_unit = Results[0][Xunits[0]]
if len(Xplots) > 1:
print('multiple X axis keys found, using: ', Xplots[xplotind])
for ykey in ykeys:
for key in list(Results[0].keys()):
if key == ykey:
Yplots.append(key)
if len(Yplots) == 0:
print('desired Y axis information not found')
sys.exit()
if len(Yplots) > 1:
print('multiple Y axis keys found, using: ', Yplots[yplotind])
# check if age or depth info
if len(Xplots) == 0:
print("Must have either age or height info to plot ")
sys.exit()
#
# check for variable to plot
#
#
# determine X axis (age or depth)
#
if xaxis == "age":
plotind = "1"
if method == "":
try:
method = methcodes[0]
except IndexError:
method = ""
if xaxis == 'pos':
xlab = "Stratigraphic Height (meters)"
else:
xlab = "Age ("+age_unit+")"
Xkey = Xplots[xplotind]
Ykey = Yplots[yplotind]
ylab = Ykey
#
# collect the data for plotting
XY = []
isign = 1.
# if float(Results[0][Xkey])/float(Results[-1][Xkey])>0 and float(Results[0][Xkey])<0:
# isign=-1. # x axis all same sign and negative, take positive (e.g.,for depth in core)
# xlab="Stratigraphic Position (meters)"
# else:
# isign=1.
for rec in Results:
if "magic_method_codes" in list(rec.keys()):
meths = rec["magic_method_codes"].split(":")
if method in meths: # make sure it is desired lab treatment step
if obj == 'all' and rec[Xkey].strip() != "":
XY.append([isign*float(rec[Xkey]), float(rec[Ykey])])
elif rec[Xkey].strip() != "":
name = rec['pmag_result_name'].split()
if obj == 'sit' and "Site" in name:
XY.append([isign*float(rec[Xkey]), float(rec[Ykey])])
if obj == 'sam' and "Sample" in name:
XY.append([isign*float(rec[Xkey]), float(rec[Ykey])])
elif method == "":
if obj == 'all' and rec[Xkey].strip() != "":
XY.append([isign*float(rec[Xkey]), float(rec[Ykey])])
elif rec[Xkey].strip() != "":
name = rec['pmag_result_name'].split()
if obj == 'sit' and "Site" in name:
XY.append([isign*float(rec[Xkey]), float(rec[Ykey])])
if obj == 'sam' and "Sample" in name:
XY.append([isign*float(rec[Xkey]), float(rec[Ykey])])
else:
print("Something wrong with your plotting choices")
break
XY.sort()
title = ""
    if "er_location_names" in list(Results[0].keys()):
        title = Results[0]["er_location_names"]
    if "er_location_name" in list(Results[0].keys()):
        title = Results[0]["er_location_name"]
labels = [xlab, ylab, title]
pmagplotlib.plot_init(FIG['strat'], 10, 5)
pmagplotlib.plot_strat(FIG['strat'], XY, labels) # plot them
if plotexp == 1:
pmagplotlib.plot_hs(FIG['strat'], Xinc, 'b', '--')
if yaxis == 'inc' or yaxis == 'lat':
pmagplotlib.plot_hs(FIG['strat'], [0], 'b', '-')
pmagplotlib.plot_hs(FIG['strat'], [-90, 90], 'g', '-')
if pTS == 1:
FIG['ts'] = 2
pmagplotlib.plot_init(FIG['ts'], 10, 5)
pmagplotlib.plot_ts(FIG['ts'], [amin, amax], ts)
files = {}
for key in list(FIG.keys()):
files[key] = key+'.'+fmt
if pmagplotlib.isServer:
black = '#000000'
purple = '#800080'
files = {}
files['strat'] = xaxis+'_'+yaxis+'_.'+fmt
files['ts'] = 'ts.'+fmt
titles = {}
titles['strat'] = 'Depth/Time Series Plot'
titles['ts'] = 'Time Series Plot'
FIG = pmagplotlib.add_borders(FIG, titles, black, purple)
pmagplotlib.save_plots(FIG, files)
elif plot == 1:
pmagplotlib.save_plots(FIG, files)
else:
pmagplotlib.draw_figs(FIG)
ans = input(" S[a]ve to save plot, [q]uit without saving: ")
if ans == "a":
pmagplotlib.save_plots(FIG, files)
if __name__ == "__main__":
main()
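The script collects available lab-treatment steps by splitting each record's method-codes string on `':'` and keeping the entries containing `'LP'`. A minimal sketch of that filtering (the record values below are illustrative, not real MagIC data):

```python
# Records are illustrative; real rows come from pmag.magic_read()
records = [
    {'method_codes': 'LP-DIR-AF:DE-BFL'},
    {'method_codes': 'LP-DIR-T:DA-DIR-GEO'},
    {'method_codes': 'LP-DIR-AF'},
]
methcodes = []
for rec in records:
    for meth in rec['method_codes'].split(':'):
        meth = meth.strip()
        # keep each lab-treatment code ('LP' prefix) once
        if meth not in methcodes and 'LP' in meth:
            methcodes.append(meth)
print(methcodes)  # ['LP-DIR-AF', 'LP-DIR-T']
```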
| bsd-3-clause |
ContextLab/hypertools | setup.py | 1 | 2235 | # -*- coding: utf-8 -*-
import os
import subprocess
import sys
from setuptools import setup, find_packages
from setuptools.command.install import install
os.environ["MPLCONFIGDIR"] = "."
NAME = 'hypertools'
VERSION = '0.7.0'
AUTHOR = 'Contextual Dynamics Lab'
AUTHOR_EMAIL = 'contextualdynamics@gmail.com'
URL = 'https://github.com/ContextLab/hypertools'
DOWNLOAD_URL = URL
LICENSE = 'MIT'
REQUIRES_PYTHON = '>=3.5'
PACKAGES = find_packages(exclude=('images', 'examples', 'tests'))
with open('requirements.txt', 'r') as f:
REQUIREMENTS = f.read().splitlines()
DESCRIPTION = 'A python package for visualizing and manipulating high-dimensional data'
LONG_DESCRIPTION = """\
HyperTools is a library for visualizing and manipulating high-dimensional data in Python. It is built on top of matplotlib (for plotting), seaborn (for plot styling), and scikit-learn (for data manipulation).
For sample Jupyter notebooks using the package: https://github.com/ContextLab/hypertools-paper-notebooks
For more examples: https://github.com/ContextLab/hypertools/tree/master/examples
Some key features of HyperTools are:
- Functions for plotting high-dimensional datasets in 2/3D.
- Static and animated plots
- Simple API for customizing plot styles
- A set of powerful data manipulation tools including hyperalignment, k-means clustering, normalizing and more.
- Support for lists of Numpy arrays, Pandas dataframes, String, Geos or mixed lists.
"""
CLASSIFIERS = [
'Intended Audience :: Science/Research',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Topic :: Scientific/Engineering :: Visualization',
'Topic :: Multimedia :: Graphics',
'Operating System :: POSIX',
'Operating System :: Unix',
'Operating System :: MacOS'
]
setup(
name=NAME,
version=VERSION,
description=DESCRIPTION,
long_description=LONG_DESCRIPTION,
author=AUTHOR,
author_email=AUTHOR_EMAIL,
url=URL,
download_url = DOWNLOAD_URL,
license=LICENSE,
python_requires=REQUIRES_PYTHON,
packages=PACKAGES,
install_requires=REQUIREMENTS,
classifiers=CLASSIFIERS
)
| mit |
MatthieuBizien/scikit-learn | sklearn/feature_extraction/dict_vectorizer.py | 37 | 12559 | # Authors: Lars Buitinck
# Dan Blanchard <dblanchard@ets.org>
# License: BSD 3 clause
from array import array
from collections import Mapping
from operator import itemgetter
import numpy as np
import scipy.sparse as sp
from ..base import BaseEstimator, TransformerMixin
from ..externals import six
from ..externals.six.moves import xrange
from ..utils import check_array, tosequence
from ..utils.fixes import frombuffer_empty
def _tosequence(X):
"""Turn X into a sequence or ndarray, avoiding a copy if possible."""
if isinstance(X, Mapping): # single sample
return [X]
else:
return tosequence(X)
class DictVectorizer(BaseEstimator, TransformerMixin):
"""Transforms lists of feature-value mappings to vectors.
This transformer turns lists of mappings (dict-like objects) of feature
names to feature values into Numpy arrays or scipy.sparse matrices for use
with scikit-learn estimators.
When feature values are strings, this transformer will do a binary one-hot
(aka one-of-K) coding: one boolean-valued feature is constructed for each
of the possible string values that the feature can take on. For instance,
a feature "f" that can take on the values "ham" and "spam" will become two
features in the output, one signifying "f=ham", the other "f=spam".
However, note that this transformer will only do a binary one-hot encoding
when feature values are of type string. If categorical features are
represented as numeric values such as int, the DictVectorizer can be
followed by OneHotEncoder to complete binary one-hot encoding.
Features that do not occur in a sample (mapping) will have a zero value
in the resulting array/matrix.
Read more in the :ref:`User Guide <dict_feature_extraction>`.
Parameters
----------
dtype : callable, optional
The type of feature values. Passed to Numpy array/scipy.sparse matrix
constructors as the dtype argument.
    separator : string, optional
Separator string used when constructing new features for one-hot
coding.
    sparse : boolean, optional.
Whether transform should produce scipy.sparse matrices.
True by default.
    sort : boolean, optional.
Whether ``feature_names_`` and ``vocabulary_`` should be sorted when fitting.
True by default.
Attributes
----------
vocabulary_ : dict
A dictionary mapping feature names to feature indices.
feature_names_ : list
A list of length n_features containing the feature names (e.g., "f=ham"
and "f=spam").
Examples
--------
>>> from sklearn.feature_extraction import DictVectorizer
>>> v = DictVectorizer(sparse=False)
>>> D = [{'foo': 1, 'bar': 2}, {'foo': 3, 'baz': 1}]
>>> X = v.fit_transform(D)
>>> X
array([[ 2., 0., 1.],
[ 0., 1., 3.]])
>>> v.inverse_transform(X) == \
[{'bar': 2.0, 'foo': 1.0}, {'baz': 1.0, 'foo': 3.0}]
True
>>> v.transform({'foo': 4, 'unseen_feature': 3})
array([[ 0., 0., 4.]])
See also
--------
FeatureHasher : performs vectorization using only a hash function.
sklearn.preprocessing.OneHotEncoder : handles nominal/categorical features
encoded as columns of integers.
"""
def __init__(self, dtype=np.float64, separator="=", sparse=True,
sort=True):
self.dtype = dtype
self.separator = separator
self.sparse = sparse
self.sort = sort
def fit(self, X, y=None):
"""Learn a list of feature name -> indices mappings.
Parameters
----------
X : Mapping or iterable over Mappings
Dict(s) or Mapping(s) from feature names (arbitrary Python
objects) to feature values (strings or convertible to dtype).
y : (ignored)
Returns
-------
self
"""
feature_names = []
vocab = {}
for x in X:
for f, v in six.iteritems(x):
if isinstance(v, six.string_types):
f = "%s%s%s" % (f, self.separator, v)
if f not in vocab:
feature_names.append(f)
vocab[f] = len(vocab)
if self.sort:
feature_names.sort()
vocab = dict((f, i) for i, f in enumerate(feature_names))
self.feature_names_ = feature_names
self.vocabulary_ = vocab
return self
def _transform(self, X, fitting):
# Sanity check: Python's array has no way of explicitly requesting the
# signed 32-bit integers that scipy.sparse needs, so we use the next
# best thing: typecode "i" (int). However, if that gives larger or
# smaller integers than 32-bit ones, np.frombuffer screws up.
assert array("i").itemsize == 4, (
"sizeof(int) != 4 on your platform; please report this at"
" https://github.com/scikit-learn/scikit-learn/issues and"
" include the output from platform.platform() in your bug report")
dtype = self.dtype
if fitting:
feature_names = []
vocab = {}
else:
feature_names = self.feature_names_
vocab = self.vocabulary_
# Process everything as sparse regardless of setting
X = [X] if isinstance(X, Mapping) else X
indices = array("i")
indptr = array("i", [0])
# XXX we could change values to an array.array as well, but it
# would require (heuristic) conversion of dtype to typecode...
values = []
# collect all the possible feature names and build sparse matrix at
# same time
for x in X:
for f, v in six.iteritems(x):
if isinstance(v, six.string_types):
f = "%s%s%s" % (f, self.separator, v)
v = 1
if f in vocab:
indices.append(vocab[f])
values.append(dtype(v))
else:
if fitting:
feature_names.append(f)
vocab[f] = len(vocab)
indices.append(vocab[f])
values.append(dtype(v))
indptr.append(len(indices))
if len(indptr) == 1:
raise ValueError("Sample sequence X is empty.")
indices = frombuffer_empty(indices, dtype=np.intc)
indptr = np.frombuffer(indptr, dtype=np.intc)
shape = (len(indptr) - 1, len(vocab))
result_matrix = sp.csr_matrix((values, indices, indptr),
shape=shape, dtype=dtype)
# Sort everything if asked
if fitting and self.sort:
feature_names.sort()
map_index = np.empty(len(feature_names), dtype=np.int32)
for new_val, f in enumerate(feature_names):
map_index[new_val] = vocab[f]
vocab[f] = new_val
result_matrix = result_matrix[:, map_index]
if self.sparse:
result_matrix.sort_indices()
else:
result_matrix = result_matrix.toarray()
if fitting:
self.feature_names_ = feature_names
self.vocabulary_ = vocab
return result_matrix
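The loop in `_transform` assembles the three CSR arrays by hand: each non-zero entry appends a column index to `indices` and a value to `values`, while `indptr` records where each sample's entries end. A standalone sketch of that assembly, with illustrative sample data already mapped to column indices:

```python
# column -> value mapping per sample; data is illustrative
rows = [{0: 2.0, 2: 1.0}, {1: 1.0, 2: 3.0}]
values, indices, indptr = [], [], [0]
for row in rows:
    for col, val in sorted(row.items()):
        indices.append(col)
        values.append(val)
    # indptr marks the end of each sample's run of entries
    indptr.append(len(indices))
print(values)   # [2.0, 1.0, 1.0, 3.0]
print(indices)  # [0, 2, 1, 2]
print(indptr)   # [0, 2, 4]
```

These three lists are exactly the `(data, indices, indptr)` triple accepted by `scipy.sparse.csr_matrix`.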
def fit_transform(self, X, y=None):
"""Learn a list of feature name -> indices mappings and transform X.
Like fit(X) followed by transform(X), but does not require
materializing X in memory.
Parameters
----------
X : Mapping or iterable over Mappings
Dict(s) or Mapping(s) from feature names (arbitrary Python
objects) to feature values (strings or convertible to dtype).
y : (ignored)
Returns
-------
Xa : {array, sparse matrix}
Feature vectors; always 2-d.
"""
return self._transform(X, fitting=True)
def inverse_transform(self, X, dict_type=dict):
"""Transform array or sparse matrix X back to feature mappings.
X must have been produced by this DictVectorizer's transform or
fit_transform method; it may only have passed through transformers
that preserve the number of features and their order.
In the case of one-hot/one-of-K coding, the constructed feature
names and values are returned rather than the original ones.
Parameters
----------
X : {array-like, sparse matrix}, shape = [n_samples, n_features]
Sample matrix.
dict_type : callable, optional
Constructor for feature mappings. Must conform to the
collections.Mapping API.
Returns
-------
D : list of dict_type objects, length = n_samples
Feature mappings for the samples in X.
"""
# COO matrix is not subscriptable
X = check_array(X, accept_sparse=['csr', 'csc'])
n_samples = X.shape[0]
names = self.feature_names_
dicts = [dict_type() for _ in xrange(n_samples)]
if sp.issparse(X):
for i, j in zip(*X.nonzero()):
dicts[i][names[j]] = X[i, j]
else:
for i, d in enumerate(dicts):
for j, v in enumerate(X[i, :]):
if v != 0:
d[names[j]] = X[i, j]
return dicts
def transform(self, X, y=None):
"""Transform feature->value dicts to array or sparse matrix.
Named features not encountered during fit or fit_transform will be
silently ignored.
Parameters
----------
X : Mapping or iterable over Mappings, length = n_samples
Dict(s) or Mapping(s) from feature names (arbitrary Python
objects) to feature values (strings or convertible to dtype).
y : (ignored)
Returns
-------
Xa : {array, sparse matrix}
Feature vectors; always 2-d.
"""
if self.sparse:
return self._transform(X, fitting=False)
else:
dtype = self.dtype
vocab = self.vocabulary_
X = _tosequence(X)
Xa = np.zeros((len(X), len(vocab)), dtype=dtype)
for i, x in enumerate(X):
for f, v in six.iteritems(x):
if isinstance(v, six.string_types):
f = "%s%s%s" % (f, self.separator, v)
v = 1
try:
Xa[i, vocab[f]] = dtype(v)
except KeyError:
pass
return Xa
def get_feature_names(self):
"""Returns a list of feature names, ordered by their indices.
If one-of-K coding is applied to categorical features, this will
include the constructed feature names but not the original ones.
"""
return self.feature_names_
def restrict(self, support, indices=False):
"""Restrict the features to those in support using feature selection.
This function modifies the estimator in-place.
Parameters
----------
support : array-like
Boolean mask or list of indices (as returned by the get_support
member of feature selectors).
indices : boolean, optional
Whether support is a list of indices.
Returns
-------
self
Examples
--------
>>> from sklearn.feature_extraction import DictVectorizer
>>> from sklearn.feature_selection import SelectKBest, chi2
>>> v = DictVectorizer()
>>> D = [{'foo': 1, 'bar': 2}, {'foo': 3, 'baz': 1}]
>>> X = v.fit_transform(D)
>>> support = SelectKBest(chi2, k=2).fit(X, [0, 1])
>>> v.get_feature_names()
['bar', 'baz', 'foo']
>>> v.restrict(support.get_support()) # doctest: +ELLIPSIS
DictVectorizer(dtype=..., separator='=', sort=True,
sparse=True)
>>> v.get_feature_names()
['bar', 'foo']
"""
if not indices:
support = np.where(support)[0]
names = self.feature_names_
new_vocab = {}
for i in support:
new_vocab[names[i]] = len(new_vocab)
self.vocabulary_ = new_vocab
self.feature_names_ = [f for f, i in sorted(six.iteritems(new_vocab),
key=itemgetter(1))]
return self
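The binary one-hot behaviour described in the class docstring — string values expand into `name=value` indicator features while numeric values pass through — can be sketched in pure Python (a minimal illustration, not the scikit-learn implementation):

```python
def fit_feature_names(dicts, separator='='):
    # One feature per numeric key, one per (key, string value) pair
    names = set()
    for d in dicts:
        for f, v in d.items():
            names.add('%s%s%s' % (f, separator, v) if isinstance(v, str) else f)
    return sorted(names)

def transform(dicts, names, separator='='):
    index = {f: i for i, f in enumerate(names)}
    rows = []
    for d in dicts:
        row = [0.0] * len(names)
        for f, v in d.items():
            if isinstance(v, str):
                f, v = '%s%s%s' % (f, separator, v), 1
            if f in index:  # unseen features are silently ignored
                row[index[f]] = float(v)
        rows.append(row)
    return rows

D = [{'foo': 1, 'f': 'ham'}, {'foo': 3, 'f': 'spam'}]
names = fit_feature_names(D)
print(names)                # ['f=ham', 'f=spam', 'foo']
print(transform(D, names))  # [[1.0, 0.0, 1.0], [0.0, 1.0, 3.0]]
```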
| bsd-3-clause |
agconti/kaggle-titanic | Python Examples/agcfirstforest.py | 6 | 4031 | #RandomForest, non parametric modeling
#agconti
import numpy as np
import csv as csv
from sklearn.ensemble import RandomForestClassifier
train_data=[] # Create a bin to hold our training data.
test_data=[] # Create a bin to hold our test data.
# Read in CSVs, train and test
with open('train.csv', 'rb') as f1:
    train_reader = csv.reader(f1)
    header = train_reader.next()        # Skip the header row
    for row in train_reader:            # Step through each row in the csv file
        train_data.append(row)          # Add each row to the data variable
train_data = np.array(train_data)       # Then convert from a list to a NumPy array
with open('test.csv', 'rb') as f2: # Load in the test csv file
    f2.next() # Skip the first line because it is a header
for row in csv.reader(f2): # Skip through each row in the csv file
test_data.append(row) # Add each row to the data variable
test_data = np.array(test_data) # Then convert from a list to an array
# Convert strings to numbers so we can perform computational analysis
# The gender classifier in column 3: Male = 1, female = 0:
train_data[train_data[0::,3] == 'male', 3] = 1
train_data[train_data[0::,3] == 'female', 3] = 0
# Embark C = 0, S = 1, Q = 2
train_data[train_data[0::,10] == 'C', 10] = 0
train_data[train_data[0::,10] == 'S', 10] = 1
train_data[train_data[0::,10] == 'Q', 10] = 2
# Transfer Null observations
# So where there is no price, I will assume price on median of that class
# Where there is no age I will give median of all ages
# All the ages with no data make the median of the data
train_data[train_data[0::,4] == '',4] = np.median(train_data[train_data[0::,4]\
!= '',4].astype(np.float))
# All missing embarks just make them embark from most common place
train_data[train_data[0::,10] == '',10] = np.round(np.mean(train_data[train_data[0::,10]\
!= '',10].astype(np.float)))
train_data = np.delete(train_data,[2,7,9],1) #remove the name data, cabin and ticket
# I need to do the same with the test data now so that the columns are in the same
# as the training data
# I need to convert all strings to integer classifiers:
# male = 1, female = 0:
test_data[test_data[0::,2] == 'male',2] = 1
test_data[test_data[0::,2] == 'female',2] = 0
# Embark C = 0, S = 1, Q = 2
test_data[test_data[0::,9] == 'C',9] = 0
test_data[test_data[0::,9] == 'S',9] = 1
test_data[test_data[0::,9] =='Q',9] = 2
# All the ages with no data make the median of the data
test_data[test_data[0::,3] == '',3] = np.median(test_data[test_data[0::,3]\
!= '',3].astype(np.float))
# All missing embarks just make them embark from most common place
test_data[test_data[0::,9] == '',9] = np.round(np.median(test_data[test_data[0::,9]\
!= '',9].astype(np.float)))
# All the missing prices assume median of their respective class
for i in xrange(np.size(test_data[0::,0])):
if test_data[i,7] == '':
test_data[i,7] = np.median(test_data[(test_data[0::,7] != '') &\
(test_data[0::,0] == test_data[i,0])\
,7].astype(np.float))
test_data = np.delete(test_data,[1,6,8],1) # Remove the name data, cabin and ticket
# The data is now ready to go. So lets train then test!
print 'Training '
forest = RandomForestClassifier(n_estimators = 1000)
forest = forest.fit(train_data[0::,1::],\
train_data[0::,0])
print 'Predicting'
output = forest.predict(test_data) #predict results using our CLEANED data
# Write Results to fie
# open csv
seedling=open("agcfirstforest.csv", "wb")
test=open('test.csv', 'rb')
forest_Csv = csv.writer(seedling)
test_file_object = csv.reader(test)
test_file_object.next() # Header control
i = 0
for row in test_file_object:
row.insert(0,output[i].astype(np.uint8))
forest_Csv.writerow(row)
i += 1
test.close()
seedling.close()
print "Analysis has Finished"
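The preprocessing above fills missing ages and fares with the median of the known values for the relevant group. The same imputation can be sketched in plain Python (a simplified illustration of the numpy-based cleaning):

```python
def impute_median(values):
    # Replace empty strings with the median of the known numeric entries.
    known = sorted(float(v) for v in values if v != '')
    n = len(known)
    if n % 2:
        median = known[n // 2]
    else:
        median = (known[n // 2 - 1] + known[n // 2]) / 2.0
    return [float(v) if v != '' else median for v in values]

print(impute_median(['22', '', '38', '26', '']))  # [22.0, 26.0, 38.0, 26.0, 26.0]
```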
| apache-2.0 |
GeoMop/Intersections | src/bspline_plot.py | 1 | 7295 | """
Functions to plot Bspline curves and surfaces.
"""
plot_lib = "plotly"
import plotly.offline as pl
import plotly.graph_objs as go
import matplotlib.pyplot as plt
from matplotlib import cm
from mpl_toolkits.mplot3d import Axes3D
import numpy as np
class PlottingPlotly:
def __init__(self):
self.i_figure = -1
self._reinit()
def _reinit(self):
self.i_figure += 1
self.data_3d = []
self.data_2d = []
def add_curve_2d(self, X, Y, **kwargs):
self.data_2d.append( go.Scatter(x=X, y=Y, mode = 'lines') )
def add_points_2d(self, X, Y, **kwargs):
marker = dict(
size=10,
color='red',
)
self.data_2d.append( go.Scatter(x=X, y=Y,
mode = 'markers',
marker=marker) )
def add_surface_3d(self, X, Y, Z, **kwargs):
hue = (120.0*(len(self.data_3d)))%360
colorscale = [[0.0, 'hsv({}, 50%, 10%)'.format(hue)], [1.0, 'hsv({}, 50%, 90%)'.format(hue)]]
self.data_3d.append( go.Surface(x=X, y=Y, z=Z, colorscale=colorscale))
def add_points_3d(self, X, Y, Z, **kwargs):
marker = dict(
size=5,
color='red',
# line=dict(
# color='rgba(217, 217, 217, 0.14)',
# width=0.5
# ),
opacity=0.6
)
self.data_3d.append( go.Scatter3d(
x=X, y=Y, z=Z,
mode='markers',
marker=marker
))
def show(self):
"""
Show added plots and clear the list for other plotting.
:return:
"""
if self.data_3d:
fig_3d = go.Figure(data=self.data_3d)
pl.plot(fig_3d, filename='bc_plot_3d_%d.html'%(self.i_figure))
if self.data_2d:
fig_2d = go.Figure(data=self.data_2d)
pl.plot(fig_2d, filename='bc_plot_2d_%d.html'%(self.i_figure))
self._reinit()
class PlottingMatplot:
def __init__(self):
self.fig_2d = plt.figure(1)
self.fig_3d = plt.figure(2)
self.ax_3d = self.fig_3d.gca(projection='3d')
def add_curve_2d(self, X, Y, **kwargs):
plt.figure(1)
plt.plot(X, Y, **kwargs)
def add_points_2d(self, X, Y, **kwargs):
plt.figure(1)
plt.plot(X, Y, 'bo', color='red', **kwargs)
def add_surface_3d(self, X, Y, Z, **kwargs):
plt.figure(2)
self.ax_3d.plot_surface(X, Y, Z, **kwargs)
def add_points_3d(self, X, Y, Z, **kwargs):
plt.figure(2)
return self.ax_3d.scatter(X, Y, Z, color='red', **kwargs)
def show(self):
"""
Show added plots and clear the list for other plotting.
:return:
"""
plt.show()
class Plotting:
"""
Debug plotting class. Several 2d and 3d plots can be added and finally displayed on common figure
calling self.show(). Matplotlib or plotly library is used as backend.
"""
def __init__(self, backend = PlottingPlotly()):
self.backend = backend
def plot_2d(self, X, Y):
"""
Add line scatter plot. Every plot use automatically different color.
:param X: x-coords of points
:param Y: y-coords of points
"""
self.backend.add_curve_2d(X,Y)
def scatter_2d(self, X, Y):
"""
Add point scatter plot. Every plot use automatically different color.
:param X: x-coords of points
:param Y: y-coords of points
"""
self.backend.add_points_2d(X,Y)
def plot_surface(self, X, Y, Z):
"""
Add line scatter plot. Every plot use automatically different color.
:param X: x-coords of points
:param Y: y-coords of points
"""
self.backend.add_surface_3d(X, Y, Z)
def plot_curve_2d(self, curve, n_points=100, poles=False):
"""
Add plot of a 2d Bspline curve.
:param curve: Curve t -> x,y
:param n_points: Number of evaluated points.
        :param poles: If True, also plot the control poles of the curve.
"""
basis = curve.basis
t_coord = np.linspace(basis.domain[0], basis.domain[1], n_points)
coords = [curve.eval(t) for t in t_coord]
x_coord, y_coord = zip(*coords)
self.backend.add_curve_2d(x_coord, y_coord)
if poles:
self.plot_curve_poles_2d(curve)
def plot_curve_poles_2d(self, curve):
"""
Plot poles of the B-spline curve.
:param curve: Curve t -> x,y
:return: Plot object.
"""
x_poles, y_poles = curve.poles.T[0:2, :] # remove weights
return self.backend.add_points_2d(x_poles, y_poles)
def scatter_3d(self, X, Y, Z):
"""
Add a 3d point scatter plot. Each plot automatically uses a different color.
:param X: x-coords of points
:param Y: y-coords of points
:param Z: z-coords of points
"""
self.backend.add_points_3d(X, Y, Z)
def plot_surface_3d(self, surface, n_points=(100, 100), poles=False):
"""
Plot a surface in 3d.
Usage:
plotting=Plotting()
plotting.plot_surface_3d(surf_1)
plotting.plot_surface_3d(surf_2)
plotting.show()
:param surface: Parametric surface in 3d.
:param n_points: (nu, nv); nu*nv is the number of evaluation points.
:param poles: If True, also plot the surface poles.
"""
u_basis, v_basis = surface.u_basis, surface.v_basis
u_coord = np.linspace(u_basis.domain[0], u_basis.domain[1], n_points[0])
v_coord = np.linspace(v_basis.domain[0], v_basis.domain[1], n_points[1])
U, V = np.meshgrid(u_coord, v_coord)
points = np.stack( [U.ravel(), V.ravel()], axis = 1 )
xyz = surface.eval_array(points)
X, Y, Z = xyz.T
X = X.reshape(U.shape)
Y = Y.reshape(U.shape)
Z = Z.reshape(U.shape)
# Plot the surface.
self.backend.add_surface_3d(X, Y, Z)
if poles:
self.plot_surface_poles_3d(surface)
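The grid-evaluation pattern used by `plot_surface_3d` (build a UV mesh, flatten it to an (N, 2) parameter array, evaluate, then reshape back into grids) can be sketched with a plain NumPy function standing in for the parametric surface; `eval_array` below is a hypothetical stand-in, not the real surface class:

```python
import numpy as np

def eval_array(uv):
    # Hypothetical stand-in for surface.eval_array: maps (u, v) -> (x, y, z).
    u, v = uv[:, 0], uv[:, 1]
    return np.stack([u, v, u * v], axis=1)

nu, nv = 4, 3
u_coord = np.linspace(0.0, 1.0, nu)
v_coord = np.linspace(0.0, 1.0, nv)
U, V = np.meshgrid(u_coord, v_coord)               # both have shape (nv, nu)
points = np.stack([U.ravel(), V.ravel()], axis=1)  # shape (nu * nv, 2)
xyz = eval_array(points)                           # shape (nu * nv, 3)
X, Y, Z = (c.reshape(U.shape) for c in xyz.T)      # back to (nv, nu) grids
```

The reshape back to `U.shape` is what lets the backend's `plot_surface` treat the flat evaluation results as a structured grid again.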
def plot_grid_surface_3d(self, surface, n_points=(100, 100)):
"""
Plot a surface in 3d, on UV plane.
:param surface: Parametric surface in 3d.
:param n_points: (nu, nv); nu*nv is the number of evaluation points.
"""
u_coord = np.linspace(0, 1.0, n_points[0])
v_coord = np.linspace(0, 1.0, n_points[1])
U, V = np.meshgrid(u_coord, v_coord)
points = np.stack( [U.ravel(), V.ravel()], axis = 1 )
xyz = surface.eval_array(points)
X, Y, Z = xyz.T
Z = Z.reshape(U.shape)
# Plot the surface.
self.backend.add_surface_3d(U, V, Z)
def plot_surface_poles_3d(self, surface, **kwargs):
"""
Plot poles of the B-spline surface.
:param surface: Parametric surface in 3d.
:param kwargs: Additional parameters passed to the backend plot command.
:return: Plot object.
"""
x_poles, y_poles, z_poles = surface.poles[:, :, 0:3].reshape(-1, 3).T # remove weights and flatten nu, nv
return self.backend.add_points_3d(x_poles, y_poles, z_poles, **kwargs)
def show(self):
"""
Display added plots. Empty the queue.
:return:
"""
self.backend.show()
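A minimal sketch of how the backend indirection above can be exercised without a display, using an illustrative recording backend in place of `PlottingPlotly`; all names in this sketch are assumptions for demonstration, not part of the real module:

```python
class RecordingBackend:
    """Minimal stand-in backend that records calls instead of drawing."""
    def __init__(self):
        self.calls = []

    def add_curve_2d(self, X, Y, **kwargs):
        self.calls.append(("curve_2d", list(X), list(Y)))

    def add_points_2d(self, X, Y, **kwargs):
        self.calls.append(("points_2d", list(X), list(Y)))

    def show(self):
        self.calls.append(("show",))

class Plotting:
    # Same shape as the class above, reduced to the 2D methods for the sketch.
    def __init__(self, backend=None):
        self.backend = backend if backend is not None else RecordingBackend()

    def plot_2d(self, X, Y):
        self.backend.add_curve_2d(X, Y)

    def scatter_2d(self, X, Y):
        self.backend.add_points_2d(X, Y)

    def show(self):
        self.backend.show()

plotting = Plotting()
plotting.plot_2d([0, 1, 2], [0, 1, 4])
plotting.scatter_2d([0, 1], [0, 1])
plotting.show()
```

Because the facade only forwards to the backend interface, swapping Matplotlib for Plotly (or a test double, as here) requires no change to calling code.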
| gpl-3.0 |
MJuddBooth/pandas | pandas/tests/frame/test_asof.py | 2 | 4640 | # coding=utf-8
import numpy as np
import pytest
from pandas import DataFrame, Series, Timestamp, date_range, to_datetime
import pandas.util.testing as tm
from .common import TestData
class TestFrameAsof(TestData):
def setup_method(self, method):
self.N = N = 50
self.rng = date_range('1/1/1990', periods=N, freq='53s')
self.df = DataFrame({'A': np.arange(N), 'B': np.arange(N)},
index=self.rng)
def test_basic(self):
df = self.df.copy()
df.loc[15:30, 'A'] = np.nan
dates = date_range('1/1/1990', periods=self.N * 3,
freq='25s')
result = df.asof(dates)
assert result.notna().all(1).all()
lb = df.index[14]
ub = df.index[30]
dates = list(dates)
result = df.asof(dates)
assert result.notna().all(1).all()
mask = (result.index >= lb) & (result.index < ub)
rs = result[mask]
assert (rs == 14).all(1).all()
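The behaviour exercised by `test_basic` — `asof` falls back to the last fully non-NaN row at or before each requested timestamp — can be seen on a tiny frame (assuming pandas and numpy are importable):

```python
import numpy as np
import pandas as pd

rng = pd.date_range("1/1/1990", periods=5, freq="53s")
df = pd.DataFrame({"A": np.arange(5.0)}, index=rng)
df.loc[rng[2], "A"] = np.nan   # knock out the row at index 2

# asof skips NaN rows and returns the last valid row at or before the stamp,
# so asking at rng[2] falls back to the row at rng[1].
row = df.asof(rng[2])
```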
def test_subset(self):
N = 10
rng = date_range('1/1/1990', periods=N, freq='53s')
df = DataFrame({'A': np.arange(N), 'B': np.arange(N)},
index=rng)
df.loc[4:8, 'A'] = np.nan
dates = date_range('1/1/1990', periods=N * 3,
freq='25s')
# with a subset of A should be the same
result = df.asof(dates, subset='A')
expected = df.asof(dates)
tm.assert_frame_equal(result, expected)
# same with A/B
result = df.asof(dates, subset=['A', 'B'])
expected = df.asof(dates)
tm.assert_frame_equal(result, expected)
# B gives self.df.asof
result = df.asof(dates, subset='B')
expected = df.resample('25s', closed='right').ffill().reindex(dates)
expected.iloc[20:] = 9
tm.assert_frame_equal(result, expected)
def test_missing(self):
# GH 15118
# no match found - `where` value before earliest date in index
N = 10
rng = date_range('1/1/1990', periods=N, freq='53s')
df = DataFrame({'A': np.arange(N), 'B': np.arange(N)},
index=rng)
result = df.asof('1989-12-31')
expected = Series(index=['A', 'B'], name=Timestamp('1989-12-31'))
tm.assert_series_equal(result, expected)
result = df.asof(to_datetime(['1989-12-31']))
expected = DataFrame(index=to_datetime(['1989-12-31']),
columns=['A', 'B'], dtype='float64')
tm.assert_frame_equal(result, expected)
def test_all_nans(self):
# GH 15713
# DataFrame is all nans
result = DataFrame([np.nan]).asof([0])
expected = DataFrame([np.nan])
tm.assert_frame_equal(result, expected)
# testing non-default indexes, multiple inputs
dates = date_range('1/1/1990', periods=self.N * 3, freq='25s')
result = DataFrame(np.nan, index=self.rng, columns=['A']).asof(dates)
expected = DataFrame(np.nan, index=dates, columns=['A'])
tm.assert_frame_equal(result, expected)
# testing multiple columns
dates = date_range('1/1/1990', periods=self.N * 3, freq='25s')
result = DataFrame(np.nan, index=self.rng,
columns=['A', 'B', 'C']).asof(dates)
expected = DataFrame(np.nan, index=dates, columns=['A', 'B', 'C'])
tm.assert_frame_equal(result, expected)
# testing scalar input
result = DataFrame(np.nan, index=[1, 2], columns=['A', 'B']).asof([3])
expected = DataFrame(np.nan, index=[3], columns=['A', 'B'])
tm.assert_frame_equal(result, expected)
result = DataFrame(np.nan, index=[1, 2], columns=['A', 'B']).asof(3)
expected = Series(np.nan, index=['A', 'B'], name=3)
tm.assert_series_equal(result, expected)
@pytest.mark.parametrize(
"stamp,expected",
[(Timestamp('2018-01-01 23:22:43.325+00:00'),
Series(2.0, name=Timestamp('2018-01-01 23:22:43.325+00:00'))),
(Timestamp('2018-01-01 22:33:20.682+01:00'),
Series(1.0, name=Timestamp('2018-01-01 22:33:20.682+01:00'))),
]
)
def test_time_zone_aware_index(self, stamp, expected):
# GH21194
# Testing awareness of DataFrame index considering different
# UTC and timezone
df = DataFrame(data=[1, 2],
index=[Timestamp('2018-01-01 21:00:05.001+00:00'),
Timestamp('2018-01-01 22:35:10.550+00:00')])
result = df.asof(stamp)
tm.assert_series_equal(result, expected)
| bsd-3-clause |
pgandhi999/spark | python/pyspark/serializers.py | 5 | 30967 | #
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""
PySpark supports custom serializers for transferring data; this can improve
performance.
By default, PySpark uses :class:`PickleSerializer` to serialize objects using Python's
`cPickle` serializer, which can serialize nearly any Python object.
Other serializers, like :class:`MarshalSerializer`, support fewer datatypes but can be
faster.
The serializer is chosen when creating :class:`SparkContext`:
>>> from pyspark.context import SparkContext
>>> from pyspark.serializers import MarshalSerializer
>>> sc = SparkContext('local', 'test', serializer=MarshalSerializer())
>>> sc.parallelize(list(range(1000))).map(lambda x: 2 * x).take(10)
[0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
>>> sc.stop()
PySpark serializes objects in batches; by default, the batch size is chosen based
on the size of objects and is also configurable by SparkContext's `batchSize`
parameter:
>>> sc = SparkContext('local', 'test', batchSize=2)
>>> rdd = sc.parallelize(range(16), 4).map(lambda x: x)
Behind the scenes, this creates a JavaRDD with four partitions, each of
which contains two batches of two objects:
>>> rdd.glom().collect()
[[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11], [12, 13, 14, 15]]
>>> int(rdd._jrdd.count())
8
>>> sc.stop()
"""
import sys
from itertools import chain, product
import marshal
import struct
import types
import collections
import zlib
import itertools
if sys.version < '3':
import cPickle as pickle
from itertools import izip as zip, imap as map
else:
import pickle
basestring = unicode = str
xrange = range
pickle_protocol = pickle.HIGHEST_PROTOCOL
from pyspark import cloudpickle
from pyspark.util import _exception_message
__all__ = ["PickleSerializer", "MarshalSerializer", "UTF8Deserializer"]
class SpecialLengths(object):
END_OF_DATA_SECTION = -1
PYTHON_EXCEPTION_THROWN = -2
TIMING_DATA = -3
END_OF_STREAM = -4
NULL = -5
START_ARROW_STREAM = -6
class Serializer(object):
def dump_stream(self, iterator, stream):
"""
Serialize an iterator of objects to the output stream.
"""
raise NotImplementedError
def load_stream(self, stream):
"""
Return an iterator of deserialized objects from the input stream.
"""
raise NotImplementedError
def _load_stream_without_unbatching(self, stream):
"""
Return an iterator of deserialized batches (iterable) of objects from the input stream.
If the serializer does not operate on batches the default implementation returns an
iterator of single element lists.
"""
return map(lambda x: [x], self.load_stream(stream))
# Note: our notion of "equality" is that output generated by
# equal serializers can be deserialized using the same serializer.
# This default implementation handles the simple cases;
# subclasses should override __eq__ as appropriate.
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not self.__eq__(other)
def __repr__(self):
return "%s()" % self.__class__.__name__
def __hash__(self):
return hash(str(self))
class FramedSerializer(Serializer):
"""
Serializer that writes objects as a stream of (length, data) pairs,
where `length` is a 32-bit integer and data is `length` bytes.
"""
def __init__(self):
# On Python 2.6, we can't write bytearrays to streams, so we need to convert them
# to strings first. Check if the version number is that old.
self._only_write_strings = sys.version_info[0:2] <= (2, 6)
def dump_stream(self, iterator, stream):
for obj in iterator:
self._write_with_length(obj, stream)
def load_stream(self, stream):
while True:
try:
yield self._read_with_length(stream)
except EOFError:
return
def _write_with_length(self, obj, stream):
serialized = self.dumps(obj)
if serialized is None:
raise ValueError("serialized value should not be None")
if len(serialized) > (1 << 31):
raise ValueError("can not serialize object larger than 2G")
write_int(len(serialized), stream)
if self._only_write_strings:
stream.write(str(serialized))
else:
stream.write(serialized)
def _read_with_length(self, stream):
length = read_int(stream)
if length == SpecialLengths.END_OF_DATA_SECTION:
raise EOFError
elif length == SpecialLengths.NULL:
return None
obj = stream.read(length)
if len(obj) < length:
raise EOFError
return self.loads(obj)
def dumps(self, obj):
"""
Serialize an object into a byte array.
When batching is used, this will be called with an array of objects.
"""
raise NotImplementedError
def loads(self, obj):
"""
Deserialize an object from a byte array.
"""
raise NotImplementedError
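The (length, data) wire format that `FramedSerializer` reads and writes can be reproduced with nothing but `struct` and an in-memory stream; this is a sketch of the framing only, not of the real class:

```python
import io
import struct

def write_frame(payload, stream):
    # 32-bit big-endian length prefix, then the raw bytes.
    stream.write(struct.pack("!i", len(payload)))
    stream.write(payload)

def read_frame(stream):
    header = stream.read(4)
    if len(header) < 4:
        raise EOFError
    (length,) = struct.unpack("!i", header)
    payload = stream.read(length)
    if len(payload) < length:
        raise EOFError
    return payload

buf = io.BytesIO()
for chunk in (b"spark", b"py"):
    write_frame(chunk, buf)
buf.seek(0)
frames = [read_frame(buf), read_frame(buf)]
```

The real class layers negative sentinel lengths (`SpecialLengths`) on top of this framing to signal end-of-data, nulls, and errors in the same channel.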
class ArrowCollectSerializer(Serializer):
"""
Deserialize a stream of batches followed by batch order information. Used in
DataFrame._collectAsArrow() after invoking Dataset.collectAsArrowToPython() in the JVM.
"""
def __init__(self):
self.serializer = ArrowStreamSerializer()
def dump_stream(self, iterator, stream):
return self.serializer.dump_stream(iterator, stream)
def load_stream(self, stream):
"""
Load a stream of un-ordered Arrow RecordBatches, where the last iteration yields
a list of indices that can be used to put the RecordBatches in the correct order.
"""
# load the batches
for batch in self.serializer.load_stream(stream):
yield batch
# load the batch order indices or propagate any error that occurred in the JVM
num = read_int(stream)
if num == -1:
error_msg = UTF8Deserializer().loads(stream)
raise RuntimeError("An error occurred while calling "
"ArrowCollectSerializer.load_stream: {}".format(error_msg))
batch_order = []
for i in xrange(num):
index = read_int(stream)
batch_order.append(index)
yield batch_order
def __repr__(self):
return "ArrowCollectSerializer(%s)" % self.serializer
class ArrowStreamSerializer(Serializer):
"""
Serializes Arrow record batches as a stream.
"""
def dump_stream(self, iterator, stream):
import pyarrow as pa
writer = None
try:
for batch in iterator:
if writer is None:
writer = pa.RecordBatchStreamWriter(stream, batch.schema)
writer.write_batch(batch)
finally:
if writer is not None:
writer.close()
def load_stream(self, stream):
import pyarrow as pa
reader = pa.ipc.open_stream(stream)
for batch in reader:
yield batch
def __repr__(self):
return "ArrowStreamSerializer"
class ArrowStreamPandasSerializer(ArrowStreamSerializer):
"""
Serializes Pandas.Series as Arrow data with Arrow streaming format.
:param timezone: A timezone to respect when handling timestamp values
:param safecheck: If True, conversion from Arrow to Pandas checks for overflow/truncation
:param assign_cols_by_name: If True, then Pandas DataFrames will get columns by name
"""
def __init__(self, timezone, safecheck, assign_cols_by_name):
super(ArrowStreamPandasSerializer, self).__init__()
self._timezone = timezone
self._safecheck = safecheck
self._assign_cols_by_name = assign_cols_by_name
def arrow_to_pandas(self, arrow_column):
from pyspark.sql.types import _check_series_localize_timestamps
# If the given column is a date type column, creates a series of datetime.date directly
# instead of creating datetime64[ns] as intermediate data to avoid overflow caused by
# datetime64[ns] type handling.
s = arrow_column.to_pandas(date_as_object=True)
s = _check_series_localize_timestamps(s, self._timezone)
return s
def _create_batch(self, series):
"""
Create an Arrow record batch from the given pandas.Series or list of Series,
with optional type.
:param series: A single pandas.Series, list of Series, or list of (series, arrow_type)
:return: Arrow RecordBatch
"""
import pandas as pd
import pyarrow as pa
from pyspark.sql.types import _check_series_convert_timestamps_internal
# Make input conform to [(series1, type1), (series2, type2), ...]
if not isinstance(series, (list, tuple)) or \
(len(series) == 2 and isinstance(series[1], pa.DataType)):
series = [series]
series = ((s, None) if not isinstance(s, (list, tuple)) else s for s in series)
def create_array(s, t):
mask = s.isnull()
# Ensure timestamp series are in expected form for Spark internal representation
if t is not None and pa.types.is_timestamp(t):
s = _check_series_convert_timestamps_internal(s, self._timezone)
try:
array = pa.Array.from_pandas(s, mask=mask, type=t, safe=self._safecheck)
except pa.ArrowException as e:
error_msg = "Exception thrown when converting pandas.Series (%s) to Arrow " + \
"Array (%s). It can be caused by overflows or other unsafe " + \
"conversions warned by Arrow. Arrow safe type check can be " + \
"disabled by using SQL config " + \
"`spark.sql.execution.pandas.arrowSafeTypeConversion`."
raise RuntimeError(error_msg % (s.dtype, t), e)
return array
arrs = []
for s, t in series:
if t is not None and pa.types.is_struct(t):
if not isinstance(s, pd.DataFrame):
raise ValueError("A field of type StructType expects a pandas.DataFrame, "
"but got: %s" % str(type(s)))
# Input partition and result pandas.DataFrame empty, make empty Arrays with struct
if len(s) == 0 and len(s.columns) == 0:
arrs_names = [(pa.array([], type=field.type), field.name) for field in t]
# Assign result columns by schema name if user labeled with strings
elif self._assign_cols_by_name and any(isinstance(name, basestring)
for name in s.columns):
arrs_names = [(create_array(s[field.name], field.type), field.name)
for field in t]
# Assign result columns by position
else:
arrs_names = [(create_array(s[s.columns[i]], field.type), field.name)
for i, field in enumerate(t)]
struct_arrs, struct_names = zip(*arrs_names)
arrs.append(pa.StructArray.from_arrays(struct_arrs, struct_names))
else:
arrs.append(create_array(s, t))
return pa.RecordBatch.from_arrays(arrs, ["_%d" % i for i in xrange(len(arrs))])
def dump_stream(self, iterator, stream):
"""
Make ArrowRecordBatches from Pandas Series and serialize. Input is a single series or
a list of series accompanied by an optional pyarrow type to coerce the data to.
"""
batches = (self._create_batch(series) for series in iterator)
super(ArrowStreamPandasSerializer, self).dump_stream(batches, stream)
def load_stream(self, stream):
"""
Deserialize ArrowRecordBatches to an Arrow table and return as a list of pandas.Series.
"""
batches = super(ArrowStreamPandasSerializer, self).load_stream(stream)
import pyarrow as pa
for batch in batches:
yield [self.arrow_to_pandas(c) for c in pa.Table.from_batches([batch]).itercolumns()]
def __repr__(self):
return "ArrowStreamPandasSerializer"
class ArrowStreamPandasUDFSerializer(ArrowStreamPandasSerializer):
"""
Serializer used by Python worker to evaluate Pandas UDFs
"""
def __init__(self, timezone, safecheck, assign_cols_by_name, df_for_struct=False):
super(ArrowStreamPandasUDFSerializer, self) \
.__init__(timezone, safecheck, assign_cols_by_name)
self._df_for_struct = df_for_struct
def arrow_to_pandas(self, arrow_column):
import pyarrow.types as types
if self._df_for_struct and types.is_struct(arrow_column.type):
import pandas as pd
series = [super(ArrowStreamPandasUDFSerializer, self).arrow_to_pandas(column)
.rename(field.name)
for column, field in zip(arrow_column.flatten(), arrow_column.type)]
s = pd.concat(series, axis=1)
else:
s = super(ArrowStreamPandasUDFSerializer, self).arrow_to_pandas(arrow_column)
return s
def dump_stream(self, iterator, stream):
"""
Override because Pandas UDFs require a START_ARROW_STREAM before the Arrow stream is sent.
This should be sent after creating the first record batch so in case of an error, it can
be sent back to the JVM before the Arrow stream starts.
"""
def init_stream_yield_batches():
should_write_start_length = True
for series in iterator:
batch = self._create_batch(series)
if should_write_start_length:
write_int(SpecialLengths.START_ARROW_STREAM, stream)
should_write_start_length = False
yield batch
return ArrowStreamSerializer.dump_stream(self, init_stream_yield_batches(), stream)
def __repr__(self):
return "ArrowStreamPandasUDFSerializer"
class BatchedSerializer(Serializer):
"""
Serializes a stream of objects in batches by calling its wrapped
Serializer with streams of objects.
"""
UNLIMITED_BATCH_SIZE = -1
UNKNOWN_BATCH_SIZE = 0
def __init__(self, serializer, batchSize=UNLIMITED_BATCH_SIZE):
self.serializer = serializer
self.batchSize = batchSize
def _batched(self, iterator):
if self.batchSize == self.UNLIMITED_BATCH_SIZE:
yield list(iterator)
elif hasattr(iterator, "__len__") and hasattr(iterator, "__getslice__"):
n = len(iterator)
for i in xrange(0, n, self.batchSize):
yield iterator[i: i + self.batchSize]
else:
items = []
count = 0
for item in iterator:
items.append(item)
count += 1
if count == self.batchSize:
yield items
items = []
count = 0
if items:
yield items
def dump_stream(self, iterator, stream):
self.serializer.dump_stream(self._batched(iterator), stream)
def load_stream(self, stream):
return chain.from_iterable(self._load_stream_without_unbatching(stream))
def _load_stream_without_unbatching(self, stream):
return self.serializer.load_stream(stream)
def __repr__(self):
return "BatchedSerializer(%s, %d)" % (str(self.serializer), self.batchSize)
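The fallback branch of `_batched` amounts to a generic chunking generator; an equivalent standalone sketch using `itertools.islice`:

```python
from itertools import islice

def batched(iterable, batch_size):
    # Yield fixed-size lists from any iterable; the last batch may be shorter.
    it = iter(iterable)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

batches = list(batched(range(7), 3))
```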
class FlattenedValuesSerializer(BatchedSerializer):
"""
Serializes a stream of (key, values) pairs, splitting value lists that
contain more than a certain number of objects so that the resulting
batches have similar sizes.
"""
def __init__(self, serializer, batchSize=10):
BatchedSerializer.__init__(self, serializer, batchSize)
def _batched(self, iterator):
n = self.batchSize
for key, values in iterator:
for i in range(0, len(values), n):
yield key, values[i:i + n]
def load_stream(self, stream):
return self.serializer.load_stream(stream)
def __repr__(self):
return "FlattenedValuesSerializer(%s, %d)" % (self.serializer, self.batchSize)
class AutoBatchedSerializer(BatchedSerializer):
"""
Chooses the batch size automatically based on the size of the serialized objects
"""
def __init__(self, serializer, bestSize=1 << 16):
BatchedSerializer.__init__(self, serializer, self.UNKNOWN_BATCH_SIZE)
self.bestSize = bestSize
def dump_stream(self, iterator, stream):
batch, best = 1, self.bestSize
iterator = iter(iterator)
while True:
vs = list(itertools.islice(iterator, batch))
if not vs:
break
bytes = self.serializer.dumps(vs)
write_int(len(bytes), stream)
stream.write(bytes)
size = len(bytes)
if size < best:
batch *= 2
elif size > best * 10 and batch > 1:
batch //= 2
def __repr__(self):
return "AutoBatchedSerializer(%s)" % self.serializer
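The doubling/halving heuristic inside `dump_stream` can be isolated into a pure function — a sketch of the rule only, with the serialized size passed in directly:

```python
def next_batch_size(batch, size, best):
    # Mirrors AutoBatchedSerializer.dump_stream: double the batch while the
    # serialized size is under the target, halve it (never below 1) when it
    # overshoots the target by more than 10x; otherwise keep it unchanged.
    if size < best:
        return batch * 2
    elif size > best * 10 and batch > 1:
        return batch // 2
    return batch
```

The asymmetry (grow under the target, shrink only on a 10x overshoot) keeps the batch size stable once it lands near the `bestSize` of 64 KiB.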
class CartesianDeserializer(Serializer):
"""
Deserializes the JavaRDD cartesian() of two PythonRDDs.
Due to pyspark batching we cannot simply use the result of the Java RDD
cartesian; we additionally need to do the cartesian within each pair of batches.
"""
def __init__(self, key_ser, val_ser):
self.key_ser = key_ser
self.val_ser = val_ser
def _load_stream_without_unbatching(self, stream):
key_batch_stream = self.key_ser._load_stream_without_unbatching(stream)
val_batch_stream = self.val_ser._load_stream_without_unbatching(stream)
for (key_batch, val_batch) in zip(key_batch_stream, val_batch_stream):
# for correctness with repeated cartesian/zip this must be returned as one batch
yield product(key_batch, val_batch)
def load_stream(self, stream):
return chain.from_iterable(self._load_stream_without_unbatching(stream))
def __repr__(self):
return "CartesianDeserializer(%s, %s)" % \
(str(self.key_ser), str(self.val_ser))
class PairDeserializer(Serializer):
"""
Deserializes the JavaRDD zip() of two PythonRDDs.
Due to pyspark batching we cannot simply use the result of the Java RDD
zip; we additionally need to do the zip within each pair of batches.
"""
def __init__(self, key_ser, val_ser):
self.key_ser = key_ser
self.val_ser = val_ser
def _load_stream_without_unbatching(self, stream):
key_batch_stream = self.key_ser._load_stream_without_unbatching(stream)
val_batch_stream = self.val_ser._load_stream_without_unbatching(stream)
for (key_batch, val_batch) in zip(key_batch_stream, val_batch_stream):
# For double-zipped RDDs, the batches can be iterators from other PairDeserializer,
# instead of lists. We need to convert them to lists if needed.
key_batch = key_batch if hasattr(key_batch, '__len__') else list(key_batch)
val_batch = val_batch if hasattr(val_batch, '__len__') else list(val_batch)
if len(key_batch) != len(val_batch):
raise ValueError("Can not deserialize PairRDD with different number of items"
" in batches: (%d, %d)" % (len(key_batch), len(val_batch)))
# for correctness with repeated cartesian/zip this must be returned as one batch
yield zip(key_batch, val_batch)
def load_stream(self, stream):
return chain.from_iterable(self._load_stream_without_unbatching(stream))
def __repr__(self):
return "PairDeserializer(%s, %s)" % (str(self.key_ser), str(self.val_ser))
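The reason the zip happens per pair of batches — rather than over the flattened streams — is that batch boundaries on the key and value sides correspond one-to-one; a standalone sketch of the unbatching:

```python
from itertools import chain

def zipped_batches(key_batches, val_batches):
    # Pair up corresponding batches, checking that each pair has equal length.
    for kb, vb in zip(key_batches, val_batches):
        if len(kb) != len(vb):
            raise ValueError(
                "mismatched batch sizes: (%d, %d)" % (len(kb), len(vb)))
        yield zip(kb, vb)

key_batches = [[1, 2], [3]]
val_batches = [["a", "b"], ["c"]]
pairs = list(chain.from_iterable(zipped_batches(key_batches, val_batches)))
```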
class NoOpSerializer(FramedSerializer):
def loads(self, obj):
return obj
def dumps(self, obj):
return obj
# Hack namedtuple, make it picklable
__cls = {}
def _restore(name, fields, value):
""" Restore a namedtuple instance. """
k = (name, fields)
cls = __cls.get(k)
if cls is None:
cls = collections.namedtuple(name, fields)
__cls[k] = cls
return cls(*value)
def _hack_namedtuple(cls):
""" Make class generated by namedtuple picklable """
name = cls.__name__
fields = cls._fields
def __reduce__(self):
return (_restore, (name, fields, tuple(self)))
cls.__reduce__ = __reduce__
cls._is_namedtuple_ = True
return cls
def _hijack_namedtuple():
""" Hack namedtuple() to make it picklable """
# hijack only one time
if hasattr(collections.namedtuple, "__hijack"):
return
global _old_namedtuple # or it will put in closure
global _old_namedtuple_kwdefaults # or it will put in closure too
def _copy_func(f):
return types.FunctionType(f.__code__, f.__globals__, f.__name__,
f.__defaults__, f.__closure__)
def _kwdefaults(f):
# __kwdefaults__ contains the default values of keyword-only arguments which are
# introduced from Python 3. The possible cases for __kwdefaults__ in namedtuple
# are as below:
#
# - Does not exist in Python 2.
# - Returns None in <= Python 3.5.x.
# - Returns a dictionary containing the default values to the keys from Python 3.6.x
# (See https://bugs.python.org/issue25628).
kargs = getattr(f, "__kwdefaults__", None)
if kargs is None:
return {}
else:
return kargs
_old_namedtuple = _copy_func(collections.namedtuple)
_old_namedtuple_kwdefaults = _kwdefaults(collections.namedtuple)
def namedtuple(*args, **kwargs):
for k, v in _old_namedtuple_kwdefaults.items():
kwargs[k] = kwargs.get(k, v)
cls = _old_namedtuple(*args, **kwargs)
return _hack_namedtuple(cls)
# replace namedtuple with the new one
collections.namedtuple.__globals__["_old_namedtuple_kwdefaults"] = _old_namedtuple_kwdefaults
collections.namedtuple.__globals__["_old_namedtuple"] = _old_namedtuple
collections.namedtuple.__globals__["_hack_namedtuple"] = _hack_namedtuple
collections.namedtuple.__code__ = namedtuple.__code__
collections.namedtuple.__hijack = 1
# hack the cls already generated by namedtuple.
# Those created in other modules can be pickled as normal,
# so only hack those in __main__ module
for n, o in sys.modules["__main__"].__dict__.items():
if (type(o) is type and o.__base__ is tuple
and hasattr(o, "_fields")
and "__reduce__" not in o.__dict__):
_hack_namedtuple(o) # hack inplace
_hijack_namedtuple()
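What the hijack buys can be demonstrated in isolation: a namedtuple class is given a `__reduce__` that rebuilds the instance from its name, fields, and values, mirroring `_restore` above. This sketch calls the reduce pair by hand instead of mutating `collections`:

```python
import collections

_cls_cache = {}

def _restore_nt(name, fields, values):
    # Recreate the namedtuple class on the "unpickling" side and fill it.
    key = (name, fields)
    cls = _cls_cache.get(key)
    if cls is None:
        cls = collections.namedtuple(name, fields)
        _cls_cache[key] = cls
    return cls(*values)

Point = collections.namedtuple("Point", ["x", "y"])

def _reduce(self):
    # Same shape as the __reduce__ installed by _hack_namedtuple above.
    return (_restore_nt, ("Point", ("x", "y"), tuple(self)))

Point.__reduce__ = _reduce

p = Point(1, 2)
func, args = p.__reduce__()
clone = func(*args)   # what pickle would do on load
```

Because the reduce pair carries only the name, field names, and values, the class can be reconstructed on a worker even though the class object defined in `__main__` is not importable there.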
class PickleSerializer(FramedSerializer):
"""
Serializes objects using Python's pickle serializer:
http://docs.python.org/2/library/pickle.html
This serializer supports nearly any Python object, but may
not be as fast as more specialized serializers.
"""
def dumps(self, obj):
return pickle.dumps(obj, pickle_protocol)
if sys.version >= '3':
def loads(self, obj, encoding="bytes"):
return pickle.loads(obj, encoding=encoding)
else:
def loads(self, obj, encoding=None):
return pickle.loads(obj)
class CloudPickleSerializer(PickleSerializer):
def dumps(self, obj):
try:
return cloudpickle.dumps(obj, pickle_protocol)
except pickle.PickleError:
raise
except Exception as e:
emsg = _exception_message(e)
if "'i' format requires" in emsg:
msg = "Object too large to serialize: %s" % emsg
else:
msg = "Could not serialize object: %s: %s" % (e.__class__.__name__, emsg)
cloudpickle.print_exec(sys.stderr)
raise pickle.PicklingError(msg)
class MarshalSerializer(FramedSerializer):
"""
Serializes objects using Python's Marshal serializer:
http://docs.python.org/2/library/marshal.html
This serializer is faster than PickleSerializer but supports fewer datatypes.
"""
def dumps(self, obj):
return marshal.dumps(obj)
def loads(self, obj):
return marshal.loads(obj)
class AutoSerializer(FramedSerializer):
"""
Choose marshal or pickle as serialization protocol automatically
"""
def __init__(self):
FramedSerializer.__init__(self)
self._type = None
def dumps(self, obj):
if self._type is not None:
return b'P' + pickle.dumps(obj, -1)
try:
return b'M' + marshal.dumps(obj)
except Exception:
self._type = b'P'
return b'P' + pickle.dumps(obj, -1)
def loads(self, obj):
_type = obj[0]
if _type == b'M':
return marshal.loads(obj[1:])
elif _type == b'P':
return pickle.loads(obj[1:])
else:
raise ValueError("invalid serialization type: %s" % _type)
class CompressedSerializer(FramedSerializer):
"""
Compress the serialized data
"""
def __init__(self, serializer):
FramedSerializer.__init__(self)
assert isinstance(serializer, FramedSerializer), "serializer must be a FramedSerializer"
self.serializer = serializer
def dumps(self, obj):
return zlib.compress(self.serializer.dumps(obj), 1)
def loads(self, obj):
return self.serializer.loads(zlib.decompress(obj))
def __repr__(self):
return "CompressedSerializer(%s)" % self.serializer
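`CompressedSerializer` only wraps the inner serializer's bytes in zlib at compression level 1 (fast, modest ratio); the round trip itself is:

```python
import zlib

payload = b"x" * 1000
compressed = zlib.compress(payload, 1)   # level 1: speed over ratio
restored = zlib.decompress(compressed)
```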
class UTF8Deserializer(Serializer):
"""
Deserializes streams written by String.getBytes.
"""
def __init__(self, use_unicode=True):
self.use_unicode = use_unicode
def loads(self, stream):
length = read_int(stream)
if length == SpecialLengths.END_OF_DATA_SECTION:
raise EOFError
elif length == SpecialLengths.NULL:
return None
s = stream.read(length)
return s.decode("utf-8") if self.use_unicode else s
def load_stream(self, stream):
try:
while True:
yield self.loads(stream)
except struct.error:
return
except EOFError:
return
def __repr__(self):
return "UTF8Deserializer(%s)" % self.use_unicode
def read_long(stream):
length = stream.read(8)
if not length:
raise EOFError
return struct.unpack("!q", length)[0]
def write_long(value, stream):
stream.write(struct.pack("!q", value))
def pack_long(value):
return struct.pack("!q", value)
def read_int(stream):
length = stream.read(4)
if not length:
raise EOFError
return struct.unpack("!i", length)[0]
def write_int(value, stream):
stream.write(struct.pack("!i", value))
def read_bool(stream):
length = stream.read(1)
if not length:
raise EOFError
return struct.unpack("!?", length)[0]
def write_with_length(obj, stream):
write_int(len(obj), stream)
stream.write(obj)
class ChunkedStream(object):
"""
This is a file-like object that takes a stream of data of unknown length and breaks it into
fixed-length frames. The intended use case is serializing large data and sending it immediately over
a socket -- we do not want to buffer the entire data before sending it, but the receiving end
needs to know whether or not there is more data coming.
It works by buffering the incoming data in some fixed-size chunks. If the buffer is full, it
first sends the buffer size, then the data. This repeats as long as there is more data to send.
When this is closed, it sends the length of whatever data is in the buffer, then that data, and
finally a "length" of -1 to indicate the stream has completed.
"""
def __init__(self, wrapped, buffer_size):
self.buffer_size = buffer_size
self.buffer = bytearray(buffer_size)
self.current_pos = 0
self.wrapped = wrapped
def write(self, bytes):
byte_pos = 0
byte_remaining = len(bytes)
while byte_remaining > 0:
new_pos = byte_remaining + self.current_pos
if new_pos < self.buffer_size:
# just put it in our buffer
self.buffer[self.current_pos:new_pos] = bytes[byte_pos:]
self.current_pos = new_pos
byte_remaining = 0
else:
# fill the buffer, send the length then the contents, and start filling again
space_left = self.buffer_size - self.current_pos
new_byte_pos = byte_pos + space_left
self.buffer[self.current_pos:self.buffer_size] = bytes[byte_pos:new_byte_pos]
write_int(self.buffer_size, self.wrapped)
self.wrapped.write(self.buffer)
byte_remaining -= space_left
byte_pos = new_byte_pos
self.current_pos = 0
def close(self):
# if there is anything left in the buffer, write it out first
if self.current_pos > 0:
write_int(self.current_pos, self.wrapped)
self.wrapped.write(self.buffer[:self.current_pos])
# -1 length indicates to the receiving end that we're done.
write_int(-1, self.wrapped)
self.wrapped.close()
@property
def closed(self):
"""
Return True if the `wrapped` object has been closed.
NOTE: this property is required by pyarrow to be used as a file-like object in
pyarrow.RecordBatchStreamWriter from ArrowStreamSerializer
"""
return self.wrapped.closed
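On the receiving side, a consumer of this protocol keeps pulling (length, chunk) pairs until the -1 sentinel. The real receiving end lives in the JVM, so this minimal reader is illustrative only:

```python
import io
import struct

def read_chunked(stream):
    # Concatenate frames until the -1 end-of-stream marker.
    out = bytearray()
    while True:
        (length,) = struct.unpack("!i", stream.read(4))
        if length == -1:
            return bytes(out)
        out.extend(stream.read(length))

# Build a stream by hand: two length-prefixed frames, then the sentinel.
buf = io.BytesIO()
for chunk in (b"hello ", b"world"):
    buf.write(struct.pack("!i", len(chunk)))
    buf.write(chunk)
buf.write(struct.pack("!i", -1))
buf.seek(0)
data = read_chunked(buf)
```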
if __name__ == '__main__':
import doctest
(failure_count, test_count) = doctest.testmod()
if failure_count:
sys.exit(-1)
| apache-2.0 |
magne-max/zipline-ja | zipline/data/resample.py | 1 | 24726 | # Copyright 2016 Quantopian, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from collections import OrderedDict
from abc import ABCMeta, abstractmethod
import numpy as np
from numpy import nan
import pandas as pd
from pandas import DataFrame
from six import with_metaclass
from zipline.data.minute_bars import MinuteBarReader
from zipline.data.us_equity_pricing import NoDataOnDate
from zipline.data.session_bars import SessionBarReader
from zipline.utils.memoize import lazyval
_MINUTE_TO_SESSION_OHCLV_HOW = OrderedDict((
('open', 'first'),
('high', 'max'),
('low', 'min'),
('close', 'last'),
('volume', 'sum'),
))
def minute_to_session(minute_frame, calendar):
"""
Resample a DataFrame with minute data into the frame expected by a
BcolzDailyBarWriter.
Parameters
----------
minute_frame : pd.DataFrame
A DataFrame with the columns `open`, `high`, `low`, `close`, `volume`,
and `dt` (minute dts)
calendar : zipline.utils.calendars.trading_calendar.TradingCalendar
A TradingCalendar on which session labels to resample from minute
to session.
Returns
-------
session_frame : pd.DataFrame
A DataFrame with the columns `open`, `high`, `low`, `close`, `volume`,
and `day` (datetime-like).
"""
how = OrderedDict((c, _MINUTE_TO_SESSION_OHCLV_HOW[c])
for c in minute_frame.columns)
return minute_frame.groupby(calendar.minute_to_session_label).agg(
how)
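A minimal sketch of what this groupby-and-aggregate does, with a plain date-normalizing lambda standing in for `calendar.minute_to_session_label` (the real calendar object is not reproduced here):

```python
from collections import OrderedDict
import pandas as pd

_HOW = OrderedDict((('open', 'first'), ('high', 'max'),
                    ('low', 'min'), ('close', 'last'), ('volume', 'sum')))

minutes = pd.date_range('2016-03-17 13:31', periods=4, freq='min', tz='UTC')
minute_frame = pd.DataFrame({
    'open':   [10.0, 10.5, 10.2, 10.8],
    'high':   [10.6, 10.9, 10.4, 11.0],
    'low':    [9.9, 10.3, 10.1, 10.7],
    'close':  [10.5, 10.4, 10.3, 10.9],
    'volume': [100, 200, 150, 50],
}, index=minutes)

# stand-in for calendar.minute_to_session_label: map each minute to its date
session_frame = minute_frame.groupby(lambda dt: dt.normalize()).agg(_HOW)
assert session_frame['open'].iloc[0] == 10.0    # first
assert session_frame['high'].iloc[0] == 11.0    # max
assert session_frame['low'].iloc[0] == 9.9      # min
assert session_frame['close'].iloc[0] == 10.9   # last
assert session_frame['volume'].iloc[0] == 500   # sum
```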
class DailyHistoryAggregator(object):
"""
Converts minute pricing data into a daily summary, to be used for the
last slot in a call to history with a frequency of `1d`.
This summary is the same as a daily bar rollup of minute data, with the
distinction that the summary is truncated to the `dt` requested.
i.e. the aggregation slides forward over the course of the simulation day.
Provides aggregation for `open`, `high`, `low`, `close`, and `volume`.
The aggregation rules for each price type are documented in their respective aggregation methods.
"""
def __init__(self, market_opens, minute_reader, trading_calendar):
self._market_opens = market_opens
self._minute_reader = minute_reader
self._trading_calendar = trading_calendar
# The caches are structured as (date, market_open, entries), where
# entries is a dict of asset -> (last_visited_dt, value)
#
# Whenever an aggregation method determines the current value,
# the entry for the respective asset should be overwritten with a new
# entry for the current dt.value (int) and aggregation value.
#
# When the requested dt's date differs from the cached date, the cache
# is flushed, so that the cache entries do not grow unbounded.
#
# Example cache:
# cache = (date(2016, 3, 17),
# pd.Timestamp('2016-03-17 13:31', tz='UTC'),
# {
# 1: (1458221460000000000, np.nan),
# 2: (1458221460000000000, 42.0),
# })
self._caches = {
'open': None,
'high': None,
'low': None,
'close': None,
'volume': None
}
# The int value is used for deltas to avoid extra computation from
# creating new Timestamps.
self._one_min = pd.Timedelta('1 min').value
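The cache discipline described in the comment above (one `(session, market_open, entries)` tuple per field, rebuilt whenever the session changes so entries never accumulate across days) can be sketched in isolation; `prelude` below is a simplified stand-in for the `_prelude` helper:

```python
caches = {'open': None}

def prelude(caches, field, session, market_open):
    # rebuild the per-field cache whenever the session changes
    cache = caches[field]
    if cache is None or cache[0] != session:   # new day -> flush
        cache = caches[field] = (session, market_open, {})
    return cache

c1 = prelude(caches, 'open', '2016-03-17', '13:31')
c1[2]['asset-1'] = (1458221460000000000, 42.0)       # cache a value
c2 = prelude(caches, 'open', '2016-03-17', '13:31')  # same day: kept
assert c2[2] == {'asset-1': (1458221460000000000, 42.0)}
c3 = prelude(caches, 'open', '2016-03-18', '13:31')  # new day: flushed
assert c3[2] == {}
```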
def _prelude(self, dt, field):
session = self._trading_calendar.minute_to_session_label(dt)
dt_value = dt.value
cache = self._caches[field]
if cache is None or cache[0] != session:
market_open = self._market_opens.loc[session]
cache = self._caches[field] = (session, market_open, {})
_, market_open, entries = cache
market_open = market_open.tz_localize('UTC')
if dt != market_open:
prev_dt = dt_value - self._one_min
else:
prev_dt = None
return market_open, prev_dt, dt_value, entries
def opens(self, assets, dt):
"""
The open field's aggregation returns the first value that occurs
for the day; if there has been no data on or before the `dt`, the open
is `nan`.
Once the first non-nan open is seen, that value remains constant per
asset for the remainder of the day.
Returns
-------
np.array with dtype=float64, in order of assets parameter.
"""
market_open, prev_dt, dt_value, entries = self._prelude(dt, 'open')
opens = []
session_label = self._trading_calendar.minute_to_session_label(dt)
for asset in assets:
if not asset.is_alive_for_session(session_label):
opens.append(np.nan)
continue
if prev_dt is None:
val = self._minute_reader.get_value(asset, dt, 'open')
entries[asset] = (dt_value, val)
opens.append(val)
continue
else:
try:
last_visited_dt, first_open = entries[asset]
if last_visited_dt == dt_value:
opens.append(first_open)
continue
elif not pd.isnull(first_open):
opens.append(first_open)
entries[asset] = (dt_value, first_open)
continue
else:
after_last = pd.Timestamp(
last_visited_dt + self._one_min, tz='UTC')
window = self._minute_reader.load_raw_arrays(
['open'],
after_last,
dt,
[asset],
)[0]
nonnan = window[~pd.isnull(window)]
if len(nonnan):
val = nonnan[0]
else:
val = np.nan
entries[asset] = (dt_value, val)
opens.append(val)
continue
except KeyError:
window = self._minute_reader.load_raw_arrays(
['open'],
market_open,
dt,
[asset],
)[0]
nonnan = window[~pd.isnull(window)]
if len(nonnan):
val = nonnan[0]
else:
val = np.nan
entries[asset] = (dt_value, val)
opens.append(val)
continue
return np.array(opens)
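When an asset has no cache entry, the method falls back to scanning the raw minute window for the first non-nan print; the kernel of that fallback is just:

```python
import numpy as np

# "first non-nan value in the window", as used when the cache is cold
window = np.array([np.nan, np.nan, 10.2, 10.5, 10.1])
nonnan = window[~np.isnan(window)]
first_open = nonnan[0] if len(nonnan) else np.nan
assert first_open == 10.2
```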
def highs(self, assets, dt):
"""
The high field's aggregation returns the largest high seen between
the market open and the current dt.
If there has been no data on or before the `dt` the high is `nan`.
Returns
-------
np.array with dtype=float64, in order of assets parameter.
"""
market_open, prev_dt, dt_value, entries = self._prelude(dt, 'high')
highs = []
session_label = self._trading_calendar.minute_to_session_label(dt)
for asset in assets:
if not asset.is_alive_for_session(session_label):
highs.append(np.nan)
continue
if prev_dt is None:
val = self._minute_reader.get_value(asset, dt, 'high')
entries[asset] = (dt_value, val)
highs.append(val)
continue
else:
try:
last_visited_dt, last_max = entries[asset]
if last_visited_dt == dt_value:
highs.append(last_max)
continue
elif last_visited_dt == prev_dt:
curr_val = self._minute_reader.get_value(
asset, dt, 'high')
if pd.isnull(curr_val):
val = last_max
elif pd.isnull(last_max):
val = curr_val
else:
val = max(last_max, curr_val)
entries[asset] = (dt_value, val)
highs.append(val)
continue
else:
after_last = pd.Timestamp(
last_visited_dt + self._one_min, tz='UTC')
window = self._minute_reader.load_raw_arrays(
['high'],
after_last,
dt,
[asset],
)[0].T
val = np.nanmax(np.append(window, last_max))
entries[asset] = (dt_value, val)
highs.append(val)
continue
except KeyError:
window = self._minute_reader.load_raw_arrays(
['high'],
market_open,
dt,
[asset],
)[0].T
val = np.nanmax(window)
entries[asset] = (dt_value, val)
highs.append(val)
continue
return np.array(highs)
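The nan handling in the `last_visited_dt == prev_dt` branch above is easy to get wrong; isolated as a helper (a sketch, not part of the original class), it reads:

```python
import numpy as np
import pandas as pd

def update_high(last_max, curr_val):
    # incremental running-high update with nan handling
    if pd.isnull(curr_val):
        return last_max          # missing minute keeps the prior high
    if pd.isnull(last_max):
        return curr_val          # first real print sets the high
    return max(last_max, curr_val)

assert update_high(10.9, np.nan) == 10.9
assert update_high(np.nan, 10.4) == 10.4
assert update_high(10.9, 11.2) == 11.2
```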
def lows(self, assets, dt):
"""
The low field's aggregation returns the smallest low seen between
the market open and the current dt.
If there has been no data on or before the `dt` the low is `nan`.
Returns
-------
np.array with dtype=float64, in order of assets parameter.
"""
market_open, prev_dt, dt_value, entries = self._prelude(dt, 'low')
lows = []
session_label = self._trading_calendar.minute_to_session_label(dt)
for asset in assets:
if not asset.is_alive_for_session(session_label):
lows.append(np.nan)
continue
if prev_dt is None:
val = self._minute_reader.get_value(asset, dt, 'low')
entries[asset] = (dt_value, val)
lows.append(val)
continue
else:
try:
last_visited_dt, last_min = entries[asset]
if last_visited_dt == dt_value:
lows.append(last_min)
continue
elif last_visited_dt == prev_dt:
curr_val = self._minute_reader.get_value(
asset, dt, 'low')
val = np.nanmin([last_min, curr_val])
entries[asset] = (dt_value, val)
lows.append(val)
continue
else:
after_last = pd.Timestamp(
last_visited_dt + self._one_min, tz='UTC')
window = self._minute_reader.load_raw_arrays(
['low'],
after_last,
dt,
[asset],
)[0].T
val = np.nanmin(np.append(window, last_min))
entries[asset] = (dt_value, val)
lows.append(val)
continue
except KeyError:
window = self._minute_reader.load_raw_arrays(
['low'],
market_open,
dt,
[asset],
)[0].T
val = np.nanmin(window)
entries[asset] = (dt_value, val)
lows.append(val)
continue
return np.array(lows)
def closes(self, assets, dt):
"""
The close field's aggregation returns the latest close at the given
dt.
If the close for the given dt is `nan`, the most recent non-nan
`close` is used.
If there has been no data on or before the `dt` the close is `nan`.
Returns
-------
np.array with dtype=float64, in order of assets parameter.
"""
market_open, prev_dt, dt_value, entries = self._prelude(dt, 'close')
closes = []
session_label = self._trading_calendar.minute_to_session_label(dt)
for asset in assets:
if not asset.is_alive_for_session(session_label):
closes.append(np.nan)
continue
if prev_dt is None:
val = self._minute_reader.get_value(asset, dt, 'close')
entries[asset] = (dt_value, val)
closes.append(val)
continue
else:
try:
last_visited_dt, last_close = entries[asset]
if last_visited_dt == dt_value:
closes.append(last_close)
continue
elif last_visited_dt == prev_dt:
val = self._minute_reader.get_value(
asset, dt, 'close')
if pd.isnull(val):
val = last_close
entries[asset] = (dt_value, val)
closes.append(val)
continue
else:
val = self._minute_reader.get_value(
asset, dt, 'close')
if pd.isnull(val):
val = self.closes(
[asset],
pd.Timestamp(prev_dt, tz='UTC'))[0]
entries[asset] = (dt_value, val)
closes.append(val)
continue
except KeyError:
val = self._minute_reader.get_value(
asset, dt, 'close')
if pd.isnull(val):
val = self.closes([asset],
pd.Timestamp(prev_dt, tz='UTC'))[0]
entries[asset] = (dt_value, val)
closes.append(val)
continue
return np.array(closes)
def volumes(self, assets, dt):
"""
The volume field's aggregation returns the sum of all volumes
between the market open and the `dt`.
If there has been no data on or before the `dt` the volume is 0.
Returns
-------
np.array with dtype=int64, in order of assets parameter.
"""
market_open, prev_dt, dt_value, entries = self._prelude(dt, 'volume')
volumes = []
session_label = self._trading_calendar.minute_to_session_label(dt)
for asset in assets:
if not asset.is_alive_for_session(session_label):
volumes.append(0)
continue
if prev_dt is None:
val = self._minute_reader.get_value(asset, dt, 'volume')
entries[asset] = (dt_value, val)
volumes.append(val)
continue
else:
try:
last_visited_dt, last_total = entries[asset]
if last_visited_dt == dt_value:
volumes.append(last_total)
continue
elif last_visited_dt == prev_dt:
val = self._minute_reader.get_value(
asset, dt, 'volume')
val += last_total
entries[asset] = (dt_value, val)
volumes.append(val)
continue
else:
after_last = pd.Timestamp(
last_visited_dt + self._one_min, tz='UTC')
window = self._minute_reader.load_raw_arrays(
['volume'],
after_last,
dt,
[asset],
)[0]
val = np.nansum(window) + last_total
entries[asset] = (dt_value, val)
volumes.append(val)
continue
except KeyError:
window = self._minute_reader.load_raw_arrays(
['volume'],
market_open,
dt,
[asset],
)[0]
val = np.nansum(window)
entries[asset] = (dt_value, val)
volumes.append(val)
continue
return np.array(volumes)
class MinuteResampleSessionBarReader(SessionBarReader):
def __init__(self, calendar, minute_bar_reader):
self._calendar = calendar
self._minute_bar_reader = minute_bar_reader
def _get_resampled(self, columns, start_dt, end_dt, assets):
minute_data = self._minute_bar_reader.load_raw_arrays(
columns, start_dt, end_dt, assets)
dts = self._calendar.minutes_in_range(start_dt, end_dt)
frames = []
for i, _ in enumerate(assets):
minute_frame = DataFrame((d.T[i] for d in minute_data),
index=columns, columns=dts).T
df = minute_to_session(minute_frame, self._calendar)
frames.append(df)
return frames
@property
def trading_calendar(self):
return self._calendar
def load_raw_arrays(self, columns, start_dt, end_dt, sids):
sessions = self._calendar.sessions_in_range(start_dt, end_dt)
range_open, _ = self._calendar.open_and_close_for_session(
start_dt)
_, range_close = self._calendar.open_and_close_for_session(
end_dt)
shape = len(sessions), len(sids)
results = []
for col in columns:
if col != 'volume':
out = np.full(shape, np.nan)
else:
out = np.zeros(shape, dtype=np.uint32)
results.append(out)
frames = self._get_resampled(columns, range_open, range_close, sids)
for i, result in enumerate(results):
for j, frame in enumerate(frames):
result[:, j] = frame.values[:, i]
return results
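The transpose step above, per-asset session frames scattered into one `(sessions, sids)` array per column, can be sketched with plain NumPy arrays standing in for the resampled frames:

```python
import numpy as np

# two assets, two sessions, columns ['close', 'volume']
frames = [np.array([[10.5, 500.0], [10.9, 300.0]]),   # asset 0: rows=sessions
          np.array([[20.1, 800.0], [19.8, 600.0]])]   # asset 1
results = [np.full((2, 2), np.nan) for _ in ('close', 'volume')]
for i, result in enumerate(results):          # i indexes the column
    for j, frame in enumerate(frames):        # j indexes the asset
        result[:, j] = frame[:, i]
assert results[0][0, 1] == 20.1   # close, first session, asset 1
assert results[1][1, 0] == 300.0  # volume, second session, asset 0
```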
def get_value(self, sid, session, colname):
# WARNING: This will need caching or other optimization if used in a
# tight loop.
# This was developed to complete interface, but has not been tuned
# for real world use.
start, end = self._calendar.open_and_close_for_session(session)
frame = self._get_resampled([colname], start, end, [sid])[0]
return frame.loc[session, colname]
@lazyval
def sessions(self):
cal = self._calendar
first = self._minute_bar_reader.first_trading_day
last = cal.minute_to_session_label(
self._minute_bar_reader.last_available_dt)
return cal.sessions_in_range(first, last)
@lazyval
def last_available_dt(self):
return self.trading_calendar.minute_to_session_label(
self._minute_bar_reader.last_available_dt
)
@property
def first_trading_day(self):
return self._minute_bar_reader.first_trading_day
class ReindexBarReader(with_metaclass(ABCMeta)):
"""
A base class for readers which reindexes results, filling in the additional
indices with empty data.
Used to align results when reading assets which trade on different calendars.
Currently only supports a ``trading_calendar`` which is a superset of the
``reader``'s calendar.
Also, the current implementation only reindexes the results from
``load_raw_arrays``, but in the future, ``get_value`` may also be made to
provide an empty result instead of raising on error.
Parameters
----------
- trading_calendar : zipline.utils.trading_calendar.TradingCalendar
The calendar to use when indexing results from the reader.
- reader : MinuteBarReader|SessionBarReader
The reader which has a calendar that is a subset of the desired
``trading_calendar``.
- first_trading_session : pd.Timestamp
The first trading session the reader should provide. Must be specified,
since the ``reader``'s first session may not exactly align with the
desired calendar. Specifically, in the case where the first session
on the target calendar is a holiday on the ``reader``'s calendar.
- last_trading_session : pd.Timestamp
The last trading session the reader should provide. Must be specified,
since the ``reader``'s last session may not exactly align with the
desired calendar. Specifically, in the case where the last session
on the target calendar is a holiday on the ``reader``'s calendar.
"""
def __init__(self,
trading_calendar,
reader,
first_trading_session,
last_trading_session):
self._trading_calendar = trading_calendar
self._reader = reader
self._first_trading_session = first_trading_session
self._last_trading_session = last_trading_session
@property
def last_available_dt(self):
return self._reader.last_available_dt
def get_last_traded_dt(self, sid, dt):
return self._reader.get_last_traded_dt(sid, dt)
@property
def first_trading_day(self):
return self._reader.first_trading_day
def get_value(self, sid, dt, field):
try:
return self._reader.get_value(sid, dt, field)
except NoDataOnDate:
if field == 'volume':
return 0
else:
return nan
@abstractmethod
def _outer_dts(self, start_dt, end_dt):
raise NotImplementedError
@abstractmethod
def _inner_dts(self, start_dt, end_dt):
raise NotImplementedError
@property
def trading_calendar(self):
return self._trading_calendar
@lazyval
def sessions(self):
return self.trading_calendar.sessions_in_range(
self._first_trading_session,
self._last_trading_session
)
def load_raw_arrays(self, fields, start_dt, end_dt, sids):
outer_dts = self._outer_dts(start_dt, end_dt)
inner_dts = self._inner_dts(start_dt, end_dt)
indices = outer_dts.searchsorted(inner_dts)
shape = len(outer_dts), len(sids)
outer_results = []
if len(inner_dts) > 0:
inner_results = self._reader.load_raw_arrays(
fields, inner_dts[0], inner_dts[-1], sids)
else:
inner_results = None
for i, field in enumerate(fields):
if field != 'volume':
out = np.full(shape, np.nan)
else:
out = np.zeros(shape, dtype=np.uint32)
if inner_results is not None:
out[indices] = inner_results[i]
outer_results.append(out)
return outer_results
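The `searchsorted` trick above is worth seeing in isolation: rows from the inner (subset) calendar are scattered into the outer calendar's array, leaving `nan` wherever the outer calendar has sessions the reader lacks. A minimal sketch with one sid:

```python
import numpy as np
import pandas as pd

outer = pd.DatetimeIndex(['2016-03-14', '2016-03-15',
                          '2016-03-16', '2016-03-17'])   # superset calendar
inner = pd.DatetimeIndex(['2016-03-14', '2016-03-16'])   # reader's calendar
indices = outer.searchsorted(inner)                      # positions [0, 2]
out = np.full((len(outer), 1), np.nan)
out[indices] = np.array([[10.0], [11.0]])                # inner reader's data
assert out[0, 0] == 10.0
assert np.isnan(out[1, 0])     # holiday on the inner calendar
assert out[2, 0] == 11.0
```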
class ReindexMinuteBarReader(ReindexBarReader, MinuteBarReader):
"""
See: ``ReindexBarReader``
"""
def _outer_dts(self, start_dt, end_dt):
return self._trading_calendar.minutes_in_range(start_dt, end_dt)
def _inner_dts(self, start_dt, end_dt):
return self._reader.calendar.minutes_in_range(start_dt, end_dt)
class ReindexSessionBarReader(ReindexBarReader, SessionBarReader):
"""
See: ``ReindexBarReader``
"""
def _outer_dts(self, start_dt, end_dt):
return self.trading_calendar.sessions_in_range(start_dt, end_dt)
def _inner_dts(self, start_dt, end_dt):
return self._reader.trading_calendar.sessions_in_range(
start_dt, end_dt)
| apache-2.0 |
anntzer/scikit-learn | examples/manifold/plot_manifold_sphere.py | 89 | 5055 | #!/usr/bin/python
# -*- coding: utf-8 -*-
"""
=============================================
Manifold Learning methods on a severed sphere
=============================================
An application of the different :ref:`manifold` techniques
on a spherical data-set. Here one can see the use of
dimensionality reduction in order to gain some intuition
regarding the manifold learning methods. Regarding the dataset,
the poles are cut from the sphere, as well as a thin slice down its
side. This enables the manifold learning techniques to
'spread it open' whilst projecting it onto two dimensions.
For a similar example, where the methods are applied to the
S-curve dataset, see :ref:`sphx_glr_auto_examples_manifold_plot_compare_methods.py`
Note that the purpose of the :ref:`MDS <multidimensional_scaling>` is
to find a low-dimensional representation of the data (here 2D) in
which the distances respect well the distances in the original
high-dimensional space. Unlike other manifold-learning algorithms,
it does not seek an isotropic representation of the data in
the low-dimensional space. Here the manifold problem matches fairly
well that of representing a flat map of the Earth, as with
`map projection <https://en.wikipedia.org/wiki/Map_projection>`_
"""
# Author: Jaques Grobler <jaques.grobler@inria.fr>
# License: BSD 3 clause
print(__doc__)
from time import time
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
from matplotlib.ticker import NullFormatter
from sklearn import manifold
from sklearn.utils import check_random_state
# Next line to silence pyflakes.
Axes3D
# Variables for manifold learning.
n_neighbors = 10
n_samples = 1000
# Create our sphere.
random_state = check_random_state(0)
p = random_state.rand(n_samples) * (2 * np.pi - 0.55)
t = random_state.rand(n_samples) * np.pi
# Sever the poles from the sphere.
indices = ((t < (np.pi - (np.pi / 8))) & (t > ((np.pi / 8))))
colors = p[indices]
x, y, z = np.sin(t[indices]) * np.cos(p[indices]), \
np.sin(t[indices]) * np.sin(p[indices]), \
np.cos(t[indices])
# Plot our dataset.
fig = plt.figure(figsize=(15, 8))
plt.suptitle("Manifold Learning with %i points, %i neighbors"
% (n_samples, n_neighbors), fontsize=14)
ax = fig.add_subplot(251, projection='3d')
ax.scatter(x, y, z, c=p[indices], cmap=plt.cm.rainbow)
ax.view_init(40, -10)
sphere_data = np.array([x, y, z]).T
# Perform Locally Linear Embedding Manifold learning
methods = ['standard', 'ltsa', 'hessian', 'modified']
labels = ['LLE', 'LTSA', 'Hessian LLE', 'Modified LLE']
for i, method in enumerate(methods):
t0 = time()
trans_data = manifold\
.LocallyLinearEmbedding(n_neighbors, 2,
method=method).fit_transform(sphere_data).T
t1 = time()
print("%s: %.2g sec" % (methods[i], t1 - t0))
ax = fig.add_subplot(252 + i)
plt.scatter(trans_data[0], trans_data[1], c=colors, cmap=plt.cm.rainbow)
plt.title("%s (%.2g sec)" % (labels[i], t1 - t0))
ax.xaxis.set_major_formatter(NullFormatter())
ax.yaxis.set_major_formatter(NullFormatter())
plt.axis('tight')
# Perform Isomap Manifold learning.
t0 = time()
trans_data = manifold.Isomap(n_neighbors, n_components=2)\
.fit_transform(sphere_data).T
t1 = time()
print("%s: %.2g sec" % ('ISO', t1 - t0))
ax = fig.add_subplot(257)
plt.scatter(trans_data[0], trans_data[1], c=colors, cmap=plt.cm.rainbow)
plt.title("%s (%.2g sec)" % ('Isomap', t1 - t0))
ax.xaxis.set_major_formatter(NullFormatter())
ax.yaxis.set_major_formatter(NullFormatter())
plt.axis('tight')
# Perform Multi-dimensional scaling.
t0 = time()
mds = manifold.MDS(2, max_iter=100, n_init=1)
trans_data = mds.fit_transform(sphere_data).T
t1 = time()
print("MDS: %.2g sec" % (t1 - t0))
ax = fig.add_subplot(258)
plt.scatter(trans_data[0], trans_data[1], c=colors, cmap=plt.cm.rainbow)
plt.title("MDS (%.2g sec)" % (t1 - t0))
ax.xaxis.set_major_formatter(NullFormatter())
ax.yaxis.set_major_formatter(NullFormatter())
plt.axis('tight')
# Perform Spectral Embedding.
t0 = time()
se = manifold.SpectralEmbedding(n_components=2,
n_neighbors=n_neighbors)
trans_data = se.fit_transform(sphere_data).T
t1 = time()
print("Spectral Embedding: %.2g sec" % (t1 - t0))
ax = fig.add_subplot(259)
plt.scatter(trans_data[0], trans_data[1], c=colors, cmap=plt.cm.rainbow)
plt.title("Spectral Embedding (%.2g sec)" % (t1 - t0))
ax.xaxis.set_major_formatter(NullFormatter())
ax.yaxis.set_major_formatter(NullFormatter())
plt.axis('tight')
# Perform t-distributed stochastic neighbor embedding.
t0 = time()
tsne = manifold.TSNE(n_components=2, init='pca', random_state=0)
trans_data = tsne.fit_transform(sphere_data).T
t1 = time()
print("t-SNE: %.2g sec" % (t1 - t0))
ax = fig.add_subplot(2, 5, 10)
plt.scatter(trans_data[0], trans_data[1], c=colors, cmap=plt.cm.rainbow)
plt.title("t-SNE (%.2g sec)" % (t1 - t0))
ax.xaxis.set_major_formatter(NullFormatter())
ax.yaxis.set_major_formatter(NullFormatter())
plt.axis('tight')
plt.show()
| bsd-3-clause |