$\newcommand{\kk}{\boldsymbol{k}} \newcommand{\eek}{\boldsymbol{e}_\boldsymbol{k}} \newcommand{\eeh}{\boldsymbol{e}_\boldsymbol{h}} \newcommand{\eez}{\boldsymbol{e}_\boldsymbol{z}} \newcommand{\cc}{\boldsymbol{c}} \newcommand{\uu}{\boldsymbol{u}} \newcommand{\vv}{\boldsymbol{v}} \newcommand{\bnabla}{\boldsymbol{\nabla}} \newcommand{\Dt}{\mbox{D}_t} \newcommand{\p}{\partial} \newcommand{\R}{\mathcal{R}} \newcommand{\eps}{\varepsilon} \newcommand{\mean}[1]{\langle #1 \rangle} \newcommand{\epsK}{\varepsilon_{\!\scriptscriptstyle K}} \newcommand{\epsA}{\varepsilon_{\!\scriptscriptstyle A}} \newcommand{\epsP}{\varepsilon_{\!\scriptscriptstyle P}} \newcommand{\epsm}{\varepsilon_{\!\scriptscriptstyle m}} \newcommand{\CKA}{C_{K\rightarrow A}} \newcommand{\D}{\mbox{D}}$
where $\eps_P$ and $\eps_K$ are the potential and kinetic energy dissipation rates.
stirring: injection of kinetic energy ($P_K$)
conversion from kinetic to potential energy (buoyancy flux): $- \mean{b w} = C_{K\rightarrow A}$, equal to $\eps_P$ in a statistically steady state
dissipation of kinetic and potential energy balances the injection: $\eps_K + \eps_P = P_K$
energy spent for mixing = dissipation of potential energy
where
$$ \mathcal{E}_P = -\frac{(HN)^2}{12} - \mean{b z}. $$
$$\mathcal{E}_P + \frac{(HN)^2}{12} = -\mean{b z}$$
$$\begin{align} \D_t \vv &= -\bnabla p + b\eez + \mathbf{F} + \nu \bnabla^2 \vv \\ \D_t b &= - N^2 w + \kappa \bnabla^2 b \end{align}$$
$$\begin{align} d_t E_K &= -\bnabla \cdot (p \vv) - \CKA - \epsK\\ d_t E_A &= \CKA - \epsA \end{align}$$
where
Ocean models are LES (with a scale filter $[\,\cdot\,]$).
with the mixing coefficient $\Gamma \simeq 0.2$ assumed constant!
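This is the Osborn-type parameterization: in a standard form (assuming the mixing coefficient is defined as the ratio of the dissipation rates), the turbulent diffusivity is estimated as

$$ K_\rho = \Gamma\, \frac{\epsK}{N^2}, \qquad \Gamma = \frac{\epsP}{\epsK} \simeq 0.2. $$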
Introduction of 2 important non-dimensional numbers
$$F_h = \frac{U}{NL_h} = \frac{L_b}{L_h} < 1$$
Lindborg (2006)
Different scales:
buoyancy length scale $L_b=U/N$
Ozmidov length scale $l_o= (\varepsilon_K/N^3)^{1/2}$
"the largest horizontal scale that can overturn" (Riley & Lindborg, 2008)
Kolmogorov length scale $\eta$ (dissipative structures)
buoyancy Reynolds number $$\displaystyle \R = \left( \frac{l_o}{\eta} \right)^{4/3} \sim Re {F_h}^2 \gg 1$$
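These definitions can be checked numerically; a minimal sketch with illustrative open-ocean values (assumed, not from the experiments), verifying that $(l_o/\eta)^{4/3} = \epsK/(\nu N^2)$:

```python
# Illustrative values (assumptions, not measured): typical open-ocean numbers
eps_K = 1e-6   # kinetic energy dissipation rate (m^2/s^3)
N = 1e-2       # Brunt-Vaisala frequency (rad/s)
nu = 1e-6      # kinematic viscosity of water (m^2/s)

l_o = (eps_K / N**3) ** 0.5      # Ozmidov length scale
eta = (nu**3 / eps_K) ** 0.25    # Kolmogorov length scale

# Buoyancy Reynolds number, two equivalent forms
R_scales = (l_o / eta) ** (4 / 3)
R_direct = eps_K / (nu * N**2)

print(l_o, eta, R_scales, R_direct)
```

Both forms agree; with these values $l_o = 1$ m, $\eta = 1$ mm and $\R = 10^4$.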
In the oceans and the atmosphere, $L_h$ is very large, so that $F_h \ll 1$ and $\R \gg 1$
DNS at fine resolution ($\sim 10^3 \times 10^3 \times 10^2$ grid points) reaches only $\R \simeq 10$
In the laboratory experiments
In water stratified with salt $\Rightarrow N \simeq 1$ rad/s and $\nu \simeq 10^{-6}$ m$^2$/s
$F_h = \frac{U}{L_hN} \ll 1 \Rightarrow$ slow motion
$\R = Re {F_h}^2 \gg 1 \Rightarrow$ need very large $Re$
Large Reynolds + slow motion $\Rightarrow$ very large apparatus
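This sizing argument can be made quantitative: combining $F_h = U/(NL_h)$, $Re = UL_h/\nu$ and $\R = Re\,F_h^2$ gives $L_h = \sqrt{\R\,\nu/(F_h^3 N)}$. A minimal sketch with illustrative target values (assumptions, not the actual experimental parameters):

```python
# Target non-dimensional numbers (illustrative assumptions)
F_h = 0.02   # horizontal Froude number, << 1
R = 100      # buoyancy Reynolds number, >> 1

# Fluid properties for salt-stratified water
nu = 1e-6    # kinematic viscosity (m^2/s)
N = 1.0      # Brunt-Vaisala frequency (rad/s)

# Required horizontal scale, velocity and Reynolds number
L_h = (R * nu / (F_h**3 * N)) ** 0.5
U = F_h * N * L_h
Re = U * L_h / nu

print(f"L_h = {L_h:.2f} m, U = {U:.3f} m/s, Re = {Re:.1e}")
```

With these targets $L_h \approx 3.5$ m and $U \approx 7$ cm/s: hence the need for a very large facility.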
Summer 2016 (a collaboration between KTH, Stockholm, Sweden and LEGI): we tried many things.
Summer 2017: focused on mixing without rotation
Big changes in how programs are developed:
serious, good quality, good coding practices:
new tools and environments (for example Python)
Transparency in scientific methods and results
Openness to full scrutiny
Ease of reproducibility
No more "reinventing the wheel" - particularly in code development
Doing sciences with open-source methods and tools
Share using the web
Fluid mechanics lags behind: dominance of Fortran and Matlab.
A well-thought language:
Scientific ecosystem: strong dynamics, rich and complicated landscape (many, many projects):
Difficult to use efficiently (very specialized coding)
low-level languages versus high-level languages?
TensorFlow (deep learning library by Google)
$\rightarrow$ main APIs in Python
Open-source, documented, tested, continuous integration
(in collaboration with Julien Salort, ENS Lyon)
For the MILESTONE experiments:
probes attached to a traverse (Modbus TCP)
scanning Particle Image Velocimetry (PIV):
Issue: controlling from computers the interaction and synchronization of these devices
(in collaboration with Julien Salort, ENS Lyon)
rcpy
motor.py
position_sensor.py
position_sensor_server.py
position_sensor_client.py
carriage.py
carriage_server.py
carriage_client.py
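The `*_server.py` / `*_client.py` pairs above follow a simple pattern: each device is driven by a small server process, and the experiment script queries it over TCP. A minimal hypothetical sketch of this pattern using only the Python standard library (all names and values here are invented for illustration; the real modules are more elaborate):

```python
import socket
import threading

HOST = "127.0.0.1"

# Hypothetical server socket: bound and listening before any client connects
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind((HOST, 0))            # port 0: let the OS pick a free port
PORT = srv.getsockname()[1]
srv.listen(1)

def serve_one_request():
    """Answer a single request with a fake probe position (in metres)."""
    conn, _ = srv.accept()
    with conn:
        conn.recv(1024)        # request content ignored in this sketch
        conn.sendall(b"0.123")

def get_position():
    """Hypothetical client call used by the experiment script."""
    with socket.create_connection((HOST, PORT)) as sock:
        sock.sendall(b"get_position")
        return float(sock.recv(1024).decode())

server = threading.Thread(target=serve_one_request)
server.start()
position = get_position()
server.join()
srv.close()
print(position)  # 0.123
```

In the real setup the same pattern lets several instruments (carriage, probes) be synchronized from a single script.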
A little bit of Graphical User Interface is easy, fun and useful. We use PyQt.
Remark: reusable code; here, random movement for another experiment.
Kinetic energy dissipation rate $\eps_K$:
$\eta$ very small: impossible to measure the velocity gradients accurately
decay of kinetic energy after a stroke. Need many vector fields $\Rightarrow$ scanned horizontal PIV
APE dissipation rate $\eps_P$:
$\kappa$ much smaller than $\nu$!
APE decay after one stroke...
long-term evolution of the stratification after many strokes $\Rightarrow$ density profiles
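For the decay-based estimate of $\epsK$, one measures $E_K(t)$ from the PIV fields and takes $\epsK \simeq -\mathrm{d}E_K/\mathrm{d}t$ (neglecting the conversion to potential energy during the decay). A minimal sketch with synthetic data (not measured values):

```python
import numpy as np

# Synthetic kinetic energy decay (illustrative assumption, not measured)
t = np.linspace(0, 100, 201)     # time (s)
tau = 50.0                       # decay time (s)
E_K = 1e-3 * np.exp(-t / tau)    # kinetic energy per unit mass (m^2/s^2)

# Dissipation rate estimated from the decay: eps_K = -dE_K/dt
eps_K = -np.gradient(E_K, t)

print(eps_K[0])  # close to E_K(0)/tau = 2e-5 m^2/s^3
```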
Very large series of images and probe data $\Rightarrow$ computations on the LEGI clusters
(in collaboration with Cyrille Bonamy and Antoine Campagne, LEGI)
Many images ($\sim$ 20 TB of raw data): embarrassingly parallel problem
Efficient algorithms and tools for fast computation with Python (Pythran, Theano, Pycuda, ...)
Image preprocessing
2D and scanning stereo PIV
Utilities to display and analyze the PIV fields
Remark: we continue to use UVmat for calibration
Example of scripts to launch a PIV computation:
from fluidimage.topologies.piv import TopologyPIV
params = TopologyPIV.create_default_params()
params.series.path = '../../image_samples/Karman/Images'
params.series.ind_start = 1
params.piv0.shape_crop_im0 = 32
params.multipass.number = 2
params.multipass.use_tps = True
# params.saving.how has to be equal to 'complete' for idempotent jobs
# (on clusters)
params.saving.how = 'complete'
params.saving.postfix = 'piv_complete'
topology = TopologyPIV(params, logging_level='info')
topology.compute()
Remark: parameters in an instance of fluiddyn.util.paramcontainer.ParamContainer. Much better than in text files or free Python variables!
Remark: launching computations on cluster is highly simplified by using fluiddyn:
from fluiddyn.clusters.legi import Calcul7 as Cluster
cluster = Cluster()
cluster.submit_script(
'piv_complete.py', name_run='fluidimage',
nb_cores_per_node=8,
walltime='3:00:00',
omp_num_threads=1,
idempotent=True, delay_signal_walltime=300)
from fluidcoriolis.milestone17 import Experiment as Experiment17
iexp = 21
exp = Experiment17(iexp)
exp.name
'Exp21_2017-07-11_D0.5_N0.55_U0.12'
print(f'N = {exp.N} rad/s and Uc = {exp.Uc} m/s')
N = 0.55 rad/s and Uc = 0.12 m/s
print(f'Rc = {exp.Rc:.0f} and Fh = {exp.Fhc:.2f}')
Rc = 11425 and Fh = 0.44
print(f'{exp.nb_periods} periods of {exp.period} s')
3 periods of 125.0 s
print(f'{exp.nb_levels} levels for the scanning PIV')
5 levels for the scanning PIV
from fluidcoriolis.milestone import Experiment
exp = Experiment(73)
cam = 'PCO_top' # MILESTONE16
# cam = 'Cam_horiz' # MILESTONE17
pack = exp.get_piv_pack(camera=cam)
piv_fields = pack.get_piv_array_toverT(i_toverT=80)
/home/pierre/16MILESTONE/Data_light/PCO_top/Exp73_2016-07-13_N0.8_L6.0_V0.16_piv3d/v_exp73_t080.h5
piv_fields = piv_fields.gaussian_filter(0.5).truncate(2)
piv = pack.get_piv2d(ind_time=10, level=1)
piv = piv.gaussian_filter(0.5).truncate(2)
piv.display()
_ = plt.xlim([-1.7, 0.5])
_ = plt.ylim([-1.3, 1.3])
piv.display()
_ = plt.xlim([-1., -0.])
_ = plt.ylim([-0.5, 0.5])
from fluidcoriolis.milestone.results_energy_budget import ResultEnergyBudgetExp
r = ResultEnergyBudgetExp(73, camera=cam)
iexp = 73; N = 0.8 rad/s; Uc = 16 cm/s
epsK = 5.49e-05 m^2 s^-3
urms = 3.14e-02 m/s
(urms/Uc)^2 = 4e-02 ; epsK/epsc = 3e-03
Fht = 1.0e-01 ; Rt = 1.3e+02
r.plot_energy_vs_time()
fig = plt.gcf()
fig.set_size_inches(12, 5, forward=1)
from fluidcoriolis.milestone17.time_signals import SignalsExperiment
signals = SignalsExperiment(iexp)
/fsnet/project/watu/2017/17MILESTONE/Data/Exp21_2017-07-11_D0.5_N0.55_U0.12
signals.plot_vs_times()
signals.plot_vs_times(corrected=1)
probe = signals.probes_profiles[0]
probe.plot_profiles(corrected=1)
signals.plot_profiles_probe_averaged(corrected=1, sort=1, extend=1, len_extend=0.005)
signals.plot_energy_pot_vs_time()
Mass conservation check: max(m/m0 - 1) = 1.6755e-04
non-dimensional mixing: 0.001983362753471777
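The potential energy diagnostics rely on the measured density profiles; sorting a profile gives the background state of minimum potential energy, and the difference is the available potential energy. A minimal sketch with a synthetic profile (illustrative values only, not experimental data):

```python
import numpy as np

# Synthetic vertical density profile (illustrative, not measured)
z = np.linspace(-0.5, 0.5, 1001)               # height (m), uniform grid
dz = z[1] - z[0]
rho = 1000.0 - 50.0 * z                        # stable linear stratification
rho += 5.0 * np.exp(-((z - 0.1) / 0.05) ** 2)  # local overturning perturbation

g = 9.81

# Background state: density sorted to decrease with height (minimum PE)
rho_sorted = np.sort(rho)[::-1]

# Potential energies per unit area (rectangle-rule integration over z)
E_p = g * np.sum(rho * z) * dz
E_b = g * np.sum(rho_sorted * z) * dz

APE = E_p - E_b   # available potential energy, >= 0 by construction
print(APE)
```

For a stable profile the sorted and measured profiles coincide and the available potential energy vanishes.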
Experimental flow close to the strongly stratified regime!
We should be able to get strongly stratified turbulence by using alcohol
Good measurements of the mixing (with MILESTONE17)
$\Rightarrow$ soon good evaluation of the mixing coefficient
Easier than experiments! :-)
Science in fluid mechanics with open-source methods and Python
Open-data: data in auto-descriptive formats (hdf5, netcdf, ...) + code to use and understand the data
Development of open-source, clean, reusable codes (fluiddyn project)