Wednesday, September 21, 2022

Silicon nanopillars for quantum communication

Around the world, specialists are working on implementing quantum information technologies. One important path involves light: looking ahead, single light packets, also known as light quanta or photons, could transmit data that is both encoded and effectively tap-proof. To this end, new photon sources are required that emit single light quanta in a controlled fashion and on demand. Only recently was it discovered that silicon can host single-photon sources with properties suitable for quantum communication. So far, however, no one has known how to integrate these sources into modern photonic circuits.

For the first time, a team led by the Helmholtz-Zentrum Dresden-Rossendorf (HZDR) has now presented an appropriate production technology using silicon nanopillars: A chemical etching method followed by ion bombardment. Their research is published in the Journal of Applied Physics.

"Silicon and single-photon sources in the telecommunication field have long been the missing link in speeding up the development of quantum communication by optical fibers. Now we have created the necessary preconditions for it," explains Dr. Yonder Berencén of HZDR's Institute of Ion Beam Physics and Materials Research who led the current study. Although single-photon sources have been fabricated in materials like diamonds, only silicon-based sources generate light particles at the right wavelength to proliferate in optical fibers—a considerable advantage for practical purposes.

The researchers achieved this technical breakthrough by choosing a wet etching technique known as MacEtch (metal-assisted chemical etching) rather than the conventional dry etching techniques for processing the silicon on a chip. The standard methods, which allow the creation of silicon photonic structures, use highly reactive ions. These ions cause radiation damage that induces light-emitting defects in the silicon; because the defects are randomly distributed, they overlay the desired optical signal with noise. Metal-assisted chemical etching, on the other hand, does not generate these defects: instead, the material is etched away chemically under a kind of metallic mask.

The goal: single-photon sources compatible with the fiber-optic network

Using the MacEtch method, the researchers first fabricated the simplest form of a potential light-guiding structure: silicon nanopillars on a chip. They then bombarded the finished nanopillars with carbon ions, just as they would a bulk silicon block, and thus generated photon sources embedded in the pillars. The new etching technique means the size, spacing and surface density of the nanopillars can be precisely controlled and adjusted to be compatible with modern photonic circuits. On each square millimeter of chip, thousands of silicon nanopillars guide and bundle the light from the sources by directing it vertically through the pillars.

The researchers varied the diameter of the pillars because "we had hoped this would mean we could perform single defect creation on thin pillars and actually generate a single photon source per pillar," explains Berencén. "It didn't work perfectly the first time. By comparison, even for the thinnest pillars, the dose of our carbon bombardment was too high. But now it's just a short step to single photon sources."

This is a step on which the team is already working intensively because the new technique has also unleashed something of a race for future applications.

"My dream is to integrate all the elementary building blocks, from a single photon source via photonic elements through to a single photon detector, on one single chip and then connect lots of chips via commercial optical fibers to form a modular quantum network," says Berencén...


Sunday, September 18, 2022

Ultrasound technique captures micron-scale images of brain activity

Neuroimaging has increased our understanding of brain function. Such techniques often involve measuring blood flow variations to detect brain activation, exploiting the fundamental interaction between the brain’s vascular and neuronal activities. Any alterations in this so-called neurovascular coupling are strongly linked to cerebral dysfunction. The ability to image cerebral microcirculation is particularly important, as neurodegenerative diseases such as dementia and Alzheimer’s involve dysfunction of the small cerebral vessels.

Researchers at the Physics for Medicine Paris institute (Inserm/ESPCI PSL University/CNRS) have now developed a method called functional ultrasound localization microscopy (fULM) that can capture cerebral activity at the micron scale. The team published the first micron-scale, whole-brain images of rodent vascular activity in Nature Methods, along with a detailed explanation of the fULM image acquisition and analysis procedures.


Researchers have previously used ULM to reveal microvascular anatomy at the whole-brain scale in rodents and humans. The spatial resolution of ULM is 16-fold better than that achieved with functional ultrasound imaging. But because the acquisition process is slow, ULM can only produce static maps of the blood flow induced by neuronal activity.

The fULM technique overcomes this limitation. In addition to imaging the brain microvasculature, the technique detects local brain activation by calculating the number and speed of microbubbles passing through each vessel. When a brain region activates, neurovascular coupling causes the blood volume to increase locally, dilating the vessels and allowing more microbubbles to pass. fULM provides local estimates of multiple parameters that characterize such vascular dynamics, including microbubble flux, speed and vessel diameter.

According to principal investigator Mickael Tanter and colleagues, integrating fULM into a cost-efficient, easy-to-use ultrasound scanner provides “a quantitative look at the cerebral microcirculatory network and its haemodynamic changes by combining a brain-wide spatial extent with a microscopic resolution and a 1 s temporal resolution compatible with neurofunctional imaging”.

In vivo studies
To demonstrate the fULM concept, the researchers first imaged laboratory rats with functional ultrasound (without contrast), followed by ULM in the same imaging plane. They combined sensory stimulations (whisker deflections or visual stimulation) in anesthetized rats with continuous microbubble injection. For ULM, the rats received a continuous slow injection of microbubbles during a 20 min imaging session, leading to roughly 30 microbubbles per ultrasound frame.

During ULM processing, the researchers saved every track, containing each microbubble position and its corresponding time stamp. They constructed ULM images by selecting a pixel size and sorting each microbubble detection into its pixel. Only pixels with at least five different microbubble detections during the total acquisition time were used for the analyses.
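To make that processing step concrete, here is a minimal sketch of how such pixel sorting could look, assuming the microbubble tracks are available as arrays of (x, y, t) localizations. The 6.5 µm pixel size is taken from the study, but the field-of-view, array layout and function names are illustrative assumptions, not the authors' actual code.

```python
import numpy as np

def ulm_pixel_maps(tracks, pixel_size_um=6.5, fov_um=(12800.0, 12800.0), min_bubbles=5):
    """Illustrative ULM-style accumulation (a sketch, not the authors' code).

    tracks: list of (n, 3) arrays holding (x_um, y_um, t_s) for one microbubble track.
    Returns per-pixel localization counts and mean speeds; pixels crossed by fewer
    than `min_bubbles` distinct microbubbles are masked out, as in the study.
    """
    nx, ny = int(fov_um[0] / pixel_size_um), int(fov_um[1] / pixel_size_um)
    counts = np.zeros((ny, nx))        # number of localizations per pixel
    speed_sum = np.zeros((ny, nx))     # accumulated speeds, for the mean
    bubbles = np.zeros((ny, nx))       # number of distinct microbubbles per pixel

    for track in tracks:
        xy, t = track[:, :2], track[:, 2]
        # instantaneous speed along the track (um/s), padded to the track length
        if len(t) > 1:
            v = np.linalg.norm(np.diff(xy, axis=0), axis=1) / np.diff(t)
            v = np.append(v, v[-1])
        else:
            v = np.zeros(1)
        ix = np.clip((xy[:, 0] / pixel_size_um).astype(int), 0, nx - 1)
        iy = np.clip((xy[:, 1] / pixel_size_um).astype(int), 0, ny - 1)
        np.add.at(counts, (iy, ix), 1)
        np.add.at(speed_sum, (iy, ix), v)
        for j, i in set(zip(iy.tolist(), ix.tolist())):  # each bubble counted once per pixel
            bubbles[j, i] += 1

    valid = bubbles >= min_bubbles
    density = np.where(valid, counts, np.nan)
    mean_speed = np.where(valid, speed_sum / np.maximum(counts, 1), np.nan)
    return density, mean_speed
```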

The technique allowed the researchers to map functional hyperaemia (increased blood flow in the vessels) in both cortical and subcortical areas with 6.5 µm resolution. They quantified the temporal haemodynamic responses during whisker stimulations for four rats and during visual stimulations for three rats, by measuring the microbubble flux and velocity.

The team quantified the involvement of individual blood vessels during functional hyperaemia. They observed increases in microbubble count, speed and diameter for a representative arteriole and venule (very small arteries/veins leading into/out of the capillaries), noting that control animals did not exhibit any changes. They also introduced a “perfusion area index” and a “drainage area index” to further quantify the involvement of each individual blood vessel; these increased by 28% and 54% during stimulation for the arteriole and venule, respectively.

Due to the large field-of-view, the researchers could perform quantitative analyses simultaneously for every vessel across the whole rat brain slice image, even in deep structures such as the thalamus for whisker stimulations and superior colliculus for visual stimulations.

“The achieved spatiotemporal resolution enables fULM to image different vascular compartments in the whole brain and to discriminate their respective contributions, in particular in the precapillary arterioles known to have a major contribution to vascular changes during neuronal activities,” write the authors.


They add: “fULM shows that the relative increase in microbubble flow is greater in intra-parenchymal vessels rather than in arterioles. fULM also confirms depth-dependent characteristics for blood flow and speed in penetrating arterioles at baseline, and highlights a depth-dependent variation in blood speed during activation. It also quantifies large increases of microbubble flux, blood speed and diameter in venules during activation.”

As a new imaging research tool, fULM provides a way to track dynamic changes during brain activation and will offer insights into neural brain circuits. It will aid the study of functional connectivity, layer-specific cortical activity and/or neurovascular coupling alterations on a brain-wide scale.

Tanter notes that researchers at the Physics for Medicine Paris institute are collaborating with the Paris-based medical technology company Iconeus to make this technology available to the neuroscience community, and for clinical imaging, very rapidly.


Listening for disease: heart sound maps provide low-cost diagnosis

Aortic stenosis, the narrowing of the aortic valve, is one of the most common and serious heart valve dysfunctions. Usually caused by a build-up of calcium deposits (or sometimes due to a congenital heart defect), this narrowing restricts blood flow from the left ventricle to the aorta and, in severe cases, can lead to heart failure.

The development of sensitive, cost-effective techniques to identify the condition is paramount, particularly for use in remote areas without access to sophisticated technology. To meet this challenge, researchers from India and Slovenia have created an accurate, easy-to-use and low-cost method to identify heart valve dysfunction using complex network analysis.

“Many rural health centres don’t have the necessary technology for analysing diseases like this,” explains team member M S Swapna from the University of Nova Gorica, in a press statement. “For our technique, we just need a stethoscope and a computer.”

Hear the difference
A healthy person produces two heart sounds: the first (“lub”) due to the closing of the mitral and tricuspid valves and the second (“dub”) as the aortic and pulmonary valves close, with a pause (the systolic region) in between. These signals carry information about the blood flow through the heart, with variations in pitch, intensity, location and timing of the sounds providing essential information related to a patient’s health.

Swapna and colleagues – Vijayan Vijesh, K Satheesh Kumar and S Sankararaman from the University of Kerala – aimed to develop a simple method based on graph theory to identify aortic stenosis heart murmur. To do this, they examined 60 digital heart sound signals from normal hearts (NMH) and hearts with aortic stenosis (ASH). They subjected the signals to fast Fourier transform (FFT), complex network analyses and machine-learning-based classification, reporting their findings in the Journal of Applied Physics.
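The pipeline is only summarized here, so the sketch below is just one plausible illustration of its first two stages: it computes the FFT magnitude spectrum of a digitized heart-sound recording and builds a simple complex network whose nodes are coarse frequency bins, linked when their amplitudes are similar; the graph measures it returns could then feed the machine-learning classifier. The bin count, the 1 kHz cutoff, the linking rule and the threshold are assumptions for illustration, and the authors' actual network construction may differ.

```python
import numpy as np
import networkx as nx
from scipy.fft import rfft, rfftfreq

def heart_sound_graph_features(signal, fs=8000, n_nodes=200, epsilon=0.05):
    """Sketch: FFT spectrum -> amplitude-proximity network -> graph features."""
    spectrum = np.abs(rfft(signal))
    freqs = rfftfreq(len(signal), d=1.0 / fs)
    spectrum = spectrum[freqs <= 1000.0]   # heart sounds and murmurs lie mostly below ~1 kHz
    spectrum /= spectrum.max()

    # coarse-grain the spectrum into n_nodes bins; each bin becomes a network node
    amps = np.array([b.mean() for b in np.array_split(spectrum, n_nodes)])

    g = nx.Graph()
    g.add_nodes_from(range(n_nodes))
    for i in range(n_nodes):
        for j in range(i + 1, n_nodes):
            if abs(amps[i] - amps[j]) < epsilon:   # link bins with similar amplitude
                g.add_edge(i, j)

    degrees = np.array([d for _, d in g.degree()])
    return {
        "mean_degree": float(degrees.mean()),
        "graph_density": nx.density(g),
        "clustering": nx.average_clustering(g),
    }
```

Features of this kind, computed for the normal (NMH) and aortic stenosis (ASH) recordings, would be the inputs to the classification stage.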

The researchers first converted each audio signal into a time series. The signal from a representative healthy heart clearly showed the two heart sounds and the separation between them, while signals from hearts with aortic stenosis displayed diamond-shaped murmurs....


Saturday, September 17, 2022

For the first time, doctors can measure the effectiveness of chemotherapy for “stiff heart syndrome”, using an advanced form of cardiac magnetic resonance imaging (MRI).


For the first time, doctors can measure the effectiveness of chemotherapy for “stiff heart syndrome”, using an advanced form of cardiac magnetic resonance imaging (MRI). Researchers at the National Amyloidosis Centre of University College London (UCL) have been developing and refining the non-invasive technique for the past 10 years.

Light-chain cardiac amyloidosis, also known as stiff heart syndrome, is a condition in which the heart muscle thickens due to the build-up of amyloid fibrils throughout the heart. In the early stages, the pumping function is typically preserved, but eventually the heart muscle can no longer efficiently pump blood and pressure starts to build up, leading to shortness of breath and fluid retention in the lungs and limbs. Without treatment, this can rapidly lead to heart failure and death.

Chemotherapy is the first-line treatment to reduce amyloid protein, but until now there has been no way to efficiently measure its therapeutic effect. A patient’s haematological response to chemotherapy is generally evaluated using measurements of serum free light chains (FLC), while echocardiography parameters and serum concentration of brain natriuretic peptides are currently the reference standards for assessing cardiac organ response. But these indirect biological markers do not directly measure cardiac amyloid burden.

The new imaging procedure combines cardiovascular MR (CMR) with extracellular volume (ECV) mapping to measure the presence and, importantly, the amount of amyloid protein in the heart. This approach can determine whether the chemotherapy is effective in triggering cardiac amyloid regression, information that will help guide better, more timely, treatment strategies for patients.
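For readers unfamiliar with ECV mapping, the extracellular volume fraction is conventionally derived from pre- and post-contrast T1 maps of the myocardium and blood pool together with the patient's haematocrit. The snippet below shows this standard relation as a sketch; the example numbers are invented purely for illustration and are not taken from the UCL study.

```python
def extracellular_volume(t1_myo_pre, t1_myo_post, t1_blood_pre, t1_blood_post, haematocrit):
    """Standard T1-mapping ECV estimate (T1 values in seconds).

    ECV = (1 - haematocrit) * (dR1_myocardium / dR1_blood),
    where dR1 = 1/T1(post-contrast) - 1/T1(pre-contrast).
    """
    d_r1_myo = 1.0 / t1_myo_post - 1.0 / t1_myo_pre
    d_r1_blood = 1.0 / t1_blood_post - 1.0 / t1_blood_pre
    return (1.0 - haematocrit) * d_r1_myo / d_r1_blood

# Invented example values giving a markedly raised ECV, as can be seen in cardiac amyloidosis
print(f"ECV = {extracellular_volume(1.15, 0.34, 1.60, 0.35, haematocrit=0.40):.0%}")  # ~56%
```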

Principal investigator Ana Martinez-Naharro and colleagues assessed the ability of CMR with ECV mapping to measure changes in response to chemotherapy in a study following 176 patients with light-chain cardiac amyloidosis for two years. They report their findings in the European Heart Journal.


The newly diagnosed patients, who were enrolled in a long-term prospective observational study at the National Amyloidosis Centre, underwent a series of assessments. These included N-terminal pro-B-type natriuretic peptide (NT-proBNP) measurements and CMR with T1 mapping and ECV measurements at baseline and at six, 12 and 24 months after the start of chemotherapy with bortezomib. The team also measured FLC monthly to assess haematological response.

When combined with results of blood tests, the imaging exams revealed that almost 40% of patients had a substantial reduction in amyloid deposition following chemotherapy. “The scans and data made available using this technique, combined with correlating data from indirect markers that currently exist, gave us the information to both see the amount of amyloid protein and also the regression in amyloid during the course of chemotherapy treatments,” says Martinez-Naharro.

Senior author Marianna Fontana, of the UCL Division of Medicine, recommends that the MRI technique now be employed to diagnose and assess all cases of light-chain cardiac amyloidosis. “By developing ECV mapping for 1.5 T MR scanners, we hope that its use can be made available to more patients. The aim would be to use these scans routinely for all patients with the disease to help improve patient survival, which is very poor in patients who do not respond to treatment,” she explains.

Saturday, September 10, 2022

Quantum entanglement between two physically separated ultra-cold atomic clouds

Members of the Department of Theoretical Physics and History of Science of the UPV/EHU's Faculty of Science and Technology, together with researchers from the University of Hannover, have achieved quantum entanglement between two spatially separated Bose-Einstein condensates (ultra-cold atomic ensembles).


The team led by Géza Tóth, Ikerbasque Research Professor, concentrated on verifying the presence of entanglement through measurements, while the experiment was carried out in the group of Carsten Klempt in Hannover. The study is published in Science.

Quantum entanglement was discovered by Schrödinger and later studied by Einstein and other scientists in the 20th century. It is a quantum phenomenon with no counterpart in classical physics. Groups of entangled particles lose their individuality and behave as a single entity: any change in one of the particles leads to an immediate response in the other, even if they are spatially separated. "Quantum entanglement is essential in applications such as quantum computing, since it enables certain tasks to be performed much faster than in classical computing," explained Tóth.


Unlike previous quantum entanglement experiments, which involved incoherent, thermal clouds of particles, in this experiment the researchers used a cloud of atoms in the Bose-Einstein condensate state. Tóth said, "Bose-Einstein condensates are achieved by cooling down the atoms to very low temperatures, close to absolute zero. At that temperature, all the atoms are in a highly coherent quantum state; in a sense, they all occupy the same position in space. In that state, quantum entanglement exists between the atoms of the ensemble." Subsequently, the ensemble was split into two atomic clouds. "We separated the two clouds from each other by a distance, and we were able to demonstrate that the two parts remained entangled with each other," he continued.

The demonstration that entanglement can be created between two ensembles in the Bose-Einstein condensate state could lead to an improvement in many fields in which quantum technology is used, such as quantum computing, quantum simulation and quantum metrology, since these require the creation and control of large ensembles of entangled particles. "The advantage of cold atoms is that it is possible to create highly entangled states containing quantities of particles outnumbering any other physical systems by several orders of magnitude, which could provide a basis for large scale quantum computing," said the researcher.


A quantum network of entangled atomic clocks

For the first time, scientists at the University of Oxford have been able to demonstrate a network of two entangled optical atomic clocks and show how the entanglement between the remote clocks can be used to improve their measurement precision, according to research published this week in Nature.


Improving the precision of frequency comparisons between multiple atomic clocks offers the potential to unlock our understanding of all sorts of natural phenomena. It is essential, for example, in measuring the space-time variation of fundamental constants, in geodesy, where the frequency of atomic clocks is used to measure the height difference between two locations, and even in the search for dark matter.

Fundamental limit of precision

Entanglement—a quantum phenomenon in which two or more particles become linked together so that they can no longer be described independently, even at vast distances—is the key to reaching the fundamental limit of precision that's determined by quantum theory. While previous experiments have demonstrated that entanglement between clocks in the same system can be used to improve the quality of measurements, this is the first time researchers have been able to achieve this between clocks in two separate remotely entangled systems. This development paves the way for applications like those mentioned above, where comparing the frequencies of atoms in separate locations to the highest possible precision is vital.
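The "fundamental limit" referred to here is the textbook gap between the standard quantum limit reached with uncorrelated atoms and the Heisenberg limit that entanglement makes reachable. The short sketch below only illustrates that scaling; it uses no numbers from the Oxford experiment.

```python
import numpy as np

def phase_uncertainty(n_atoms, n_repeats, entangled=False):
    """Textbook scaling of clock-phase uncertainty with atom number.

    Standard quantum limit: ~1/sqrt(N) per shot (uncorrelated atoms).
    Heisenberg limit:       ~1/N per shot (maximally entangled atoms).
    Averaging over M repeated shots adds a further 1/sqrt(M) factor.
    """
    per_shot = 1.0 / n_atoms if entangled else 1.0 / np.sqrt(n_atoms)
    return per_shot / np.sqrt(n_repeats)

for n in (1, 10, 100):
    sql = phase_uncertainty(n, n_repeats=1000)
    hl = phase_uncertainty(n, n_repeats=1000, entangled=True)
    print(f"N={n:>3}: SQL ~ {sql:.4f} rad, Heisenberg ~ {hl:.4f} rad")
```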

Bethan Nichol, one of the authors of the paper published in Nature, said, "Thanks to years of hard work from the whole team at Oxford, our network apparatus can produce entangled pairs of ions with high fidelity and high rate at the push of a button. Without this capability this demonstration would not have been possible."

State-of-the-art quantum network

The Oxford team used a state-of-the-art quantum network to achieve their results. Developed by the UK's Quantum Computing and Simulation (QCS) Hub, a consortium of 17 universities led by the University of Oxford, this network was designed for quantum computing and for communication rather than for quantum-enhanced metrology, but the researchers' work demonstrates the versatility of such systems. The two clocks used for the experiment were only 2 meters apart, but in principle such networks can be scaled up to cover much larger distances.

"While our result is very much a proof-of-principle, and the absolute precision we achieve is a few orders of magnitude below the state of the art, we hope that the techniques shown here might someday improve state-of the art systems," explains Dr. Raghavendra Srinivas, another of the paper's authors. "At some point, entanglement will be required as it provides a path to the ultimate precision allowed by quantum theory."

Professor David Lucas, whose team at Oxford were responsible for the experiment, said, "Our experiment shows the importance of quantum networks for metrology, with applications to fundamental physics, as well as to the more well-known areas of quantum cryptography and quantum computing."




