By: National Physical Laboratory

Graphene and related 2D materials (GR2Ms) could help reduce greenhouse gas emissions from the production of advanced materials. Using GR2M nanoplatelets in applications such as reinforcing concrete or improving battery performance will require a dramatic increase in production. As the production of GR2Ms is scaled up, the ability to effectively and efficiently measure the material properties of the nanoplatelets—both in the final product and as part of process control systems—will become critical.

The need

It is likely that instruments used for in-line measurements will have a lower resolution than lab-based, research-grade instruments. The effect this has on the resulting measurement metrics needs to be understood to provide confidence in the results reported to customers.

Currently, Raman spectroscopy is used to monitor the material properties of the nanoplatelets. But this requires samples to be taken off the line and given additional preparation to make them suitable for analysis, which delays the process and incurs additional storage costs.
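
The article doesn't say which spectral metrics would be tracked, but a widely used figure of merit for graphene-like materials is the ratio of the Raman D-band to G-band peak intensities. The following is a minimal sketch of that calculation; the band positions, window width, and synthetic spectrum are illustrative assumptions, not data from NPL.

```python
import numpy as np

def d_to_g_ratio(wavenumbers, intensities, half_width=50):
    """Estimate the I(D)/I(G) peak-intensity ratio from a Raman spectrum.

    The D band (~1350 cm^-1) and G band (~1580 cm^-1) are standard markers
    for defect density in graphene-like materials; the window width is an
    illustrative assumption.
    """
    def peak_height(center):
        window = (wavenumbers > center - half_width) & (wavenumbers < center + half_width)
        return intensities[window].max()

    return peak_height(1350) / peak_height(1580)

# Synthetic example spectrum: two Gaussian bands plus instrument noise.
wn = np.linspace(1000, 2000, 2000)
spectrum = (0.3 * np.exp(-((wn - 1350) / 20) ** 2)    # D band (defects)
            + 1.0 * np.exp(-((wn - 1580) / 15) ** 2)  # G band
            + np.random.normal(0, 0.01, wn.size))
print(f"I(D)/I(G) = {d_to_g_ratio(wn, spectrum):.2f}")
```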

By: Keith Irwin

Many times per week we’re asked, “Is X-ray or CT the correct inspection method for my project?” The answer, of course, isn’t always straightforward. This article will highlight some strengths and weaknesses of each method.

The subject for inspection is a carbon fiber board with inclusions throughout. The piece is very large at 26 in. x 26 in. Inspecting it with X-ray is no problem, but computed tomography (CT) could prove difficult.

Fact-finding with focus scans

Because this project was intended for R&D and technique development, we were able to cut a section out of the sheet. This probably should have been done by professionals; we ended up running through five blades cutting through the armor-grade material.

The purpose of the small section is to acquire the best possible data to be used as a gauge for the more complex development. If we can understand what the material looks like at higher resolution, it will help inform our higher-energy, lower-resolution technique development.

The focus scan highlights carbon fiber layers, layer orientation, and resin gaps.

By: Peter Büscher

This article will focus on a sampling method that is commonly used to analyze fluids: direct liquid filtration. Simply put, direct liquid filtration is a sampling technique used to determine which particles are present in a liquid.

In direct liquid filtration, the liquid with suspended particles is filtered through a membrane filter so the particles collect on its surface. Then the filter membrane is imaged with a light microscope for particle analysis. Keep reading to learn more about this fluid sampling method, including best practices to follow when extracting a fluid sample for cleanliness analysis.
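
As a rough illustration of the analysis step, the sketch below thresholds a grayscale microscope image of the filter membrane, labels connected dark regions as particles, and converts their pixel areas to equivalent diameters. The threshold, pixel scale, and synthetic image are assumptions for demonstration only; real cleanliness analysis follows the applicable standards and instrument calibration.

```python
import numpy as np
from scipy import ndimage

def count_particles(image, threshold, um_per_pixel):
    """Count dark particles on a bright filter-membrane image.

    `image` is a 2D grayscale array. Pixels darker than `threshold` are
    treated as particle material; connected regions are counted and their
    areas converted to equivalent circular diameters in micrometers.
    """
    particle_mask = image < threshold
    labels, n_particles = ndimage.label(particle_mask)
    areas_px = ndimage.sum(particle_mask, labels, index=np.arange(1, n_particles + 1))
    diameters_um = 2.0 * np.sqrt(areas_px / np.pi) * um_per_pixel
    return n_particles, diameters_um

# Synthetic example: a bright membrane (gray value ~200) with two dark spots.
img = np.full((200, 200), 200.0)
img[50:54, 60:64] = 40.0      # a roughly 4 x 4 pixel particle
img[120:123, 150:152] = 30.0  # a smaller particle
count, diameters = count_particles(img, threshold=100.0, um_per_pixel=1.5)
print(count, np.round(diameters, 1))
```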

Small particles, great effects: The importance of clean fluids in machinery

Shipping, energy, offshore, and pharmaceuticals are some of the industries that have discovered the benefits of clean fluids.

By: Bo Ingves

Rising costs from inflation and increased focus on reducing carbon dioxide emissions make product loss management more important than ever in dairy plants. One major reason these losses occur is that timings or other process parameters are set incorrectly, causing a lot of valuable dairy product to be unnecessarily washed out with the wastewater. Collo’s liquid fingerprint technology addresses this problem by detecting any type of liquid in the pipes in real time, offering an easy way to optimize production and cut product losses.

The rising costs of raw milk, electricity, and other expenses are driving dairy plants to put more focus on reducing milk loss. The cost of raw milk has risen by 64 percent in two years in the EU region, and in some countries significantly more. With thin profit margins, it is a strategic imperative for many plants to minimize product loss. At the same time, there is global pressure to reduce the CO2 footprint of dairy production, and here milk loss at the plant plays a key role.

By: NIST

Detailed virtual copies of physical objects, called digital twins, are opening doors for better products across automotive, healthcare, aerospace, and other industries. According to a new study, cybersecurity may also fit neatly into the digital twin portfolio.

As more robots and other manufacturing equipment become remotely accessible, new entry points for malicious cyberattacks are created. To keep pace with the growing cyberthreat, a team of researchers at the National Institute of Standards and Technology (NIST) and the University of Michigan devised a cybersecurity framework that brings digital-twin technology together with machine learning and human expertise to flag indicators of cyberattacks.

In a paper published in IEEE Transactions on Automation Science and Engineering, NIST and University of Michigan researchers demonstrated the feasibility of their strategy by detecting cyberattacks aimed at a 3D printer in their lab. They also noted that the framework could be applied to a broad range of manufacturing technologies.
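
The framework itself combines digital-twin models, machine learning, and human expertise, and its details are in the publication. As a purely illustrative sketch of one ingredient of such an approach (comparing what the twin predicts with what the physical machine reports), the following flags time steps where the residual exceeds a noise-based threshold. The signal, window size, and threshold are assumptions, not values from the NIST study.

```python
import numpy as np

def flag_anomalies(measured, predicted, baseline=100, k=4.0):
    """Flag time steps where the measured signal deviates from the digital
    twin's prediction by more than k standard deviations of the residual
    noise, estimated from an assumed-clean baseline window.

    In a full framework, flagged points would go to a classifier or a human
    expert to decide whether they indicate an attack or an ordinary anomaly.
    """
    residual = measured - predicted
    sigma = np.std(residual[:baseline])
    return np.where(np.abs(residual) > k * sigma)[0]

# Synthetic example: a nozzle-temperature signal predicted by a twin versus
# measured values, with an injected offset mimicking tampered readings.
t = np.arange(1000)
predicted = 210.0 + 2.0 * np.sin(t / 50.0)
measured = predicted + np.random.normal(0, 0.3, t.size)
measured[600:650] += 5.0  # simulated manipulation
print(flag_anomalies(measured, predicted)[:5])
```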

By: Henry Zumbrun

Small-force measurement is crucial for many applications, such as testing materials, monitoring biomedical devices, and studying the behavior of cells and molecules. In this article, we’ll explore the challenges of measuring small forces and the solutions that are available to address them.

There are several challenges to measuring small forces:
1. Sensitivity: Measuring small forces requires high-sensitivity instruments that are capable of detecting very small changes in force. However, these instruments can also be affected by environmental factors, such as temperature and vibrations, which can result in measurement errors.
2. Dynamic range: The dynamic range of a measuring instrument refers to the range of forces it can measure accurately; instruments sensitive enough to resolve very small forces often cover only a narrow span.
3. Signal-to-noise ratio: The signal-to-noise ratio is the ratio of the desired signal to the unwanted noise present in the measurement. Measuring small forces often requires a high signal-to-noise ratio, which can be difficult to achieve with traditional instruments (see the sketch after this list).
4. Controlling the forces: A control system is needed to apply the force gradually so the force-measuring device isn’t overloaded.
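
To make the signal-to-noise point concrete, here is a small numerical sketch: a constant 2 mN force read through an instrument with assumed Gaussian noise. Averaging N uncorrelated readings reduces the noise by a factor of sqrt(N), one common way to recover a usable SNR when a single reading is too noisy. All values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def snr_db(signal_amplitude, noise_sigma):
    """Signal-to-noise ratio in decibels for a constant force signal."""
    return 20.0 * np.log10(abs(signal_amplitude) / noise_sigma)

true_force_mN = 2.0    # a small, constant 2 mN force (illustrative)
noise_sigma_mN = 0.5   # assumed 1-sigma instrument noise per reading

# Averaging N uncorrelated readings reduces the noise by sqrt(N),
# which improves the effective SNR of the reported value.
for n in (1, 16, 256):
    readings = true_force_mN + rng.normal(0, noise_sigma_mN, n)
    effective_sigma = noise_sigma_mN / np.sqrt(n)
    print(f"N={n:4d}  mean={readings.mean():.3f} mN  "
          f"SNR = {snr_db(true_force_mN, effective_sigma):.1f} dB")
```

The tradeoff, of course, is measurement time: each factor-of-10 reduction in noise requires 100 times as many readings.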

By: Katie Rapp

A major focus of the current administration is revitalizing American manufacturing as new technologies are changing the way things are made. Manufacturing Extension Partnership (MEP) director Pravina Raghavan recently appeared on Government Matters TV, where she discussed how MEP National Network experts around the country are working with small manufacturers to revolutionize U.S. manufacturing.

From 3D-printed body organs and predictive maintenance to sensors that improve supply chain visibility, the future of manufacturing is about making processes more efficient and getting people what they need when they need it. “Last year alone, we helped do $18.8 billion in sales for manufacturers,” says Raghavan. “These types of technology modifications ensure that manufacturers are getting what they need at the right time, and that real-time inventory gets out to the public.”

By: Trupti Dhere

The healthcare industry is known for rapidly adopting advanced technologies that offer improved treatment for various diseases. Consequently, digital twin technology in healthcare has gained popularity during the past few years, owing to the range of advantages it offers.

Digital twin technology in healthcare refers to a virtual model of a patient’s physical health that can be used to understand the potential effects of various treatments and interventions on that patient’s health. Data from different sources, including wearable devices; electronic health records; and data from MRI, PET scans, and CT scans, can be integrated in a patient’s digital twin to offer a more inclusive view of their health. This can help healthcare professionals understand, monitor, and manage patients’ health more effectively.

Healthcare providers across the globe use digital twin technology to optimize treatment plans and minimize the risk of adverse events. Advantages of this technology include not only improved patient outcomes but also reduced operational costs. Moreover, companies in the pharmaceutical and medical device manufacturing industries are focusing on digital twin technology to improve the efficiency of drug and other product development.

By: Etienne Nichols

In a highly regulated industry like medical technology, manufacturing processes must undergo either process verification or process validation to ensure they’re consistently producing the correct result. The question is, which one should you use?

Verification and validation are two different activities, and they’re used under different circumstances. Knowing when to validate or verify a process is essential from both a quality and regulatory perspective.

So, let’s take a look at what process verification and process validation refer to, and when you should use each of them in medical device manufacturing.

What’s the difference between process verification and process validation?

By: Anil More

It’s been decades since air gauging came into existence, and many changes and refinements have been made over the years. It has proved itself as a reliable and accurate method of gauging parts, particularly in cases of close tolerances and fine finishes. Its high accuracy, simplicity, and relatively low cost have made it the ideal solution for reliable high-volume, noncontact measurement right on the shop floor.

But every technology has its do’s and don’ts. If used correctly, air gauging is a boon—but if used improperly, it could spell disaster.

To get the most out of air gauging, consider the following do’s and don’ts:

The compressed air must be free from dust and other contaminants. The combination of dust, oil, and water is guaranteed to affect measurement accuracy or damage your equipment... or both.

The air pressure must be adequately high and constant, at least 1 to 1.5 bar (roughly 15 to 22 psi) higher than the regulated system pressure.

A good filter and a precise regulator are necessary to keep the compressed air clean and its pressure constant. Clean, regulated air will result in reliable performance and a longer life for your air-gauging system.
