
By Quality Digest

 


 

Welcome to Quality Digest’s 2008 Flowchart/Process Simulation Software Directory. The companies in this buyers guide create or distribute software whose primary function is to help analyze a company’s existing designs or operations and uncover important process-improvement and cost-saving opportunities. Functions include flowcharting, a schematic representation of a process that helps the user visualize its content or find its flaws; and process simulation, a computer model that mimics a company’s operations to determine optimal conditions, bottlenecks, or sensitivity to process changes.

Included in this buyers guide are company names, addresses, telephone and fax numbers, and web addresses. Further information that has been provided to us, such as descriptions of the software modules/suites, is available online at www.qualitydigest.com/content/buyers-guides.


By Gary Nesteby

Tough economic times are upon us. The leaders of the Big Three automakers have to stoop to driving their own cars, our nation’s leaders have to separate themselves into two parties, and the people affected by the layoffs have to go home and lead their families through troubled times. Which do you think is the toughest job and requires the most leadership?

We all accept roles as leaders of our families, churches, neighborhood associations, or perhaps the local school board. Those roles are more important to us as individuals than the roles played by Congress or the car manufacturers’ officers. It is a choice that we make personally, and this decision requires us to question not only our time commitment, but also the alignment of our personal belief system with that of the organization.


By Roderick A. Munro, Ph.D.


As more companies embrace Six Sigma, the need to hire and train employees in the methodology grows. One issue facing beleaguered managers and human resource departments is how to determine whether an applicant truly possesses the Six Sigma skills required by the company. If he or she has a certificate, does it have any value? If not, how does your organization verify employees' Six Sigma skills? Once you get beyond the marketing hype of Six Sigma, what will really help your organization eliminate or even prevent problems?

These questions, and many more based on your particular needs, should be addressed as you review what you and your organization will accept as qualified certification.

This article presents commentary on important items that apply to the value (or lack thereof) of Six Sigma certification in your organization.

Understand your needs

Whether you decide to grow your own Six Sigma practitioners or hire from the outside, management must understand the role that it wants Six Sigma to play in the organization. Just stating in a job posting that a person must be Six Sigma-certified is meaningless unless the organization knows what it really wants.


By Dirk Dusharme @ Quality Digest

Here's the nightmare: You arrive at work to find your best customer has just returned $10,000 worth of precision ceramic parts. They are all neatly boxed and sitting on the inspection room floor with a nasty note saying that they are all out of tolerance. You stand to lose one of your best contracts, not to mention your job, unless you get to the bottom of the problem right away.

So you immediately go to your tool crib and remove your precision digital micrometer from its padded box, where it lies with its anvils neatly closed.

First, you check the calibration sticker. The micrometer has a six-month calibration schedule and was calibrated five months ago. No problem there. You check the absolute zero setting on the micrometer. It reads 0.00000". Exactly where you set it when you put a fresh battery in last month. So the micrometer should be OK. The micrometer and the parts have been at the same temperature for several hours, so you should be OK there, too. It's time to check the parts. You remeasure every one of them. They're in spec. All of them.

The customer must be wrong.


By Geraldine S. Cheok, Alan M. Lytle, and Kamel S. Saidi, Ph.D.

3-D Imaging Terminology

One of the documents to come out of ASTM committee E57 was E2544-08, "Standard terminology for three-dimensional (3-D) imaging systems." What follows is an excerpt of some of the 3-D imaging terminology from that document. To keep the excerpt short, we have included definitions for just a few of the terms listed.

3.2 Definitions of terms specific to this standard

3-D imaging system--a noncontact measurement instrument used to produce a 3-D representation (e.g., a point cloud) of an object or a site.

 

Angular increment--the angle between samples, Δα, where Δα = αᵢ − αᵢ₋₁, in either the azimuth or elevation directions (or a combination of both), with respect to the instrument’s internal frame of reference.
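To make the definition concrete, here is a minimal Python sketch of how angular increments might be computed from a sequence of azimuth samples. This is our own illustration, not part of the standard; the sample values are made up.

# Illustrative only: compute the angular increments a[i] - a[i-1]
# from a hypothetical sequence of azimuth samples, in degrees.
azimuth_samples = [0.00, 0.25, 0.50, 0.75]

angular_increments = [
    curr - prev
    for prev, curr in zip(azimuth_samples, azimuth_samples[1:])
]
print(angular_increments)  # [0.25, 0.25, 0.25]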


By Brenda Boughton

Electronic records--their creation, modification, maintenance, retrieval, and archiving--can create ongoing challenges for all organizations. For industries regulated by the U.S. Food and Drug Administration (FDA), such as pharmaceutical companies, medical device manufacturers, food processing plants, and biotech companies, the FDA’s Code of Federal Regulations Title 21 Part 11 applies to the specifications, use, and control of electronic records and electronic signatures.

The requirements of FDA 21 CFR Part 11 for electronic records are based on good practices, organization, and, most of all, common sense to ensure the efficient and secure handling of these records. In general, these requirements state that:

• All information is complete, and all records can be traced to their originator and to corresponding records.

• Appropriate security measures are in place to ensure that tampering that would alter a record from its original intent does not take place.

• Only the appropriate parties can access the records, and only those so identified can create, modify, or review those records.
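As a concrete illustration of these principles, here is a minimal Python sketch of a tamper-evident, user-traceable audit-trail entry. The field names and the hash-chaining scheme are our own illustrative assumptions; Part 11 does not prescribe any particular record format.

# Illustrative sketch only -- 21 CFR Part 11 does not prescribe
# any particular record format or hashing scheme.
import hashlib
import json
from datetime import datetime, timezone

def make_audit_entry(prev_hash, user_id, action, record_id):
    """Create an audit-trail entry chained to its predecessor, so
    that later tampering with any entry breaks the hash chain."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,      # traceable to the originator
        "action": action,        # e.g., "create", "modify", "review"
        "record_id": record_id,  # link to the affected record
        "prev_hash": prev_hash,  # chains this entry to the one before it
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    return entry

# Example: two chained entries for a hypothetical document "SOP-0042"
first = make_audit_entry("0" * 64, "author01", "create", "SOP-0042")
second = make_audit_entry(first["hash"], "reviewer02", "review", "SOP-0042")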


By Thomas Hill, Ph.D.; Robert Eames; and Sachin Lahoti

Data mining methods have many origins, including insights into learning as it naturally occurs in humans (cognitive science) and advances in computer science and algorithm design for detecting patterns in unstructured data. Although traditional statistical methods for analyzing data, based on statistical theories and models, are now widely accepted throughout various industries, data mining methods have been widely embraced in business for only a decade or two. However, their effectiveness for root cause analysis, and for modeling, optimizing, and improving complex processes, is making data mining increasingly popular--and even necessary--in many real-world discrete manufacturing, batch manufacturing, and continuous-process applications.

There is no single, generally agreed-upon definition of data mining. As a practical matter, whenever data describing a process are available, in manufacturing for example, then any systematic review of those data to identify useful patterns, correlations, trends, and so forth, could be called “data mining.” Put simply, data mining uncovers nuggets of information from a sometimes vast repository of data describing the process of interest.


By Peter Schulz

 

The idea of mixing optics and measurement originated hundreds of years ago in the realm of pure science, namely astronomy (telescopy) and microscopy. Manufacturing first adopted optics for routine inspection and measurement of machined and molded parts in the 1920s with James Hartness’ development of instruments capable of projecting the magnified silhouette of a workpiece onto a ground glass screen. Hartness, as longtime chairman of the United States’ National Screw-Thread Commission, applied his pet interest in optics to the problem of screw-thread inspection. For many years, the Hartness Screw-Thread Comparator was a profitable product for the Jones and Lamson Machine Company, of which Hartness was president.

Horizontal vs. vertical instrument configurations

 


By Jeff Bibee

Optical measurement, when clearly understood and applied, can bring huge benefits. It can also be an investment disaster. To avoid the latter, we need to start with an understanding of the basics--the capabilities and limitations of optical measurement. Then, we can consider the applications where it might provide a better solution than current methods, such as touch probes, optical comparators, hand gauges, or microscopes. Digging deeper, we can discover the challenges that those applications present to optical measurement, the limitations, and the potential for failure. In this article, we will investigate the optical tools and software strategies that have been developed to meet those challenges. With a deeper understanding, the right technology can be applied to the task, and the investment dollars will make sense.

The basics

The diagram in figure 1 below illustrates the basics of optical measurement: lighting, optics, an XY stage, and a Z axis that handles focus.


By Mario Perez-Wilson


The process potential index, or Cp, measures a process’s potential capability, which is defined as the allowable spread over the actual spread: Cp = (USL − LSL)/6s. The allowable spread is the difference between the upper specification limit (USL) and the lower specification limit (LSL). The actual spread is determined from the process data collected and is calculated as six times the standard deviation, s. The standard deviation quantifies a process’s variability. As the standard deviation increases in a process, the Cp decreases in value. As the standard deviation decreases (i.e., as the process becomes less variable), the Cp increases in value.

By convention, when a process has a Cp value less than 1.0, it is considered potentially incapable of meeting specification requirements. Conversely, when a process Cp is greater than or equal to 1.0, the process has the potential of being capable.

Ideally, the Cp should be as high as possible. The higher the Cp, the lower the variability with respect to the specification limits. In a process qualified as a Six Sigma process (i.e., one that allows plus or minus six standard deviations within the specification limits), the Cp is greater than or equal to 2.0.
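As a quick illustration of the arithmetic above, here is a minimal Python sketch that computes Cp from specification limits and a sample standard deviation. The limits and measurement data are made up for the example.

# Cp = (USL - LSL) / (6 * s); the values below are hypothetical.
import statistics

usl = 10.5   # upper specification limit
lsl = 9.5    # lower specification limit
data = [9.9, 10.1, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]

s = statistics.stdev(data)   # sample standard deviation
cp = (usl - lsl) / (6 * s)   # allowable spread over actual spread
print(f"Cp = {cp:.2f}")      # < 1.0: potentially incapable; >= 2.0: Six Sigma potential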