Started by davidlt89, August 06, 2023, 06:17:34 AM



So, I was going to message Paul about a chronograph question, but thought I might get some more opinions on this subject, and it makes a good forum discussion.
   I have three rounds I need to get chronographed, so I pulled out my chronograph last night and read the directions, since I have not used it in 10 years. The chronograph in question is an Alpha shooters chrony.
    Now, I seem to remember reading, years ago, that sunlight can throw these things off a little, since the chronograph gets the velocity by reading the changes in light as the bullet flies over it.
    I took my middle son out an hour ago because he was highly interested in learning something about this. When we started with his 300 Win Mag, the sun was mostly behind some clouds. The first two shots were pretty close together in velocity and not far off the QuickLoad prediction.
    When we went to shoot the 3rd shot, the sun came out in full force, and all of a sudden the bullet was moving over 100 fps faster. This happened with the 4th shot also; in fact, it was 150 fps faster than the first two.
    I then waited for the sun to go back behind a cloud, and we took another shot that did not stray far from the first two.
  So my question is: is it better to chronograph on overcast days, without the sun? Do some people find it does not matter?

I also realize those extreme deviations in velocity could be my reloading, but there is a fat chance of that! :biggrin:
Romans 12:2
2 Don't copy the behavior and customs of this world, but let God transform you into a new person by changing the way you think. Then you will learn to know God's will for you, which is good and pleasing and perfect.


An exercise you might have your son perform is to calculate the actual time it takes a bullet (your bullets, at the expected MV) to pass between the screens, then calculate the difference in time at 150 fps faster/slower. Another way to do it is to calculate how many seconds per foot (not feet per second) your bullet is traveling, and how that number changes with a change in velocity.
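If he likes to tinker, that exercise is easy to script. Here's a minimal Python sketch; the 2-foot screen spacing and the 2650/2500 f/s velocities are just example assumptions, not anyone's measured data:

```python
# Transit time across a chronograph's screen spacing, and how much that
# time shifts when velocity drops by 150 f/s (r = d/t, rearranged to t = d/r).

SCREEN_SPACING_FT = 2.0  # typical light-based chrony spacing (assumed)

def transit_time(velocity_fps, distance_ft=SCREEN_SPACING_FT):
    """Seconds to cover distance_ft at velocity_fps."""
    return distance_ft / velocity_fps

for v in (2650.0, 2500.0):
    print(f"{v:.0f} f/s: {transit_time(v) * 1e6:.1f} microseconds over 2 ft, "
          f"{1e6 / v:.1f} microseconds per foot")

delta_us = (transit_time(2500.0) - transit_time(2650.0)) * 1e6
print(f"150 f/s slower costs about {delta_us:.1f} extra microseconds")
```

At these example numbers the two transit times differ by only about 45 millionths of a second, which gives a feel for the kind of interval the chronograph's clock has to resolve.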

Here's the point:
A 'light-based' chronograph works with "shadows". The shadow of the bullet, cast by the sun, passing over the first sensor starts the timer. The shadow of the bullet passing over the second sensor stops the timer. Calculation of speed is accomplished by solving the velocity equation: rate (speed) = distance/time.
Or r=d/t.

In order to answer your question definitively, we would need to know the clock speed (how many times per second the clock in the chronograph 'ticks'). That clock speed determines two things: 1) the smallest interval of time (remember the 'time' element in the velocity equation) your chrony can detect, and 2) the highest speed your chrony can measure.

Let me do some math for clarification.

The normal distance between the screens of a light-based projectile chronograph is 2 feet, or 24". Using one of my loads that produces expected MVs in the 2650 f/s range: the time to cover the 2-foot screen separation is 2/2650 = 0.000754717 seconds. (No rounding here, unless it's a repeating decimal, because we're after a very high level of precision. As high as we can get. I'll come back to the issue of the level of precision we are ALLOWED to use after I explain the points of the process that are specific to your question.) To put that time in words, it takes about 755 millionths of a second for a projectile traveling 2650 f/s to go two feet.

Now let's see how long it takes one going 150 f/s slower to cover that same two feet: 2/2500 = 0.000800 seconds, or 800 millionths of a second. That's a difference of about 45 millionths of a second. Armed with that information, we can ask, "How far does a bullet going 2500 f/s travel in 45 millionths of a second?" And the answer is: about 1.36 inches. Which you can see is a small proportion (roughly 5.7%) of the 2' between screens.

What does that mean? Just to tell those two velocities apart, the chrony's clock only has to resolve intervals smaller than 45 millionths of a second, i.e., tick faster than about 22,000 times per second. But to report velocity to the nearest foot per second, it has to tick millions of times per second. In this day and age, a clock speed of one million times per second would be quite 'pedestrian'. By the same token, it has been my experience that the electronic devices associated with the shooting and fishing sports are pathetically weak on sampling rates. FOR NO GOOD REASON! Other than it MIGHT save a dime, AT MOST, on the cost of a component. (Sampling rates dictate, to some degree, the cost of manufacturing. However, below a certain speed, and a millionth of a second is WAY below that threshold, sampling rate has no effect on cost. Sampling rates of one million samples per second have been commercially available for more than 50 years. That's HALF A CENTURY.)

Back to the topic at hand. Before we make an arbitrary assumption about the sampling speed of the average, over-the-counter shooting chronograph, let's do a little math and see if we can't make an "educated guess" at that sampling rate. In the examples above, the difference was 150 f/s, not 1 f/s! Yet most MVs are reported to one-foot-per-second precision. For example, 2655 f/s, 3267 f/s, or whatever: that last 'foot per second' is always reported. Therefore, it is assumed (dangerously) that the machine measuring the velocity is actually capable of determining that level of precision. Mathematically speaking, what does that mean?

The good news is that we have a well-defined system: a 24" measuring pathway and a "known" velocity. (This is a bit circular, in that we're assuming we 'know' the velocity based on the measurement of the very device we're trying to figure out, but for the sake of this exercise, that assumption is acceptable.) Most chronograph manufacturers assume that no one will be shooting a projectile over 5000 f/s. Or at least that's their 'attitude': "Why do you need to go that fast?" and "Bullets won't hold together at velocities that fast." Both of which are 'stupid' comments based more on their ignorance and laziness than on reality. Nevertheless, their assumptions about us users set the ceiling on the sampling rate they 'have' to meet. Given a 2' distance, "How fast do I have to sample to measure to a precision of 1 foot per second for the fastest bullet that can be shot?" is their design constraint. (That is, if we believe they care about that level of precision. I know from personal conversations with them that they don't.) Therefore, I want to measure speed differentials of 1 f/s over a 2' interval when the projectile is doing 5000 f/s.

First, how many seconds does it take to move 1 foot if the projectile is doing 5000 f/s? Answer: 1/5000, or 0.0002 seconds. So for our chrony, a bullet traveling 5000 f/s will take 0.0004 seconds (4 ten-thousandths of a second, or 400 millionths of a second) to go from 'start' to 'finish' in our chrony's 2-foot path. Therefore, mathematically speaking, if we sample at least once every 4 ten-thousandths of a second, or 2500 times per second, we could theoretically 'catch' a bullet doing 5000 f/s.

In order to explain the 'theoretically' above, let's look at the actual process of measuring velocity with a discrete sampling device (a digital chronograph). Let's start with our minimal sampling rate: 2500 times per second, or once every 0.0004 seconds.

1) The chrony is 'on' and sampling.
2) The bullet leaves the barrel at 5000 f/s and heads for the front screen of the chrony.
3) Right AFTER the chrony samples the front screen, the bullet's shadow passes over it. In other words, the chrony "misses" the front of the bullet and never starts the timer. We know from above that at 5000 f/s and a sampling rate of only 2500 samples per second, the bullet travels two feet between samples, so a bullet would have to be two feet long to guarantee the front screen 'catches' it when its point "just misses" a sampling instant. If the front screen doesn't 'catch' the bullet, the timer doesn't start, and no time/velocity can be measured. We 'missed' that bullet.
We can keep doubling our sampling rate until the "distance traveled per sample" is short enough to catch the bullet, but how short is "short enough"? What's the shortest bullet you want to measure the velocity of? In fairness to chrony manufacturers, I'd say a .17 caliber bullet doing 5000 f/s. The shortest .17 caliber bullet I have is about 0.25" long. So, traveling at 5000 f/s, how long does it take a projectile to travel just under 0.25"? (It has to be slightly less than our bullet length so we can catch the 'tail' at worst.) Our projectile, doing 5000 f/s (60,000 inches per second), takes 0.000004167 seconds (a little more than 4 millionths of a second) to travel 0.25". Therefore, we MUST sample AT LEAST 240,000 times a second to 'catch' the smallest, fastest bullet.
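That detection requirement boils down to "shadow dwell time versus sample interval". Here's a quick Python sketch of it; the 5000 f/s ceiling and the 0.25" bullet length are the same assumptions used above:

```python
# Minimum sampling rate so that at least one sample lands while the
# bullet's shadow is over a sensor: the sample interval must not exceed
# the time the shadow is present (bullet length / velocity).

def min_detect_rate(velocity_fps, bullet_len_in):
    """Samples per second needed to guarantee a bullet this short is seen."""
    inches_per_second = velocity_fps * 12.0
    shadow_time = bullet_len_in / inches_per_second  # seconds over the sensor
    return 1.0 / shadow_time

# The two-foot-long "bullet" implied by the 2500-samples-per-second example:
print(round(min_detect_rate(5000.0, 24.0)))
# The short .17-caliber case:
print(round(min_detect_rate(5000.0, 0.25)))
```

The first call reproduces the 2500-samples-per-second figure, and the second the 240,000-samples-per-second figure.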

Again, that sampling rate is slower even than 'pedestrian', in our current digital world. And yet, based on my conversations with them, I'd bet dollars to doughnuts that the light-based chrony manufacturers use the absolute minimum sampling rate they can 'get away with'.
Now that we have an estimate of the minimum sampling rate, what is the maximum error that our 'minimal' chrony can make in measuring the velocity of a small, fast bullet? Let's set our new system up.
1) Sampling rate of 240000 times per second.
2) Bullet length 1.25".
3) Expected muzzle velocity about 3000 f/s.
IF the chrony catches the very tip of the bullet's nose at the start screen, the worst it can do at the stop screen is trigger one full sample interval late. At 240,000 samples per second, one interval is about 4.17 millionths of a second, during which a bullet doing 3000 f/s (36,000 inches per second) travels 0.15". The true transit time over the 24" path is about 667 millionths of a second, so being off by one sample changes the computed velocity by 4.17/667, about 0.6%, or roughly 19 f/s. No self-respecting chrony manufacturer should tolerate that level of error due to sampling rate alone. However, it's simple to calculate what rate it would take to get to ONE FOOT per second of precision. From r = d/t, the allowable timing error is the distance times 1 f/s divided by the velocity squared: 2/3000², or about 0.22 millionths of a second, which means sampling about 4.5 million times per second. If you want 1 f/s precision on a bullet doing 5000 f/s, the same formula gives 2/5000², or 0.08 millionths of a second: 12.5 million samples per second, or 12.5 megahertz. Still nothing exotic in today's digital world.
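One way to sanity-check these quantization numbers is with a short Python sketch. This is a simplified model that charges the entire timing error to a stop screen triggering one full sample interval late; the 2-foot screen spacing is an assumption:

```python
# Worst-case velocity error when the stop screen triggers one full sample
# interval late, plus the sampling rate needed for a given velocity precision.

SPACING_FT = 2.0  # assumed screen separation

def worst_case_error_fps(velocity_fps, rate_hz, spacing_ft=SPACING_FT):
    """F/s by which the reading is slow if the stop trigger is one sample late."""
    true_time = spacing_ft / velocity_fps          # real transit time, seconds
    late_time = true_time + 1.0 / rate_hz          # timer stopped one sample late
    return velocity_fps - spacing_ft / late_time

def rate_for_precision(velocity_fps, err_fps=1.0, spacing_ft=SPACING_FT):
    """Samples/s so one sample interval shifts the reading by <= err_fps.

    From v = d/t: dv is about v**2 * dt / d, so dt = d * err / v**2, rate = 1/dt.
    """
    return velocity_fps ** 2 / (spacing_ft * err_fps)

print(f"{worst_case_error_fps(3000.0, 240_000.0):.1f} f/s")  # error at 240 kHz
print(f"{rate_for_precision(3000.0) / 1e6:.1f} MHz")         # for 1 f/s at 3000 f/s
print(f"{rate_for_precision(5000.0) / 1e6:.1f} MHz")         # for 1 f/s at 5000 f/s
```

Under these assumptions the 240,000-samples-per-second chrony can be off by a bit under 19 f/s at 3000 f/s, and 1 f/s reporting calls for sampling in the low megahertz, still cheap by modern standards.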

We're not finished with 'potential sources of error'. Some light-based chronographs (LBCs from here on) use "sky screens" to diffuse the light from the sun on bright, cloudless days. Hmm, why would they do that... do you think? It's a rhetorical question. They KNOW that errors in velocity measurements occur when the bullet's shadow is 'different' from one shot to the next due to changing light conditions. Those LBCs that don't use sky screens are at the mercy of clouds AND the angle of the sun. If the ANGLE of the sun relative to the chrony changes over the course of shooting, the length of the shadow cast by the bullet, AND its intensity (think about a sharp, pointed nose's shadow, and when that shadow's point is big enough to trigger the device), will change. As you can see from the above, that matters. How much is difficult to quantify, but it matters nonetheless.

Welcome to metrology, the study of measuring things. It is a widely ignored science, but a science nonetheless. This was a pet peeve of the late j0e_bl0ggs, and one I share with him, albeit not with the same 'intensity'. Too many people simply 'take the word' of the manufacturers of the tools they use. There is great danger in doing that. As head of the state of Alaska's Sonar and Technical Services, I was constantly confronted with the willingness of fisheries biologists to accept whatever the sonar manufacturers told them as far as the specs and "how it works". Way, WAY too often, it was barely short of a 'pack of lies'. I even called out the biggest fisheries sonar manufacturer in the world at an international symposium for selling the state of Alaska A LOAD OF CRAP, even if the gear had worked as spec'd. Which it didn't! It was deplorable! This gear was being used to manage 'your' fisheries. Think about how much less pressure there is when the customers are "just" hunters and fishermen. I PROMISE you, "those guys" are deceiving you. Some to a small degree, but some to a huge degree. What do they have to lose? Are you going to call them out? At what forum? There are no "international symposia" on chronograph performance specifications and use. (Neither are there any for consumer fisheries sonar systems, which are insanely deceitful!) By the way, I got that sonar manufacturer to 1) make a public apology, and 2) TAKE BACK all of their gear, including covering the cost of pulling a large research vessel out of the water to remove the installed gear. The cost to them was over a million dollars. The state of Alaska was 'annoyed' with me, because they were "doing good science" with that gear. Shows you how much people WANT TO BE LIED TO.

Finally, there are chronographs that do not use light. Oehler makes one that uses sound, and another that uses radar. If you have to ask "how much", TRUST ME, you can't afford them. 20 years ago they STARTED at $2,000 for the "cheap" models. Then there's the MagnetoSpeed, a good chronograph that uses the "magnetic" characteristics of bullets to measure MV (actually, it's NOT magnetism but INDUCTION; tell me, how magnetic are copper and lead, exactly?). I have one. They're good. Once set up properly, they rarely 'miss' a bullet. HOWEVER, and it's a big "however", they have one significant flaw: the bayonet-style strap mount on the muzzle. This has a BIG effect on the rifle's precision. Seriously. Which means you can't get MV data AND precision data at the same time. You CAN get MV and precision data simultaneously from LBCs.

Be nicer than necessary.