Considerations about the Cost of Conveyor Belting: Discussing Re-evaluated Conveyor Belt Safety Factors

15 Oct. 2022

 


By R. Steven

A new generation of conveyor belts and belt monitoring technology reduces conveyor belt capital and operating costs by using lower belt strengths than previously thought possible. Key factors are improvements in splice performance, energy-efficient rubbers and real-time belt condition surveillance systems.
(From the archive of "bulk solids handling", article published in Vol. 36 (2016) No. 5, ©2016 bulk-online.com)

Lowering a belt's strength and safety factor reduces belt cost. But how can you reduce cost without compromising quality or increasing risk? This article explains how recent improvements in steel cord conveyor belt technology, specifically higher dynamic splice efficiency (used together with real-time belt condition monitoring) and low rolling resistance rubbers, make lower belt strengths a safe option.

Introduction

Fig. 1 shows the consequences of a belt break on an overland conveyor, where over 1 km of belt accumulated at the bottom of a slope after the failure. It took two weeks of downtime to sort it out. To avoid such situations it is important to consider belt safety factors. Safety factors for steel cord belts were originally developed from many years of experience (the first steel cord belt was installed in 1942) and were traditionally set to conservative values, the most common being 6.67:1, i.e. a maximum running tension of 15% of the belt breaking strength. In other words, the breaking strength of the belt is 6.67 times the maximum running tension of the conveyor. If we could find a safe way to use a lower safety factor, we could use a lower rated, lower cost belt. Fig. 2 shows the relationship between safety factor and required belt strength.

In Fig. 2, the Y-axis on the right of the chart considers a conveyor with a maximum running tension of 150 kN/m. Using conventional wisdom and a safety factor of 6.67:1, the required belt would be an ST1000, that is, a belt with a breaking strength of 1000 kN/m. On the same Y-axis scale, the chart then shows the required belt strength if we were able to safely reduce the safety factor to the value indicated on the X-axis. The "Safety Factor Index" on the left Y-axis is the multiplier applied to the belt strength rating obtained with the conventional 6.67:1 safety factor to find the required strength at a different safety factor. For example, say we have an existing ST5000 belt that was designed using a 6.67:1 safety factor. If we were able to use a 5.0:1 safety factor, we could use a 0.75 × ST5000 = ST3750 belt.
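The relationship behind Fig. 2 is simple arithmetic and can be reproduced directly. The short sketch below (the function names and the examples' rounding to standard ST ratings are illustrative assumptions, not taken from the design charts) computes the required belt strength for a given running tension and safety factor, and the scaling factor between two safety factors.

```python
# Minimal sketch of the Fig. 2 relationships; illustrative only.

def required_belt_strength(running_tension_kN_m: float, safety_factor: float) -> float:
    """Minimum breaking strength (kN/m) = maximum running tension x safety factor."""
    return running_tension_kN_m * safety_factor

def safety_factor_index(new_sf: float, base_sf: float = 6.67) -> float:
    """Multiplier applied to a belt rated at base_sf to re-rate it at new_sf."""
    return new_sf / base_sf

# Example from the text: 150 kN/m running tension at 6.67:1 -> ~ST1000
print(required_belt_strength(150, 6.67))    # 1000.5 kN/m, i.e. an ST1000 belt

# Example from the text: an ST5000 designed at 6.67:1, re-rated at 5.0:1
print(5000 * safety_factor_index(5.0))      # ~3750 kN/m, i.e. an ST3750 belt
```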

Not All Safety Factors are Created Equal!

The generally accepted definition of belt Safety Factor is the minimum breaking strength of a belt divided by the maximum belt running tension. (Note that DIN 22101-2002 also defines two belt safety factors that when multiplied together, relate to the ratio of the maximum belt edge running tension and the measured dynamic splice strength of the belt). However, some confusion has arisen in the industry due to different definitions of the minimum belt strengths to be used.

Around the world we have a number of international standards for steel cord belts which define the belt construction for standard steel cord belt strengths. They define the number of cords and the minimum cord strengths. In some cases, as in the now obsolete DIN 22131, some additional strength (around 10%) was added to the nominal belt strength to ensure a more robust safety margin. Fig. 3, on the right-hand side, shows the actual belt design strengths compared to the stated "ST" (kN/m) rating for Germany's DIN 22131, Europe's ISO 15236-2 (non-tabular), Australia's AS1333 and China's GB/T9770 for two different belt widths, 1000 mm and 2000 mm. The X-axis is the nominal belt strength or "ST" rating in kN/m. The Y-axis is the actual design belt strength expressed as a percentage of nominal. In practice the belt strengths would be even higher, as the actual cord strengths supplied will always be slightly stronger than the specified minimum.
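As an illustration of why actual design strengths exceed the nominal rating, the short sketch below computes a belt's design strength per unit width from its cord schedule and compares it with the nominal ST rating. The cord count, minimum cord breaking strength and belt width used here are hypothetical, not values taken from any of the standards named above.

```python
# Illustrative only: the cord schedule below is hypothetical; real values
# come from the belt tables of the relevant standard.

def design_strength_kN_per_m(n_cords: int, min_cord_break_kN: float,
                             belt_width_m: float) -> float:
    """Belt design strength per unit width, from cord count and cord strength."""
    return n_cords * min_cord_break_kN / belt_width_m

nominal_ST = 2500   # nominal rating, kN/m
actual = design_strength_kN_per_m(n_cords=100, min_cord_break_kN=52.0,
                                  belt_width_m=2.0)
print(f"actual design strength: {actual:.0f} kN/m "
      f"({100 * actual / nominal_ST:.0f}% of nominal)")   # ~2600 kN/m, ~104%
```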

The main point here is that it is important for the end user to understand that there are significant variations between design belt strengths conforming to different standards. If not correctly applied, these differences can lead to incorrect comparisons and conclusions.

For example, Table 1 shows a number of well-known conveyor belt installations where the reported and the actual safety factors are compared. The reported safety factors are all based on the “nominal ST” belt strength, not the actual belt strength. The conveyor slope is also indicated in the last column.

From Table 1, we can make the observations that:

  1. Several belts have been operating successfully for many years with safety factors between 5.1:1 and 5.5:1.
  2. Belts with slopes greater than six degrees typically have safety factors of 6.0:1 or higher.

Although our sample size is small, there are good reasons to use higher safety factors on steeply inclined belts. Inclined conveyors have considerably more stored potential energy in the belt than horizontal conveyors, even when the belt is empty.

How Can I Get Lower Cost Belts?

In order to get lower cost belts without compromising belt quality there are two basic approaches.

  1. Use lower belt safety factors based on higher dynamic splice efficiencies and 24/7 real-time belt condition monitoring.
  2. Use low rolling resistance (“LRR”) rubber to reduce operating belt tension.

1. Lower Safety Factors

Lower safety factors for a given application can be achieved through better dynamic splice efficiency. This is based on the premises that:

  1. Maximum belt tension is related to dynamic splice efficiency;
  2. Cord fatigue life exceeds dynamic splice life;
  3. Dynamic splice life exceeds belt wear life;
  4. 24/7 cord monitoring can avert catastrophes.

Dynamic splice efficiencies for steel cord belt splices are most commonly determined by the test defined in DIN 22110 Part 3. In this standard the dynamic splice efficiency (also known as the relative reference strength of the splice) is defined as the maximum test load at which the splice survives 10 000 load cycles on a 2-pulley test rig. Each load cycle raises the test load from 6.6% of belt breaking strength to the peak test load (e.g. 50% of break) in 42 seconds and returns it to 6.6% in 8 seconds, completing a 50 second cycle. During each 50 second cycle, the test belt splice loop also completes 18 revolutions on the fixture. Typically, four tests are conducted at different peak test loads. For each test, the number of cycles achieved before the splice fails is plotted against the peak test load as a percentage of belt breaking strength. A trend line drawn through the four points generates a characteristic fatigue curve, also called a Wöhler curve. The significance of the Wöhler curve is that it relates the lab test results to anticipated field performance.
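To illustrate how such a curve is used, the sketch below fits a simple load-versus-log(cycles) trend line through four hypothetical test points and reads off the load sustained at the 10 000 cycle reference point. Both the data and the log-linear form of the fit are assumptions made for illustration; DIN 22110 Part 3 does not prescribe this particular regression.

```python
import numpy as np

# Hypothetical four-point test series: (cycles to splice failure, peak test
# load as % of belt breaking strength). Illustrative numbers only.
cycles = np.array([1500, 4000, 12000, 40000])
load_pct = np.array([62.0, 57.0, 52.0, 47.0])

# Fit load% as a linear function of log10(cycles) -- one common way of
# drawing a Woehler-type trend line through fatigue data.
slope, intercept = np.polyfit(np.log10(cycles), load_pct, 1)

# Load sustainable at the 10 000-cycle reference point
reference_strength = slope * np.log10(10_000) + intercept
print(f"estimated dynamic splice efficiency: {reference_strength:.1f}% of break")
```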

The DIN 22110 Part 3 splice test was developed by Hannover University in the 1980s. For reference, in 1985 the Prosper Haniel ST7500 belt achieved 36.7% dynamic splice efficiency, which was thought to be good at the time. In service it had a 6.0:1 safety factor (Table 1). Since then, a new generation of belts with new splice materials and splice patterns has been developed and today splice efficiencies of 50% and over are commonplace. Fig. 4 shows Wöhler splice fatigue curves for splices with dynamic splice efficiencies of 50% and 60%. It also plots:

  1. The running tension (= 15% of break) for a conveyor running at a 6.67:1 safety factor.
  2. The accelerating tension for the same belt assuming an additional 40% belt tension.

Significantly, the difference between the accelerating tension and the splice fatigue curves at 10 000 load cycles illustrates the reserve tension left in the belt to accommodate belt degradation factors and accidental belt damage. Degradation factors include belt aging, misalignment, pulley bending, splice construction errors, etc. (For example, in a study conducted by Syncrude in Canada on a movable conveyor, deliberately misaligning sets of idler frames increased the power requirement, and hence the belt tension, by 14%.) From Fig. 4 it is clear that a 60% splice efficiency increases this reserve tension. That gives us confidence to consider raising the running tension from, say, 15% of belt break to 20% of belt break, that is, reducing the belt safety factor (SF) from 6.67:1 to 5.0:1, as there is still adequate reserve belt tension for degradation and accidental damage considerations, Fig. 5.

This figure shows the 5.0:1 acceleration tension line as an additional 40% on top of the 5.0:1 running tension. This surcharge can be reduced with the use of a soft start fluid coupling or through VFD or DC drive controls.
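To make the reserve-tension argument concrete, the sketch below works through the percentages used above. The 40% acceleration surcharge and the splice efficiencies are the figures quoted in the text; the arithmetic itself is a simplified illustration, not a design calculation.

```python
# Simplified illustration of the reserve-tension comparison in Figs. 4 and 5.
# All quantities are expressed as a percentage of belt breaking strength.

def reserve_tension(safety_factor: float, splice_efficiency_pct: float,
                    accel_surcharge: float = 0.40) -> float:
    """Splice strength at 10 000 cycles minus the accelerating tension."""
    running_pct = 100.0 / safety_factor              # running tension, % of break
    accel_pct = running_pct * (1 + accel_surcharge)  # accelerating tension
    return splice_efficiency_pct - accel_pct

# Conventional 6.67:1 design with a 50% splice efficiency
print(reserve_tension(6.67, 50.0))   # ~29% of break held in reserve

# Reduced 5.0:1 design with a 60% splice efficiency
print(reserve_tension(5.0, 60.0))    # ~32% of break held in reserve
```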

DIN 22101-2011 offers a convenient approach to determining a suggested belt safety factor for any given application (Ref. 1). In this case the degradation factors are embedded in two safety factors called S0 and S1, and suggested values for each are offered in tabular form. Belt dynamic splice efficiency is included in the method, with default values of 45% for steel cord belts and 30% to 35% for different types of fabric belt splices. The method also takes the maximum belt running tension as the maximum tension at the edge of the belt in the transitions rather than the average belt running tension. Methods to calculate the belt edge tensions and default values are also included. A summary of this SF method is given below. Further details can be obtained from the standard.
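Based only on the structure described above (maximum belt edge tension, the two safety factors S0 and S1, and the dynamic splice efficiency), a sketch of the resulting strength requirement might look like the following. The formula is a simplified reading of that description, not a reproduction of the standard, and the input values are illustrative; consult DIN 22101-2011 for the actual procedure and tabulated values.

```python
# Simplified sketch of a DIN 22101-style strength requirement: the required
# nominal breaking strength grows with the maximum edge tension and S0*S1,
# and shrinks as the dynamic splice efficiency improves. Illustrative only.

def required_nominal_strength(edge_tension_kN_m: float, s0: float, s1: float,
                              splice_efficiency: float) -> float:
    """Minimum nominal belt breaking strength in kN/m (simplified)."""
    return edge_tension_kN_m * s0 * s1 / splice_efficiency

# Hypothetical inputs: 170 kN/m edge tension, S0 = 1.1, S1 = 1.7, and the
# 45% default splice efficiency for steel cord belts mentioned above.
print(required_nominal_strength(170.0, 1.1, 1.7, 0.45))   # ~706 kN/m
```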

As lower safety factors are implemented, typically smaller and/or fewer steel cords are employed in the belt design. This makes the belt more vulnerable to accidental damage from, say, impact or trapped material. In order to reduce the risk of these events having catastrophic consequences for lifeline conveyors it is highly recommended that the condition of the cords be monitored 24/7. Such systems are readily available using well established technology.

Recent field examples justify the cost. For example, a lifeline conveyor in a copper mine in Chile broke in two when the level controller of the stockpile it was feeding failed. The stockpile engulfed the head pulley and the additional material drag on the belt broke the belt in two at a section with 20% of the cords already broken from a previous event. Downtime cost to the mine was stated as US $50 million. Effective cord condition monitoring could have avoided this.

2. Low Rolling Resistance (LRR) Rubber

Energy-efficient rubber technology has now been available for conveyor belts for 17 years. The technology is well established and LRR belt manufacturers are continuing to improve the energy savings offered. The technology has been well reported previously (Refs. 2 to 11), but the following is a brief recap for those new to the industry.

Fig. 6 illustrates the source of idler indentation rolling resistance. The pulley cover rubber of a belt deforms on every idler under the load carried by the belt. The deformation creates resistance to belt motion and the action of the rubber being compressed generates heat. This heat is lost energy. On a long horizontal overland conveyor this resistance can easily be 60% of the total belt resistance and belt tension. Rubbers can be designed to minimize this effect and so lose less heat energy. They can reduce the lost energy by up to 40%. The effect is known as indentation rolling resistance (IRR) and several research institutions have developed tests to measure it. A German standard, DIN 22123, describes one test method to measure it. An Australian standard is currently being developed for a similar test method.
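Using the figures quoted above (IRR at roughly 60% of the total resistance of a long horizontal overland conveyor, and up to a 40% reduction of that loss with LRR rubber), a quick back-of-the-envelope estimate of the effect on total belt resistance, and hence tension and drive power, looks like this; both percentages are taken from the text and the arithmetic is only indicative.

```python
# Back-of-the-envelope estimate using the percentages quoted above.
irr_share_of_total = 0.60   # IRR as a fraction of total belt resistance
irr_reduction_lrr = 0.40    # reduction of the IRR loss achievable with LRR rubber

total_resistance_saving = irr_share_of_total * irr_reduction_lrr
print(f"reduction in total belt resistance: {total_resistance_saving:.0%}")
# -> 24%, which translates into lower belt tension and lower drive power
```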

These are both “full scale” belt tests where belt samples are built into an endless loop and driven at a constant speed on a 2-pulley test fixture. The indentation rolling resistance is measured by means of an instrumented idler roll running against the pulley cover of the belt and a simulated normal load is applied through the belt to the idler. The entire fixture is located inside a walk-in environmental chamber. Ambient temperature, normal idler load, belt speed, idler diameter and belt pulley cover thickness are all controlled variables. Indentation rolling resistance is measured for each set of conditions in N/m of belt width.

Smaller scale methods are also used where the rheological properties of the pulley cover rubber are measured. From these tests, the energy absorbed for any given temperature, rubber strain and frequency of strain can be determined. Studies have shown relatively good correlation between the small scale and the large scale test methods.

There are two main benefits from using a low rolling resistance rubber on a conveyor belt.

  1. Lower belt tension
  2. Lower operational power/energy

The following example illustrates these benefits:

An existing 2300 m long overland conveyor with 5.5 m of lift, transporting 4600 stph of coal at 5.1 m/s, was designed in 2007 using conventional CEMA design methodology. The belt is a 72" (1829 mm) wide ST2500. In Figs. 7 and 8, the original design is shown as the red line. Fig. 7 shows the required belt strength at different operating temperatures based on a 6.67:1 safety factor and Fig. 8 shows the power utilized at different temperatures. For each chart, conveyor belt characteristic curves are shown for two different pulley cover rubber types:

  1. Standard ARPM (formerly RMA) Grade I (blue line),
  2. Energy Optimized Belt (EOB) LRR rubber (pink line).

 

The charts show:

  1. The significant temperature dependency of each pulley cover rubber.
  2. That the new technology rubbers permit lower belt strengths.
  3. That the new technology rubbers use less power at all temperatures than the CEMA calculation.

As a consequence of the analysis the required belt strengths can be summarized as follows:

  1. ST2220 for a standard ARPM Grade I belt.
  2. ST2170 for an EOB LRR belt.

The belt strength for each rubber type is dictated by the lowest operating temperature. In this case, the energy efficient EOB LRR rubber offers a small (2.2%) reduction in belt strength compared to a standard ARPM Grade I belt. As the additional cost of the EOB LRR rubber may be more than 2.2%, the belt’s operating energy cost savings must be calculated and considered in the choice.

In order to calculate the annual cost savings we determine the power requirement for each month considering the average day and night temperatures. Table 2 shows the monthly maximum and minimum temperatures for the conveyor location.

Based on the facility's cost of power ($0.05/kWh) and considering the total power requirement over the annual temperature profile, the annual operating cost for each option is:

  1. ARPM Grade I: $247,959
  2. EOB LRR: $184,213

Annual savings potential: $63,746

The cost savings for the EOB LRR of approximately $637,460 over 10 years, suitably adjusted for inflation, should be considered together with the initial belt capital cost of each belt option in order to properly evaluate the maximum potential cost savings.
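The annual figures above can be reproduced with a simple month-by-month calculation. In the sketch below, the monthly average drive power values are hypothetical placeholders for the values read from Fig. 8 at each month's average temperature (Table 2); only the $0.05/kWh power price is taken from the text.

```python
# Sketch of the annual energy-cost comparison. Monthly power figures are
# hypothetical placeholders, not the actual values from Fig. 8 / Table 2.

HOURS_PER_MONTH = 8760 / 12
PRICE_PER_KWH = 0.05   # US$/kWh, as stated for the facility

# Hypothetical monthly average drive power in kW (Jan..Dec)
power_arpm_kw = [600, 595, 585, 570, 555, 545, 540, 545, 555, 570, 585, 595]
power_eob_kw  = [445, 442, 436, 425, 414, 406, 402, 406, 414, 425, 436, 442]

def annual_cost(monthly_kw):
    """Annual energy cost in US$ for a list of monthly average power draws."""
    return sum(p * HOURS_PER_MONTH * PRICE_PER_KWH for p in monthly_kw)

saving = annual_cost(power_arpm_kw) - annual_cost(power_eob_kw)
print(f"annual saving: ${saving:,.0f}; "
      f"10-year saving (before inflation adjustment): ${10 * saving:,.0f}")
```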

As in the discussion on safety factors, if a lower strength belt is chosen, smaller and/or fewer steel cords are employed in the belt design and the belt is more vulnerable to damage from impact. In addition to 24/7 cord condition monitoring, another highly developed technology that should be considered to reduce potential impact damage and excessive belt wear is engineered chute design. This subject is discussed in Ref. 1. ■

References:

About the Author

Dr. Robin Steven
PHOENIX Conveyor Belt Systems, USA
