Pakistan cricket coach Bob Woolmer dies at age 58

Sunday, March 18, 2007

Bob Woolmer, Pakistan’s cricket coach, died at a university hospital in Kingston, Jamaica, earlier today. Woolmer was found unconscious in his room at the Pegasus Hotel at 10:45 Sunday morning, only hours after Pakistan’s Cricket World Cup exit at the hands of Ireland.

After being found by hotel workers in his room, he was rushed to the emergency department at the University Hospital in Kingston, but doctors were not able to save him. Team spokesman Parvez Mir reported that when he was found, “He had blood on him and there was vomit on the walls.”

Mir also told media that Woolmer suffered from a medical condition, but that it was too early to say whether it had anything to do with his death.

Woolmer played 19 Tests for England and was the coach for Warwickshire and South Africa before becoming coach for Pakistan in 2004. He was to remain coach until after the 2007 World Cup, and was tipped to replace Duncan Fletcher as the coach of England after his contract with the PCB expired.

Retrieved from “https://en.wikinews.org/w/index.php?title=Pakistan_cricket_coach_Bob_Woolmer_dies_at_age_58&oldid=568402”

Buying Train Boxes

Train boxes are a smart way to store the parts of a wooden or metal toy train, such as a Lionel train, in one special location. Your dream model train, with its railway tracks and all the necessary assembly pieces, would easily fit into a large train box, which guarantees a special and organized home for the child’s prized possession.

Many train models come with instruction guide books for building special models from their large array of building parts. Unfortunately, kids tend to lose their toys, and if they do not have all the parts and items, there is no fun to be had, as a single part may be crucial to the train model. You should keep the accessories assembled in a single carrier box.

Whether you are off for a picnic in the woods or anywhere else, a train box that properly fits all of the parts is a must to keep your son or daughter happy.

You can find such boxes online through many toy and merchandise websites. Buying online is an easy and affordable method of purchase that is recommended.

Building trains part by part is a must-have activity for every boy’s toy collection. It is a dream most boys have of owning or designing their very own railway tracks. These are items that need to be kept in close-knit order so they can be assembled properly, reaching the height of the fun of the game. No building game is complete when a single part is missing, and this can result in a frustrating search or a sense of defeat. So it is a must that we keep the necessary parts safe in the boxes!

Wikinews interviews Joe Schriner, Independent U.S. presidential candidate

Saturday, April 17, 2010

Journalist, counselor, painter, and US 2012 Presidential candidate Joe Schriner of Cleveland, Ohio took some time to discuss his campaign with Wikinews in an interview.

Schriner previously ran for president in 2000, 2004, and 2008, but failed to gain much traction in the races. He announced his candidacy for the 2012 race immediately following the 2008 election. Schriner refers to himself as the “Average Joe” candidate, and advocates a pro-life and pro-environmentalist platform. He has been the subject of numerous newspaper articles, and has published public policy papers exploring solutions to American issues.

Wikinews reporter William Saturn talks with Schriner and discusses his campaign.

Retrieved from “https://en.wikinews.org/w/index.php?title=Wikinews_interviews_Joe_Schriner,_Independent_U.S._presidential_candidate&oldid=4497624”

Ontario Votes 2007: Interview with Family Coalition Party candidate Bob Innes, Hamilton East—Stoney Creek

Monday, October 1, 2007

Robert (Bob) Innes is running for the Family Coalition Party in the Ontario provincial election, in the Hamilton East—Stoney Creek riding. Wikinews’ Nick Moreau interviewed him regarding his values, his experience, and his campaign.

Stay tuned for further interviews; every candidate from every party is eligible, and will be contacted. Expect interviews from Liberals, Progressive Conservatives, New Democratic Party members, Ontario Greens, as well as members from the Family Coalition, Freedom, Communist, Libertarian, and Confederation of Regions parties, as well as independents.

Retrieved from “https://en.wikinews.org/w/index.php?title=Ontario_Votes_2007:_Interview_with_Family_Coalition_Party_candidate_Bob_Innes,_Hamilton_East—Stoney_Creek&oldid=1978970”

Keep your eyes peeled for cosmic debris: Andrew Westphal about Stardust@home

Sunday, May 28, 2006

Stardust is a NASA space capsule that collected samples from comet 81P/Wild (also known as “Wild 2”) in deep space and landed back on Earth on January 15, 2006. It was decided that a collaborative online review process would be used to “discover” the microscopically small samples the capsule collected. The project is called Stardust@home. Unlike distributed computing projects like SETI@home, Stardust@home relies entirely on human intelligence.

Andrew Westphal is the director of Stardust@home. Wikinews interviewed him for May’s Interview of the Month (IOTM) on May 18, 2006. As always, the interview was conducted on IRC, with multiple people asking questions.

Some may not know exactly what Stardust or Stardust@home is. Can you explain more about it for us?

Stardust is a NASA Discovery mission that was launched in 1999. It is really two missions in one. The primary science goal of the mission was to collect a sample from a known primitive solar-system body, a comet called Wild 2 (pronounced “Vilt-two” — the discoverer was German, I believe). This is the first US “sample return” mission since Apollo, and the first ever from beyond the moon. This gives a little context. By “sample return” of course I mean a mission that brings back extraterrestrial material. I should have said above that this is the first “solid” sample return mission — Genesis brought back a sample from the Sun almost two years ago, but Stardust is also bringing back the first solid samples from the local interstellar medium — basically this is a sample of the Galaxy. This is absolutely unprecedented, and we’re obviously incredibly excited. I should mention parenthetically that there is a fantastic launch video — taken from the POV of the rocket on the JPL Stardust website — highly recommended — best I’ve ever seen — all the way from the launch pad, too. Basically interplanetary trajectory. Absolutely great.

Is the video available to the public?

Yes [see below]. OK, I digress. The first challenge that we have before we can do any kind of analysis of these interstellar dust particles is simply to find them. This is a big challenge because they are very small (on the order of a micron in size) and are somewhere (we don’t know where) on a HUGE collector — at least on the scale of the particle size — about a tenth of a square meter. So

We’re right now using an automated microscope that we developed several years ago for nuclear astrophysics work to scan the collector in the Cosmic Dust Lab in Building 31 at Johnson Space Center. This is the ARES group that handles returned samples (Moon Rocks, Genesis chips, Meteorites, and Interplanetary Dust Particles collected by U2 in the stratosphere). The microscope collects stacks of digital images of the aerogel collectors in the array. These images are sent to us — we compress them and convert them into a format appropriate for Stardust@home.

Stardust@home is a highly distributed project using a “Virtual Microscope” that is written in HTML and JavaScript and runs in most browsers — no downloads are required. Using the Virtual Microscope, volunteers can search the collector for the tracks of the interstellar dust particles.

How many samples do you anticipate being found during the course of the project?

Great question. The short answer is that we don’t know. The long answer is a bit more complicated. Here’s what we know. The Galileo and Ulysses spacecraft carried dust detectors onboard that Eberhard Gruen and his colleagues used to first detect and then measure the flux of interstellar dust particles streaming into the solar system. (This is a kind of “wind” of interstellar dust, caused by the fact that our solar system is moving with respect to the local interstellar medium.) Markus Landgraf has estimated the number of interstellar dust particles that should have been captured by Stardust during two periods of the “cruise” phase of the interplanetary orbit in which the spacecraft was moving with this wind. He estimated that there should be around 45 particles, but this number is very uncertain — I wouldn’t be surprised if it is quite different from that. That was the long answer! One thing that I should say…is that like all research, the outcome of what we are doing is highly uncertain. There is a wonderful quote attributed to Einstein — “If we knew what we were doing, it wouldn’t be called ‘research’, would it?”

How big would the samples be?

We expect that the particles will be of order a micron in size. (A millionth of a meter.) When people are searching using the virtual microscope, they will be looking not for the particles, but for the tracks that the particles make, which are much larger — several microns in diameter. Just yesterday we switched over to a new site which has a demo of the VM (virtual microscope) I invite you to check it out. The tracks in the demo are from submicron carbonyl iron particles that were shot into aerogel using a particle accelerator modified to accelerate dust particles to very high speeds, to simulate the interstellar dust impacts that we’re looking for.

And that’s on the main Stardust@home website [see below]?

Yes.

How long will the project take to complete?

Partly the answer depends on what you mean by “the project”. The search will take several months. The bottleneck, we expect (but don’t really know yet), is in the scanning — we can only scan about one tile per day and there are 130 tiles in the collector…. These particles will be quite diverse, so we’re hoping that we’ll continue to have lots of volunteers collaborating with us on this after the initial discoveries. It may be that the 50th particle that we find will be the real Rosetta stone that turns out to be critical to our understanding of interstellar dust. So we really want to find them all! Enlarging the idea of the project a little, beyond the search, though, is to actually analyze these particles. That’s the whole point, obviously!

And this is the huge advantage with this kind of a mission — a “sample return” mission.

Most missions do things quite differently: you have to build an instrument to make a measurement, and that instrument design gets locked in several years before launch, practically guaranteeing that it will be obsolete by the time you launch. Here exactly the opposite is true. Several of the instruments that are now being used to analyze the cometary dust did not exist when the mission was launched. Further, some instruments (e.g., synchrotrons) are the size of shopping malls — you don’t have a hope of flying these in space. So we can and will study these samples for many years. AND we have to preserve some of these dust particles for our grandchildren to analyze with their hyper-quark-gluon plasma microscopes (or whatever)!

When do you anticipate the project to start?

We’re really frustrated with the delays that we’ve been having. Some of it has to do with learning how to deal with the aerogel collectors, which are rougher and more fractured than we expected. The good news is that they are pretty clean — there is very little of the dust that you see on our training images — these were deliberately left out in the lab to collect dust so that we could give people experience with the worst case we could think of. In learning how to do the scanning of the actual flight aerogel, we uncovered a couple of bugs in our scanning software — which forced us to go back and rescan. Part of the other reason for the delay was that we had to learn how to handle the collector — it would cost $200M to replace it if something happened to it, so we had to develop procedures to deal with it, and add several new safety features to the Cosmic Dust Lab. This all took time. Finally, we’re distracted because we also have many responsibilities for the cometary analysis, which has a deadline of August 15 for finishing analysis. The IS project has no such deadline, so at times we had to delay the IS (interstellar, sorry) in order to focus on the cometary work. We are very grateful to everyone for their patience on this — I mean that very sincerely.

And rest assured that we’re just as frustrated!

I know there will be a “test” that participants will have to take before they can examine the “real thing”. What will that test consist of?

The test will look very similar to the training images that you can look at now. But.. there will of course be no annotation to tell you where the tracks are!

Why did NASA decide to take the route of distributed computing? Will they do this again?

I wouldn’t say that NASA decided to do this — the idea for Stardust@home originated here at U. C. Berkeley. Part of the idea of course came…

If I understand correctly it isn’t distributed computing, but distributed eyeballing?

…from the SETI@home people who are just down the hall from us. But as Brian just pointed out, this is not really distributed computing like SETI@home — the computers are just platforms for the VM, and it is human eyes and brains that are doing the real work, which makes it fun (IMHO).

That said… There have been quite a few people who have expressed interest in developing automated algorithms for searching. Just because WE don’t know how to write such an algorithm doesn’t mean nobody does. We’re delighted at this and are happy to help make it happen.

Isn’t there a catch-22, in that the data you’re going to collect would be a prerequisite to automating the process?

That was the conclusion that we came to early on — that we would need some sort of training set to be able to train an algorithm. Of course you have to train people too, but we’re hoping (we’ll see!) that people are more flexible in recognizing things that they’ve never seen before and pointing them out. Our experience is that people who have never seen a track in aerogel can learn to recognize them very quickly, even against a big background of cracks, dust and other sources of confusion… Coming back to the original question — although NASA didn’t originate the idea, they are very generously supporting this project. It wouldn’t have happened without NASA’s financial support (and of course access to the Stardust collector). Did that answer the question?

Will a project like this be done again?

I don’t know… There are only a few projects for which this approach makes sense… In fact, I frankly haven’t run across another at least in Space Science. But I am totally open to the idea of it. I am not in favor of just doing it as “make-work” — that is just artificially taking this approach when another approach would make more sense.

How did the idea come up to do this kind of project?

Really desperation. When we first thought about this we assumed that we would use some sort of automated image recognition technique. We asked some experts around here in CS and the conclusion was that the problem was somewhere between trivial and impossible, and we wouldn’t know until we had some real examples to work with. So we talked with Dan Wertheimer and Dave Anderson (literally down the hall from us) about the idea of a distributed project, and they were quite encouraging. Dave proposed the VM machinery, and Josh Von Korff, a physics grad student, implemented it. (Beautifully, I think. I take no credit!)

I got to meet one of the Stardust directors in March during the Texas Aerospace Scholars program at JSC. She talked about searching for meteors in Antarctica, ones that were unblemished by Earth conditions. Is that our best chance of finding new information on comets and asteroids? Or will more Stardust programs be our best solution?

That’s a really good question. Much will depend on what we learn during this official “Preliminary Examination” period for the cometary analysis. Aerogel capture is pretty darn good, but it’s not perfect and things are altered during capture in ways that we’re still understanding. I think that much also depends on what question you’re asking. For example, some of the most important science is done by measuring the relative abundances of isotopes in samples, and these are not affected (at least not much) by capture into aerogel.

Also, she talked about how some of the agencies that they gave samples to had lost or destroyed 2-3 samples while trying to analyze them. One, in fact, had become statically charged and stuck to the side of the microscope lens, and they spent over an hour looking for it. Is that really our biggest danger? Giving out samples as a show of good faith, and not letting NASA examine all samples collected?

These will be the first measurements, probably, that we’ll make on the interstellar dust. There is always a risk of loss. Fortunately for the cometary samples there is quite a lot there, so it’s not a disaster. NASA has some analytical capabilities, particularly at JSC, but the vast majority of the analytical capability in the community is not at NASA but is at universities, government labs and other institutions all over the world. I should also point out that practically every analytical technique is destructive at some level. (There are a few exceptions, but not many.) The problem with meteorites is that except in a very few cases, we don’t know where they specifically came from. So having a sample that we know for sure is from the comet is golden!

I am currently working on my Bachelor’s in computer science, with a minor in astronomy. Do you see successes of programs like Stardust opening up more private space exploration positions for people such as myself, even though I’m not in the typical “space” fields of education?

Can you elaborate on your question a little — I’m not sure that I understand…

Well, while at JSC I learned that they mostly want engineers and a few science grads, and I worry that my computer science degree will not be very valuable, as the NASA rep told me only 1% of the applicants for their work study program are CS majors. I’m just curious as to your thoughts on whether CS majors will be more in demand now that projects like Stardust and the Mars missions have been great successes. Have you seen a trend towards more private businesses moving in that direction, especially with President Bush’s statement of Man on the Moon in 2015?

That’s a good question. I am personally not very optimistic about the direction that NASA is going. Despite recent successes, including but not limited to Stardust, science at NASA is being decimated.

I made a joke with some people at the TAS event that one day SpaceShipOne will be sent up to save stranded ISS astronauts. It makes me wonder what kind of private redundancy the US government is taking for future missions.

I guess one thing to be a little cautious about is that despite SpaceShipOne’s success, we haven’t had an orbital project that has been successful in that style of private enterprise. It would be nice to see that happen. I know that there’s a lot of interest…!

Now I know the answer to this question… but a lot do not… When samples are found, how will they be analyzed? Who gets the credit for finding the samples?

The first person who identifies an interstellar dust particle will be acknowledged on the website (and probably will be much in demand for interviews from the media!), will have the privilege of naming the particle, and will be a co-author on any papers that WE (at UCB) publish on the analysis of the particle. Also, although we are precluded from paying for travel expenses, we will invite those who discover particles AND the top performers to our lab for a hands-on tour.

We have some fun things, including micromachines.

How many people/participants do you expect to have?

About 113,000 have preregistered on our website. Frankly, I don’t have a clue how many will actually volunteer and do a substantial amount of searching. We’ve never done this before, after all!

One last thing I want to say … well, two. First, we are going to special efforts not to do any searching ourselves before we go “live”. It would not be fair to all the volunteers for us to get a jumpstart on the search. All we are doing is looking at a few random views to make sure that the focus and illumination are good. (And we haven’t seen anything — no surprise at all!) Also, the attitude for this should be “Have Fun”. If you’re not having fun doing it, stop and do something else! A good maxim for life in general!

Retrieved from “https://en.wikinews.org/w/index.php?title=Keep_your_eyes_peeled_for_cosmic_debris:_Andrew_Westphal_about_Stardust@home&oldid=4608360”

Glucosamine As An Arthritis Medication: Here Are The Real Facts

By Paul Elms

There are many different opinions on treatments for arthritis. A commonly used arthritis medication called glucosamine has recently been getting a lot of attention. Let’s take a closer look at this interesting product.

What is it? – Glucosamine is an amino sugar that is naturally found in the human body and in other animals. Commercially it is produced from the shells of crustaceans. It is widely held up as being of use to patients who suffer from osteoarthritis. This is a condition where the cartilage covering the ends of bones in a joint becomes worn down. This results in inflammation, pain and swelling. It is estimated that there are over 20 million sufferers of osteoarthritis in the US.

How does it work? – It is believed that glucosamine works by helping to restore damaged cartilage. This is important because cartilage provides protection and shock absorption in a joint. It may also have an anti-inflammatory effect. The product is commonly used in combination with chondroitin.

How is it taken? – For arthritis medication, glucosamine is taken in either liquid or tablet form. It is recommended that around 1500mg per day is taken. This may be achieved by taking a 500mg tablet 3 times daily. This has been found to be effective in clinical trials. Some products provide considerably less than this figure, so it is worth checking before making a purchase.

Is it licensed? – Glucosamine is not licensed by the FDA for use as a medical product in humans. However, it is currently classed as a dietary supplement and can be marketed provided that no specific claims to treat a specific condition are made.

Is it safe? – Glucosamine is taken internally as a food supplement. There have been numerous studies carried out that have concluded that taking glucosamine is safe. Because it is derived from the shells of crustaceans, those with a seafood allergy may want to proceed with caution.

Does it actually work? – There have been a number of scientific studies to evaluate the effectiveness of glucosamine. A growing number have concluded that there is a clear benefit of using glucosamine especially if it is used in combination with chondroitin. Other studies have failed to show a benefit over placebo.

Where can I get it? – Glucosamine is available from a number of suppliers, including health food stores, online retailers and mail order supplement suppliers. Ensure that you obtain the product from a reputable company. Some products contain very low levels of glucosamine and others use poorer quality ingredients.

I want to try it – If you would like to try glucosamine, speak to your doctor first. Do not think of it as a miracle cure, but as an addition to a sensible treatment plan that also includes suitable exercise, good diet and weight reduction (if needed). The recommended dose to take on a daily basis is a tablet containing 500mg of glucosamine and 400mg of chondroitin, three times each day. Do not stop taking any medication that your doctor has prescribed for you to take, as glucosamine is only intended as a supplement.

About the Author: Find out more about natural treatments for arthritis by visiting arthritispainadvice.com. You may be surprised at the latest thoughts on herbal treatment for arthritis. You can also find out about the latest tips on rheumatoid arthritis diets.

Source: isnare.com

Permanent Link: isnare.com/?aid=231551&ca=Medicines+and+Remedies

Greek debt deal reached

Saturday, March 27, 2010

A meeting in Brussels has produced a plan, supported by all 16 countries in the eurozone, to make available up to 22 billion euros in financing to support Greece, which is laden with debt.

The deal would come into force only if Greece was unable to borrow money from commercial lenders, and would require approval from all 16 eurozone countries. While no figures were included in the agreement, anonymous officials said the total package would be around 22 billion euros, of which European countries would provide two-thirds. The remainder would be supplied by the International Monetary Fund.

Germany and France were the architects of the document, which was subsequently approved by the other members of the eurozone. While it is seen as a partial retreat for countries such as France that previously opposed any IMF participation in the loans, it is nevertheless regarded as a breakthrough in negotiations. Germany had been insistent on relatively strong terms for the plan, much of which made it into the final version.

Despite the agreement, there are no plans for it to take immediate effect, as the Greek government has not requested financial aid, and officials said that they hoped the option would never have to be used. The president of the European Central Bank, Jean-Claude Trichet, said that “the mechanism decided today will not normally need to be activated.”

Retrieved from “https://en.wikinews.org/w/index.php?title=Greek_debt_deal_reached&oldid=3741882”

British supermarket chain Tesco to sell its Polish branch to Salling Group A/S

Sunday, June 21, 2020

On Thursday, UK-based retailer Tesco and Denmark-based retailer Salling Group announced their agreement over selling a large portion of Tesco’s Polish operation to Salling Group. Tesco stated its intention to leave the Polish market altogether, Salling its intention to strengthen its Netto chain in Poland.

The deal covers 301 stores, two distribution centers and the head office. With the acquisition, Salling said it seeks to improve its coverage in southern Poland and, over 18 months at a cost of one billion złotys, intends to merge these stores into its currently 386-strong Netto chain. Salling also takes over about 7000 employees from Tesco — Netto Poland currently has about 5000 employees. Tesco continues to run 19 stores, which were not included in this package.

The sale price, to be paid in cash, is 900 million złotys (181 million pounds). In the 2019/20 fiscal year, Tesco said its Polish branch had a 24 million pound operating loss on a 1368 million pound turnover; the sold units contributed a 947 million pound turnover and a 107 million pound loss. At fiscal year-end, the sold units held a value of 681 million pounds in the books.

UOKiK, the Polish anti-monopoly agency, has to approve the deal. The parties said they expect a decision this year.

Tesco has suffered losses from its Polish operation for several years, as customer preference has shifted away from hypermarkets, Tesco’s preferred store size, to smaller discount stores like Biedronka and LIDL. The Sunday trade ban, introduced in 2018, also hurt sales. According to Notes from Poland (NFP), some discount stores resorted to offering postal services, a loophole which allows Sunday opening hours.

In 2015, Tesco centralized its management in the Central European region, comprising Czechia, Hungary, Poland, and Slovakia, but reverted the decision later on. At the time, the company invested in e-commerce and started home deliveries, but Gazeta Prawna reports only 0.5% of Polish grocery turnover comes from this segment, compared to 7% in the United Kingdom.

In the past few years, Tesco Polska has cut expenditures by streamlining its product range, halving its staff, shutting down home deliveries in parts of the country, and closing stores, reportedly including its Poznan distribution center last year. Deutsche Welle (DW) reported in mid-2019 that 62 Tesco outlets had closed within a year. Staff layoffs left meat, fish and delicatessen departments without designated shop assistants, and forced staff canteen closures and administrative simplifications.

Tesco sold its roughly 2000 Thai and 74 Malaysian stores to Charoen Pokphand in March; announced in February that it was leaving its joint venture with China Resources Holdings; and in 2015 sold its South Korean chain HomePlus. Speaking to Portfolio.hu, Matt Simister, Tesco’s CEO for Central Europe, explained that the company held a 4% share of the Polish retail market, compared to its 16% share in Hungary, and stated they want to stay in Hungary. In a DW report in March of last year, Dave Lewis, CEO of Tesco, stated they did not have intentions of leaving Thailand or Poland.

Tesco’s Polish operation, according to Gazeta Wyborcza in March, is too big for a single monolithic sale. The chain’s press release reported 22 sold units in the past year and a half, for around 200 million pounds. NFP named the Kaufland chain and property developer Echo Investment among the buyers.

Both Tesco and Salling entered the Polish market in 1995.

Retrieved from “https://en.wikinews.org/w/index.php?title=British_supermarket_chain_Tesco_to_sell_its_Polish_branch_to_Salling_Group_A/S&oldid=4577092”

What Is Freight Services And When Should I Use It?

By Adrianna Notton

With such a diverse range of companies shipping a wide variety of products, not every product can be shipped using traditional shippers such as couriers and postal delivery. As well, there are companies that have high transport demands. For these situations, the best type of shipping is a freight service.

Freight services are used for items with non-traditional weights, sizes, and volumes that cannot be shipped by conventional shipping methods. Using a freight service is essential for shipments that have to make switches at distribution centers or trade freight along pre-designed routes. This service cuts down on shipment processing time, and the shipping process is much more convenient for customers, especially those with critical transports. Freight services are available to take on the challenges of unique shipment demands, which makes it more convenient for businesses to get their shipments to their destination in a timely and safe manner.

When using a freight service, everything that must be protected is boxed, packed or crated properly to ensure the safety of the items. The items are also addressed properly. Often drivers or freight service personnel will load the items. A freight truck with a hydraulic lift-gate can be reserved to help in the loading of heavier items. Freight services specialize in shipments that are 100 pounds or more and shipments that are too large and bulky. These types of items can be shipped by freight companies by air, road, or sea, or any combination. Bigger shipments can be transported by container or dedicated truck.

Generally speaking, shipments of 8,000 pounds or more are called “truck load” (TL) shipments, and smaller shipments are transported as “less than a truck load” (LTL). Most people tend to select LTL as it often costs less. The freight service specialist will suggest the cheapest method of transport, but the customer makes the decision. LTL services normally offer the lowest price.

Most shippers use LTL service, which is offered by all major freight companies; it is safe when packed properly, easy to track, and timely. There are also specialty services for fragile items, where the carrier picks up the shipment, makes sure it is protected when loaded, and delivers it to the final destination. This service is much more expensive than LTL freight.

Think of freight as a very large version of parcel shipping. When you have a letter or need to send a small box, you use the post office. When shipping a small or medium-sized parcel, you use a delivery courier. But when you need to move a large quantity of items, or items that are heavy, large, and bulky, you need another level of shipping service: freight services. If you are unsure what type of service you require, a freight service representative can answer your questions and offer efficient solutions.

About the Author: Looking for an excellent air freight or cross border shipping service? As Canada’s largest courier company, they offer the best products and services to ensure your shipments get where they need to be quickly, for the right price.

Source: isnare.com

Permanent Link: isnare.com/?aid=753367&ca=Business

Disposal of fracking wastewater poses potential environmental problems

Wednesday, April 25, 2012

A recent study by the United States Geological Survey (USGS) shows that the oil and gas industry is creating earthquakes. New data from the Midwest region of the United States indicates that these man-made earthquakes are happening more frequently than expected. While more frequent earthquakes are less of a problem for regions like the Midwest, Dr. Paul K. Doss, a geology professor at the University of Southern Indiana, believes the disposal of wastewater from the hydraulic fracturing (or “fracking”) process used in extracting oil and gas could pose problems for groundwater.

“We are taking this fluid that has a whole host of chemicals in it that are useful for fracking and putting it back into the Earth,” Doss said. “From a purely seismic perspective these are not big earthquakes that are going to cause damage or initiate, as far as we know, any larger kinds of earthquakes activity for Midwest. [The issue] is a water quality issue in terms of the ground water resources that we use.”

Hydraulic fracturing, or fracking, is a technique used by the oil and gas industries in which highly pressurized water is injected down into the Earth’s crust to break rock and extract natural gas. Most fracking fluids are proprietary, so the chemicals they contain are unknown to the public, preserving a competitive edge for their makers.

Last Monday four researchers from the University of New Brunswick released an editorial that sheds light on the potential risks the current wastewater disposal system poses to the province’s water resources. The researchers share Dr. Doss’s concern and have said they believe fracking should be halted in the province until there is an environmentally safe way to dispose of the wastewater.

“If groundwater becomes contaminated, it takes years to decades to try to clean up an aquifer system,” University of New Brunswick professor Tom Al said.

While the USGS group that conducted the study says it is unclear how earthquake rates may be related to oil and gas production, it has found a correlation between the disposal of fracking wastewater and the recent upsurge in earthquakes. Because of this new evidence connecting the disposal process and earthquakes, individual U.S. states are now passing laws regarding disposal wells.


“The problem is that we have never, as a human society, engineered a hole to go four miles down in the Earth’s crust that we have complete confidence that it won’t leak,” Doss said. “A perfect case-in-point is the Gulf of Mexico oil spill in 2010, that oil was being drilled at 18,000 feet but leaked at the surface. And that’s the concern because there’s no assurance that some of these unknown chemical cocktails won’t escape before it gets down to where they are trying to get rid of them.”

The University of New Brunswick researchers’ editorial also said that if fracking wastewater were to contaminate groundwater, conventional water treatment would not be sufficient to remove the high concentrations of chemicals used in fracking. The researchers did find that the wastewater could be recycled, disposed of at proper sites, or pumped further underground into saline aquifers.

The New Brunswick professors have concluded that the current water-based fracking methods used by companies should be replaced with carbon dioxide or liquefied propane gas.

“You eliminate all the water-related issues that we’re raising, and that people have raised in general across North America,” Al said.

In New Brunswick, liquefied propane gas has been used successfully in fracking some wells, but according to Annie Daigle, a water specialist with the province’s Natural Resources Department, it may not be the go-to solution for New Brunswick due to its geological makeup.

“It has been used successfully by Corridor Resources here in New Brunswick for lower volume hydraulic fracturing operations, but it is still a fairly new technology,” Daigle said.

The United States Environmental Protection Agency (EPA) is working with U.S. states on guidelines to manage seismic risks from wastewater disposal. Under the Safe Drinking Water Act, the EPA is also the organization that sets policy for disposal wells.

Oil wells, which are under regulation, pump out salt water known as brine, which is disposed of by being pumped back into the ground. The difference between pumping brine and pumping high-pressure fracking fluid back into the ground is the volume disposed of.

“Brine has never caused this kind of earthquake activity,” Doss said. “[The whole oil and gas industry] has developed around the removal of natural gas by fracking techniques and has outpaced regulatory development. The regulation is tied to the ‘the run-of-the-mill’ disposal of waste, in other words the rush to produce this gas has occurred before regulatory agencies have had the opportunity to respond.”

According to the USGS study, the increase in injecting wastewater into the ground may explain the sixfold increase in earthquakes in the central part of the United States from 2000 to 2011. USGS researchers also found that in the decades prior to 2000, seismic events in the midsection of the U.S. averaged 21 annually; in 2009 the count spiked to 50, and in 2011 it hit 134.

“The incredible volumes and intense disposal of fracking fluids in concentrated areas is what’s new,” Doss said. “There is not a body of regulation in place to manage how these fluids are disposed of.”

The study by the USGS was presented at the annual meeting of the Seismological Society of America on April 18, 2012.

Retrieved from “https://en.wikinews.org/w/index.php?title=Disposal_of_fracking_wastewater_poses_potential_environmental_problems&oldid=3931361”