
Tuesday, August 02, 2016

Shale Gas

What is Shale Gas?


  1. Shale gas is natural gas that is trapped within shale rock formations.
  2. Shale gas is an “unconventional” source of methane, like coal-bed gas (found in coal seams) and tight gas (trapped in impermeable rock formations).
  3. It is colourless, odourless and lighter than air.
  4. In Europe, shale gas is not exploited because of green rules and limited property rights. In America, however, the use of shale gas has already started, and in future it will give a spur to the domestic manufacture of anything needing large amounts of energy.

What’re the Advantages of Gas as a source of Energy?


  1. Gas power stations are relatively cheap to build, beating nuclear power hands down in terms of capital costs, and in most cases they are also less expensive than renewables.
  2. It is a flexible fuel, capable of heating homes, fuelling industrial boilers and providing feedstock for the petrochemicals industry, where it is turned into plastics, fertiliser and other useful stuff.
  3. It is also making small but significant progress as a fuel for lorries and buses.
  4. The biggest advances have been in power generation, because it is cheaper to generate electricity from gas and the process releases up to 50% less carbon dioxide than coal.

What’re the problems with Gas as a source of energy?


  1. It is difficult and expensive to transport.
  2. Because of those hefty transport costs, gas does not behave like a commodity. Only one-third of all gas is traded across borders, compared with two-thirds of oil.
  3. Gas has no global price.
  4. Gas prices in different parts of the world are set by quite different mechanisms, so they vary wildly across the globe.
  5. Gas pipelines cost millions of dollars a kilometre to build.
  6. An alternative is to ship the gas in liquid form, as LNG, but projects to liquefy gas also require huge investments.
  7. GK: Gazprom is Russia’s huge state-run gas producer and supplier of 25% of Europe’s gas.

Wednesday, September 16, 2015

National Solar Mission

The Union Cabinet chaired by the Prime Minister, Shri Narendra Modi, gave its approval for stepping up India’s solar power capacity target under the Jawaharlal Nehru National Solar Mission (JNNSM) by five times, to 1,00,000 MW by 2022. The target will principally comprise 40 GW of rooftop solar and 60 GW of large and medium scale grid-connected solar power projects. With this ambitious target, India will become one of the largest green energy producers in the world, surpassing several developed countries.

The total investment in setting up 100 GW will be around Rs. 6,00,000 crore. In the first phase, the Government of India is providing Rs. 15,050 crore as capital subsidy to promote solar capacity addition in the country. This capital subsidy will be provided for rooftop solar projects in various cities and towns, for Viability Gap Funding (VGF) based projects to be developed through the Solar Energy Corporation of India (SECI) and for decentralised generation through small solar projects. The Ministry of New and Renewable Energy (MNRE) intends to achieve the 1,00,000 MW target with 19,200 MW falling under these three schemes.

Apart from this, solar power projects with investment of about Rs. 90,000 crore would be developed using Bundling mechanism with thermal power. Further investment will come from large Public Sector Undertakings and Independent Power Producers (IPPs). State Governments have also come out with State specific solar policies to promote solar capacity addition.

The Government of India may also approach bilateral and international donors, as well as the Green Climate Fund, for achieving this target. Solar power can contribute to the long-term energy security of India and reduce dependence on fossil fuels, which put a strain on foreign reserves and the ecology as well. The solar manufacturing sector will get a boost from this long-term trajectory of solar capacity addition, which will help in the creation of technology hubs for manufacturing. The increased manufacturing capacity and installation are expected to pave the way for direct and indirect employment opportunities in both the skilled and unskilled sectors.

The new solar target of 100 GW is expected to abate over 170 million tonnes of CO2 over its life cycle. This Solar Scale-up Plan has a target of 40 GW through Decentralized Solar Power Generation in the form of Grid Connected Rooftop Projects. Besides stabilising the grid, decentralised generation will minimise investment in power evacuation.

To facilitate such a massive target, the Prime Minister’s Office has been pushing various Ministries to initiate supporting interventions, like:-

a) incorporating changes in land use regulations and tenancy laws to facilitate aggregation and leasing of land by farmers/ developers for solar projects;

b) identification of large chunks of land for solar projects;

c) identification of large government complexes/ buildings for rooftop projects;

d) clear survey of wastelands and identification of transmission/ road infrastructure using satellite technology for locating solar parks;

e) development of power transmission network/ Green Energy Corridor;

f) setting up of exclusive parks for domestic manufacturing of solar PV modules;

g) provision of roof top solar and 10 percent renewable energy as mandatory reform under the new scheme of Ministry of Urban Development;

h) amendments in building bye-laws for mandatory provision of roof top solar for new construction or higher FAR;

i) considering infrastructure status for solar projects; raising tax free solar bonds; providing long tenor loans; making roof top solar a part of housing loan by banks/ NHB and extending IIFCL credit facility to such projects by the Department of Financial Services;

j) suitable amendments to the Electricity Act for strong enforcement of Renewable Purchase Obligation (RPO) and for providing Renewable Generation Obligation (RGO);

k) incorporating measures in Integrated Power Development Scheme (IPDS) for encouraging distribution companies and making net-metering compulsory.

Background:

JNNSM was launched in 2009 with a target of 20,000 MW of grid-connected solar projects by 2022. In the last two to three years, the sector has witnessed rapid development, with installed solar capacity increasing from 18 MW to about 3,800 MW during 2010-15. The price of solar energy has come down significantly, from Rs. 17.90 per unit in 2010 to under Rs. 7 per unit, thereby reducing the need for VGF/GBI per MW of solar power. With technology advancement and market competition, this green power is expected to reach grid parity by 2017-18. These developments would enable India to achieve its present target of 20,000 MW, but considering its international commitment to a green and climate-friendly growth trajectory, the Government of India has taken this path-breaking decision.

Wednesday, July 16, 2014

Software Technology Parks of India (STPI)

Software Technology Parks of India (STPI) is a government agency in India, established in 1991 under the Ministry of Communications and Information Technology, that manages the Software Technology Park scheme. It is an export-oriented scheme for the development and export of computer software, including the export of professional services.

The STP Scheme provides various benefits to the registered units, which include:
  1. 100% foreign equity, 
  2. tax incentives, 
  3. duty-free import, 
  4. duty-free indigenous procurement, 
  5. CST reimbursement, 
  6. DTA entitlement, 
  7. deemed export etc.

STPI has played a seminal role in India earning a reputation as an information technology superpower. STP units exported software and information technology worth Rs. 2,15,264 crore in FY 2010-11. The state with the largest export contribution was Karnataka, followed by Maharashtra, Tamil Nadu and Andhra Pradesh. STPI has a presence in many of the major cities of India, including Bangalore, Mysore, Trivandrum, Bhilai, Bhubaneswar, Chennai, Coimbatore, Hyderabad, Gurgaon, Pune, Guwahati, Noida, Mumbai, Nagpur, Kolkata, Kanpur, Lucknow, Dehradun, Patna, Rourkela, Ranchi, Gandhinagar (Gujarat), Surat, Imphal, Shillong and Nashik.

STPI centres provide a variety of services, which include:
  1. High Speed Data Communication, 
  2. Incubation facility, Consultancy, 
  3. Network Monitoring, 
  4. Data Center, 
  5. Data Hosting etc. 
  6. Physical hosting for the National Internet Exchange of India, and 
  7. Regulation of the STP scheme.


The tax benefits under Section 10A of the Income Tax Act applicable to STP units expired in March 2011. Since the Government has chosen not to extend the Section 10A benefits despite the demand by IT units, most STP-registered SME units will be affected, as they now have to pay income tax on profits earned from exports.

Thursday, May 16, 2013

HUMAN GENOME PROJECT


The concept of genomics began with the idea of the Human Genome Project in the mid-1980s. The Human Genome Organisation (HUGO) was set up to coordinate the work of scientists in a number of countries - the USA, Japan, UK, France, Germany, Canada, Israel, Russia, Italy and others - on the roughly $3 billion project to map all of the genes on human chromosomes. The Human Genome Project itself started on 1 October 1990 in the US, with the aim of mapping and sequencing the complete set of human chromosomes, as well as those of some model organisms.

According to a 1986 report submitted by the Department of Energy (USA), "The ultimate goal of this initiative is to understand the human genome" and "knowledge of the human genome is as necessary to the continuing progress of medicine and other health sciences as knowledge of human anatomy has been for the present state of medicine."

The funding for this project came from the US government through the National Institutes of Health, USA and a UK charity organization, The Wellcome Trust (which funded the Sanger Institute in Great Britain), and some other groups around the world.

The aim of the Human Genome Project was to identify all the genes (approx. 25,000) in human DNA and to determine the sequence of the three billion chemical base pairs that make up human DNA. Efforts were made to create databases to store this information and develop tools to do comprehensive data analysis.

Another important aspect of this project was the decision to address the ethical, legal and social issues arising as an outcome of the project. In order to have comparative data, research work was carried out simultaneously on three other organisms, namely the bacterium E. coli, the fruit fly Drosophila melanogaster, and the laboratory mouse.

Another big step forward was the transfer of the technology to the private sector. This approach led to tremendous progress in the biotechnological field in later years. The procedure adopted involved breaking the genome down into smaller pieces, approximately 150,000 base pairs in length, known as BACs or "bacterial artificial chromosomes". These can be inserted into bacteria, where they are copied by the bacterial DNA replication machinery. Each piece is then sequenced separately as a small "shotgun" project and assembled, and the larger (150,000 base pair) pieces are then put together to recreate the chromosomes. This is known as the "hierarchical shotgun" approach, because the genome is first broken into relatively large chunks, which are mapped to chromosomes before being selected for sequencing.
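
The sketch below is a toy Python illustration of this hierarchical shotgun idea, using a made-up 6,000-base "genome" and scaled-down clone and read sizes; it is not the actual HGP pipeline, and unlike real assembly it assumes the read overlaps are already known.

```python
import random

random.seed(0)

# Toy "genome": 6,000 random bases (the real genome is about 3 billion).
GENOME = "".join(random.choice("ATGC") for _ in range(6000))

BAC_SIZE = 1500   # real BAC clones are ~150,000 bp; scaled down here
READ_LEN = 100    # length of each toy sequencing read
STEP = 50         # reads start every STEP bases, so neighbours overlap

# 1. Hierarchical step: cut the genome into BAC-sized clones and record
#    where each clone maps on the chromosome.
bacs = [(start, GENOME[start:start + BAC_SIZE])
        for start in range(0, len(GENOME), BAC_SIZE)]

# 2. Shotgun step: fragment each clone into short overlapping reads.
def shotgun(clone):
    return [clone[i:i + READ_LEN] for i in range(0, len(clone), STEP)]

# 3. Assembly step (greatly simplified): stitch each clone back together
#    from its reads, then concatenate the clones in map order. In reality
#    the overlaps are not known in advance and must be found computationally.
def assemble(reads):
    contig = reads[0]
    for read in reads[1:]:
        contig += read[READ_LEN - STEP:]   # append only the new bases
    return contig

reconstructed = "".join(assemble(shotgun(seq)) for _, seq in sorted(bacs))
assert reconstructed == GENOME             # the toy genome is recovered exactly
print(len(reconstructed), "bases reassembled")
```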

Every individual has a unique gene sequence; therefore the data published by the Human Genome Project does not necessarily represent the exact sequence of each and every individual’s genome. The results represent the combined genome of a small number of anonymous donors. For more information about the Human Genome Project, readers can visit the following website: www.ornl.gov/techresources/Human_Genome/home.shtml

The Impact of Human Genome Project

After the Human Genome Project the world has changed, and it is going to change even more; the project is going to impact our lives in a tremendous way. It took 15 years and about 4 billion US dollars to sequence the human genome, which was completed in 2003. There are four bases in DNA - A, T, G and C - and the human genome contains approximately 3 billion of them. The average gene consists of 3,000 bases, though sizes vary greatly; the largest known human gene is dystrophin, at about 2.4 million bases. The total number of genes is estimated at around 30,000. Almost all (99.9%) nucleotide bases are exactly the same in all people, and so far the functions of over 50% of discovered genes are unknown. Chromosome 1 has the most genes (about 2,968) and the Y chromosome has the fewest (231), as chromosome 1 is the longest and the Y chromosome is the smallest. The Human Genome Project also revealed that genes appear to be concentrated in seemingly random areas along the genome, with vast expanses of non-coding DNA between them. Stretches of up to 30,000 C and G bases repeating over and over often occur adjacent to gene-rich areas, forming a barrier between the genes and the "junk DNA"; these CpG islands are believed to help regulate gene activity. The ratio of germ-line (sperm or egg cell) mutations in males versus females is 2:1. Researchers point to several reasons for the higher mutation rate in the male germ line, including the greater number of cell divisions required for sperm formation than for eggs.

The information that was revealed by the Human Genome Project can be used to improve diagnosis of disease. The risk associated with genetic predisposition to diseases can be calculated and based on the results new strategies can be used to treat these diseases such as gene therapy, customized drugs based on individual patients genetic profiles. The information from Human genome Project is also being used in microbial Genomics to detect and treat pathogens, use microorganisms in bio-remediation where environment can be monitored to detect pollution levels and clean up toxic waste. New energy sources as biofuels are also being developed. 

With the help of genome sequencing machines, it is now possible to sequence a genome in a record period of time. The human genome is about 3 gigabases, and with the presently available models of sequencing machines it is possible to sequence 200 gigabases in a week. As far as the price of sequencing is concerned, the cost has already fallen 100 million times per base: a few years back sequencing a genome used to cost $100,000, today it is around $1,000, and in the coming years it is expected to be $100.
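
A quick back-of-the-envelope check of those throughput figures (3 gigabases per human genome, about 200 gigabases of sequencing per week, as quoted above):

```python
human_genome_gb = 3           # gigabases in one human genome (figure above)
weekly_throughput_gb = 200    # gigabases a modern machine can sequence per week

genomes_per_week = weekly_throughput_gb / human_genome_gb
print(f"~{genomes_per_week:.0f} human-genome equivalents per week")  # ~67
```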

The world’s capacity to sequence human genomes is something like 50,000 to 100,000 genomes this year, based on the present models of machines available, and this is going to double, triple or quadruple year over year. In fact, the Beijing Genomics Institute is far ahead of others, with almost 20% of the total genome sequencing capacity of the world. The sequencing of genes is continuously giving us valuable information regarding human health and the treatment of diseases that were difficult to understand and therefore had no cure, the most important being cancer, which still has no cure. It has been possible to correlate the deletion of the TP53 gene with the occurrence of cervical and breast cancer: if there happens to be a deletion mutation in this gene, there is almost a 90% chance of these individuals getting cancer. If individuals get the genetic test done and are found to have the same deletions, they or their families can go for regular screenings to catch the cancer early.

Other very interesting information is also being revealed, such as explaining marital infidelity by the presence of “cheating genes”. There are already labs that test for allele 334 of the AVPR1A gene, which is also called the “cheating gene”. This test is being used to assess compatibility between couples, which in turn may help lower divorce rates and the emotional trauma caused by broken relationships. Arginine vasopressin receptor 1A (AVPR1A) is one of the three major receptor types for arginine vasopressin, the others being AVPR1B and AVPR2, and is present throughout the brain, liver and kidney. Variation in the gene for one of the receptors for the hormone vasopressin is reported to be associated with the bonding of human males with their partners/spouses. It was reported that the 334 allele of a common AVPR1A variation seemed to have negative effects on men’s relationships with their spouses.

"Our findings are particularly interesting because they show that men who are in a relatively stable relationship of five years of more who have one or two copies of allele 334 appear to be less bonded to their partners than men with other forms of this gene," says Jenae Neiderhiser, Professor of psychology, Penn State. "We also found that the female partners of men with one or two copies of allele 334 reported less affection, consensus and cohesion in the marriage, but interestingly, did not report lower levels of marital satisfaction than women whose male partners had no copies of allele 334." 

The prospect of using the genome as a universal diagnostic is upon us today. Just like other diagnostic tools being used in the hospitals, very soon we are going to have Whole Genome sequencing machines in the pathology labs as routine healthcare tools.

This means that everybody who is alive today could live an extra 5, 10 or 20 years. Very soon, we will have our entire genome copy on a pen drive or a laptop, with easy access for your personal physician or family doctor. By looking at your genome, the doctor can assess which diseases you are prone to because of your genes. This assessment will help the doctor suggest precautions, preventions and early interventions that will not only ultimately save millions of lives but also increase the life span of individuals.


Beyond the Human Genome…

The International HapMap Project

After sequencing the human genome, the next goal on which biotechnologists and researchers are working is to map the SNPs in the entire genome, a project known as the “HapMap”. This is a $100 million public-private effort, which will take almost three years to complete. The project involves collecting DNA from blood samples of people from Nigeria, Japan, China and the US. The aim is to create the next-generation map of the human genome. This information is going to help us understand the 0.1% (100 - 99.9) difference that makes humans different from each other. These differences are due to single nucleotide polymorphisms (SNPs). By locating the SNPs, a haplotype is created. A haplotype is a set of single nucleotide polymorphisms (SNPs) on a single chromosome pair that are statistically associated. A haplotype has also been defined as a combination of alleles (DNA sequences) at adjacent locations on the chromosome that are transmitted together. A haplotype could be one locus, several loci, or an entire chromosome, depending on the number of recombination events that have occurred between a given set of loci. The identification of a few alleles of a haplotype block can identify all other polymorphic sites in its region. This information will help us understand the genetics behind common diseases.
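
As a toy illustration of that last point (the haplotypes below are invented for the example, not real data), knowing the allele at one well-chosen "tag" SNP inside a block is enough to read off the alleles at the block's other SNPs:

```python
# Each entry maps the allele at a single "tag" SNP to the full haplotype
# across four SNPs in the same block. These values are made up.
haplotype_block = {
    "A": ("A", "C", "C", "T"),
    "G": ("G", "T", "C", "C"),
}

observed_tag_allele = "G"                     # the only SNP actually genotyped
print(haplotype_block[observed_tag_allele])   # infers ('G', 'T', 'C', 'C')
```
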
The findings of the Human Genome Project have opened vistas for new fields such as “systems biology”, which explores life at the level of the whole organism: the whole organism is taken into consideration instead of individual components such as single genes or proteins. This novel approach combines DNA sequences with advanced technologies to study how proteins carry out all the activities of a living cell.
Besides this, the novel field of consumer genetics is going to define new business models and commercial enterprises. Consumer genetics is now being used to customise personal care and nutritional supplement products: you can get skin care and supplements customised to meet the needs of your DNA. Life insurance policies are also going to be based on your personal genome copy.
Fungi are also being used as a rich source of protein. Since the 1960s, a European bread producer has spent over $45 million on a fungus that can be formed into acceptable food substitutes, and commercial production started in early 1984. An example is the mycoprotein (protein derived from fungi) from Fusarium graminearum, a mould related to mushrooms and truffles. It is odourless and tasteless and contains about 45% protein and 13% fat, with a dietary composition similar to beef. This mycoprotein has an amino acid content close to that recommended by the Food and Agriculture Organization of the United Nations as “ideal” for human consumption.

Wednesday, May 08, 2013

General Chemistry Notes



  1. Ammonia is manufactured by the Haber Process
  2. Full Name of TNT, an explosive is Trinitrotoluene
  3. Trinitrophenol, which is used as an explosive and antiseptic is also known as Picric Acid.
  4. Nessler’s Reagent is used to test for ammonia and ammonium salts. It gives a brown precipitate.
  5. Nitric Acid (HNO3) is manufactured by the Electric Arc Process.
  6. The process of petroleum refining is called fractional distillation.
  7. Fractional distillation yields various products with different boiling points. At 30 to 70 degrees Celsius comes petroleum ether, which is used as a solvent for dry cleaning. At 70 to 120 degrees Celsius comes petrol (gasoline), which is used as a motor fuel and general solvent. At 120 to 150 degrees Celsius comes benzoline, which is also a solvent. Then comes kerosene at 150 to 300 degrees Celsius, which is used as an illuminant and as a fuel. Finally, above 300 degrees Celsius, we get lubricating oil, Vaseline and paraffin wax, which are used as lubricant, grease and wax.
  8. Petrol is obtained by a process called cracking. In this process kerosene and crude oil are broken up into low-boiling-point hydrocarbons such as octane and heptane.
  9. To prevent knocking in petrol engines, an antiknock agent such as tetraethyl lead is added. In unleaded petrol it is no longer used, as it also harms catalytic converters.
  10. Iron commonly has three forms. The crudest form is cast iron, containing 2 to 5 per cent carbon. Steel has 0.15 to 1.5 per cent carbon. Wrought iron is the purest form of iron, having 0.12 to 0.25 per cent carbon.
  11. On exposure to moist air, in the presence of carbon dioxide, iron is converted into brown hydrated iron oxide (Fe2O3·10H2O). This process is called rusting.
  12. Stainless steel contains 13% chromium.
  13. Acids are generally sour, turn blue litmus red and usually give hydrogen with metals.
  14. An acid is a substance which furnishes H+ ions on dissolving in water. It is a proton donor and an electron acceptor.
  15. Bases react with acids to produce salt and water. They turn red litmus blue. They are also called alkalis, e.g. NaOH.
  16. A base is defined as a proton acceptor and an electron donor.
  17. pH is the negative logarithm of the H+ ion concentration. The pH of a neutral solution is 7; for acidic solutions pH is between 0 and 7, and for alkaline solutions it is between 7 and 14 (see the short worked example after this list).
  18. Metals occur in nature as chemical compounds called minerals. If these minerals are used as the starting material for extracting the metal, they are called ores.
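
A short numerical illustration of the pH definition in point 17 (the concentrations below are example values chosen for illustration):

```python
import math

def pH(h_ion_concentration):
    """pH = negative base-10 logarithm of the H+ concentration (mol/L)."""
    return -math.log10(h_ion_concentration)

print(pH(1e-7))    # ~7.0  -> neutral (pure water)
print(pH(1e-3))    # ~3.0  -> acidic
print(pH(1e-10))   # ~10.0 -> alkaline
```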

Friday, May 03, 2013

REGULATIONS ON SURROGACY


  • While the new Assisted Reproductive Technology (ART) Regulation Bill and Rules, 2010, are still in the womb, the non-statutory Indian Council of Medical Research (ICMR) Guidelines, 2005, are being followed.
  • As per the latest and new Indian visa regulations, effective November 15, 2012, all foreigners visiting India for commissioning surrogacy will be required to apply for medical visas and cannot avail of simple tourist visas for surrogacy purposes. 

 NEW REGULATIONS:

  • Foreigners visiting India for commissioning surrogacy must apply for medical visa

  • The man and woman should be duly married and the marriage should have sustained for at least two years
  • A letter from the embassy should be enclosed with the visa application, stating that the country recognizes surrogacy and that the child born thereof will be treated as a biological child of the couple
  • The couple will furnish an undertaking that they would take care of the child
  • The treatment would be done only at registered ART clinics recognized by the ICMR
  • The couple should produce a notarized agreement between the applicant couple and the prospective surrogate mother
  • For the return journey, the couple will need exit permission from the FRRO/FRO
  • The couple can be permitted to visit India on a reconnaissance trip on tourist visa, but no samples can be given to any clinic during such visit 

RECOMMENDATIONS OF THE LAW COMMISSION:

  • Surrogacy arrangement will continue to be governed by contract among parties, which will contain all the terms requiring consent of surrogate mother, medical procedures, reimbursement, willingness to hand over the baby, etc. This arrangement should not be for commercial purposes.
  • Surrogacy arrangement should provide for financial support for surrogate baby in the event of death of the commissioning couple or individual before delivery, or divorce between the intended parents and subsequent unwillingness to take the baby.
  • Life insurance covers for surrogate mother.
  • One of the intended parents should be a donor to foster the bond of love and check chances of child abuse.
  • Legislation should recognize a surrogate child to be the legitimate child of the commissioning parent(s) without there being any need for adoption or even declaration of guardian.
  • The birth certificate of the child should contain the name(s) of the commissioning parent(s) only.
  • Right to privacy of the donor as well as surrogate mother should be protected.
  • Sex-selective surrogacy should be prohibited.
  • Cases of abortions should be governed by the Medical Termination of Pregnancy Act.
  • The ART Bill, 2010, has legal lacunae and lacks creation of a specialist legal authority for determination and adjudication of legal rights of parties, in addition to falling in conflict with existing family laws. These pitfalls should not become a graveyard for a law yet to be born. Surrogacy needs to be regulated by a proper statutory law. Till then, the visa regulations will provide succor and relief.

Thursday, May 02, 2013

WHAT IS NFC?


NFC, or near-field communication, is a variant of RFID, or radio frequency identification. It is an ultra short-range wireless technology that allows communication and data exchange between two devices held in close proximity — about 4 cm apart.

How is it different from Bluetooth? 
Bluetooth is also a short-range high frequency wireless technology but one that allows interaction between communication devices as much as 10 meters apart.
What makes NFC special? 
NFC-enabled smartphones have the potential to replace credit cards. This is because NFC phones pack a smart chip protected by a complex 80-character code that is really hard to crack. Such a device can safely store confidential credit card details and be handy for purchases on the go.
Frost & Sullivan predicts the technology will revolutionise e-commerce and drive over $150 billion worth of transactions by 2015, the bulk of which is expected to be powered by NFC phones.

What else can the technology do? 

NFC can be deployed in ticketing services, rural banking, interactive and targeted advertising, healthcare, hospitality, libraries and pharmacies. In fact, an NFC phone could become the single key for access to your car, home and office.

How do NFC transactions work? 
Any device, whether a cellphone, a camera or a watch, can be equipped with an NFC 'initiator', which is simply an antenna that can store data. If the device is an NFC smartphone, the 'initiator' and 'target' (an NFC reader) need to be up close for data exchange to happen.
The 'reader' is attached to a point-of-sale (PoS) terminal or cash-register in a retail store that accepts NFC payments. A simple wave of the phone can pay for a purchase. Alternatively, two NFC phones can be tapped lightly to exchange business cards.
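
Purely as a toy illustration of that proximity rule (this is a made-up model, not a real NFC API or payment protocol), the sketch below lets two devices exchange data only when they are within about 4 cm of each other:

```python
NFC_RANGE_CM = 4.0   # approximate working range of NFC, as noted above

class NfcDevice:
    """Hypothetical, highly simplified stand-in for an NFC-capable device."""

    def __init__(self, name, payload):
        self.name = name
        self.payload = payload        # e.g. a business card or payment token

    def read_from(self, other, distance_cm):
        """Receive the other device's payload only if it is close enough."""
        if distance_cm > NFC_RANGE_CM:
            return f"{self.name}: {other.name} is out of range, nothing happens"
        return f"{self.name} received '{other.payload}' from {other.name}"

phone = NfcDevice("phone", "business card: A. Kumar")
reader = NfcDevice("PoS reader", "payment request: Rs. 250")

print(phone.read_from(reader, distance_cm=12))   # too far apart
print(phone.read_from(reader, distance_cm=2))    # a "tap": data is exchanged
```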

Will NFC be a drain on battery life? 
Geeks claim that in standby mode, a well-designed NFC solution does not consume any power. And since transactions happen in seconds, the power drain is not huge.

Are NFC-adoption levels growing? 
Globally, NFC adoption is picking up via smartphones. RIM, Nokia, Samsung and HTC have unveiled NFC smartphones. Apple iPhone5 is tipped to support NFC too. Google Wallet — a mobile payments technology that can be downloaded on some US mobile networks — is growing the NFC ecosystem. Payment trials have also begun in Australia, Singapore and China.

Has NFC arrived in India?  
The technology is still in its infancy here. As of now, the Reserve Bank does not recognise NFC mobile payment transactions and PoS terminals accepting NFC payments don't exist. But NFC-enabled phones like BlackBerry's Touch Bold 9900 and Curve 9360, Samsung's Nexus S and Galaxy S II and Nokia's C7, 700, 701 and 600 are available.
For NFC to take off, RBI has to frame norms and banks, carriers, creditcard companies, apps developers and PoS terminal makers have to team up. But awareness levels are growing and NFC is making some waves in entertainment.
The Shah Rukh Khan-starrer Ra.One was the first movie to be marketed by Nokia using NFC technology. Armed with an NFC phone, you can download the movie content by merely tapping the device on an NFC-tagged movie poster at a Nokia priority outlet or a partner multiplex.

Tuesday, April 30, 2013

Magnetohydrodynamics Generator


Magnetohydrodynamic (MHD) power generation technology is the production of electrical power utilising a high-temperature conducting plasma moving through an intense magnetic field.

Principle of MHD Power Generation

When an electrical conductor is moved so as to cut lines of magnetic induction, the charged particles in the conductor experience a force in a direction mutually perpendicular to the B field and to the velocity of the conductor. The negative charges tend to move in one direction, and the positive charges in the opposite direction. This induced electric field, or motional emf, provides the basis for converting mechanical energy into electrical energy. The production of electrical power through the use of a conducting fluid moving through a magnetic field is referred to as magnetohydrodynamic, or MHD, power generation. The principle was discovered by Michael Faraday.



The Lorentz Force Law describes the force on a charged particle moving in a constant magnetic field. The simplest form of this law is given by the vector equation

F = Q (v × B)

where

• F is the force acting on the particle,
• Q is the charge of the particle,
• v is the velocity of the particle, and
• B is the magnetic field.

The vector F is perpendicular to both v and B, according to the right-hand rule.
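
A minimal numerical check of F = Q (v × B) using NumPy; the charge, velocity and field values below are arbitrary illustrative numbers:

```python
import numpy as np

Q = 1.6e-19                      # particle charge (C), e.g. a single proton
v = np.array([1.0e5, 0.0, 0.0])  # velocity (m/s), along x
B = np.array([0.0, 2.0, 0.0])    # magnetic field (T), along y

F = Q * np.cross(v, B)           # Lorentz force (N), comes out along z
print(F)                         # roughly [0, 0, 3.2e-14]

# F is perpendicular to both v and B, as the right-hand rule says:
print(np.dot(F, v), np.dot(F, B))   # both 0.0
```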

Systems using an MHD generator may operate in open or closed cycles. In the first case, products of combustion are the working fluid, and the exhaust gases are discharged into the atmosphere after removal of the alkali metal additives introduced into the working fluid to increase electric conductivity. In closed-cycle MHD generators the thermal energy produced by combustion of fuel is imparted to the working fluid in a heat exchanger. The working fluid then passes through the MHD generator and is returned through a compressor or pump, completing the cycle. Jet engines, nuclear reactors, or heat-exchange devices may be used as heat sources. The working fluids for MHD generators may be products of combustion of fossil fuels, inert gases with additives of alkali metals or their salts, vapors of alkali metals, and two-phase mixtures of liquid alkali metals and their vapors or liquid metals and electrolytes.

Advantages

• The conversion efficiency of an MHD system can be about 50%, compared with less than 40% for the most efficient steam plants.
• A large amount of power is generated.
• It has no moving parts, so it is more reliable.
• It has the ability to reach full power level as soon as it is started.
• Because of the higher efficiency, the overall generation cost of an MHD plant will be less.
• The more efficient heat utilisation decreases the amount of heat discharged to the environment, and the cooling water requirements would also be lower.
• The higher efficiency means better fuel utilisation. The reduced fuel consumption would offer additional economic and social benefits.
• The closed-cycle system produces power free of pollution.

Disadvantages

• Simultaneous presence of high temperature and a highly corrosive and abrasive environment.
• The MHD channel operates under extreme conditions of electric and magnetic field.
• Initial installation is expensive.

Saturday, April 20, 2013

GLOBAL POSITIONING SYSTEM

The Global Positioning System (GPS) is a space-based global navigation satellite system (GNSS) that provides reliable location and time information in all weather and at all times. It was established in 1973. The Global Positioning System (GPS) is a U.S.-owned utility that provides users with positioning, navigation, and timing (PNT) services. This system consists of three segments: the space segment, the control segment, and the user segment. The U.S. Air Force develops, maintains, and operates the space and control segments.
Space Segment
The space segment consists of a nominal constellation of 24 operating satellites that transmit one-way signals that give the current GPS satellite position and time.  
Control Segment
The control segment consists of worldwide monitor and control stations that maintain the satellites in their proper orbits through occasional command manoeuvres and adjust the satellite clocks. It tracks the GPS satellites, uploads updated navigational data, and maintains the health and status of the satellite constellation.
User Segment
The user segment consists of the GPS receiver equipment, which receives the signals from the GPS satellites and uses the transmitted information to calculate the user’s three-dimensional position and time.
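
As a rough illustration of what the user segment does with those signals, the sketch below recovers a receiver position from ranges to four satellites with known positions, using a few Gauss-Newton iterations. It is a simplified toy: the satellite coordinates and receiver position are made-up numbers in kilometres, and real GPS processing must also solve for the receiver clock bias and correct for atmospheric delays.

```python
import numpy as np

# Made-up satellite positions (km) and a made-up "true" receiver position (km).
sats = np.array([
    [20200.0,     0.0,     0.0],
    [    0.0, 20200.0,     0.0],
    [    0.0,     0.0, 20200.0],
    [12000.0, 12000.0, 12000.0],
])
true_receiver = np.array([5660.0, 2000.0, 2500.0])

# "Measured" ranges; in reality these come from signal travel times.
ranges = np.linalg.norm(sats - true_receiver, axis=1)

# Gauss-Newton: repeatedly linearise the range equations around the estimate.
x = np.zeros(3)                      # start the search at the Earth's centre
for _ in range(20):
    diff = x - sats                  # vector from each satellite to the estimate
    dist = np.linalg.norm(diff, axis=1)
    residuals = dist - ranges        # error in each predicted range
    J = diff / dist[:, None]         # Jacobian of the range equations
    step, *_ = np.linalg.lstsq(J, -residuals, rcond=None)
    x += step

print(np.round(x, 1))                # -> approximately [5660. 2000. 2500.]
```
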
Other satellite based navigation systems are:

a) GLONASS
The Global Navigation Satellite System (GLONASS) is based on a constellation of active satellites which continuously transmit coded signals in two frequency bands, which can be received by users anywhere on the Earth's surface to identify their position and velocity in real time based on ranging measurements. The system is a counterpart to the United States Global Positioning System (GPS) and both systems share the same principles in the data transmission and positioning methods. GLONASS is managed for the Russian Federation Government by the Russian Space Forces.
b) GALILEO 
Galileo is a global navigation satellite system (GNSS) currently being built by the European Union (EU) and the European Space Agency (ESA). Its aim is to provide a high-accuracy positioning system upon which European nations can rely, independent of the Russian GLONASS and US GPS systems, which can be disabled for commercial users in times of war or conflict.
When in operation, it will use two ground operations centres, one near Munich, Germany, and another in Fucino, Italy, and will consist of 30 satellites (27 operational + 3 active spares). It is expected to become fully operational by 2014.

c) BEIDOU NAVIGATION SYSTEM
The BeiDou (COMPASS) Navigation Satellite System is a project by China to develop an independent satellite navigation system. The name may refer to either or both generations of the Chinese navigation system. The first system, officially called the BeiDou Satellite Navigation Experimental System and also known as BeiDou-1, consists of 3 satellites and has limited coverage and applications. It has been offering navigation services, mainly to customers in China and neighbouring regions, since 2000. The second generation of the system, known as Compass or BeiDou-2, will be a global satellite navigation system consisting of 35 satellites and is still under construction. It is planned to offer services to customers in the Asia-Pacific region by 2012, and the global system is to be completed by 2020.
d) GAGAN
GPS Aided Geo Augmented Navigation (GAGAN) is an augmentation system to enhance the accuracy and integrity of GPS signals to meet precision approach requirements in civil aviation, and is being implemented jointly by AAI and ISRO. The goal is to provide a navigation system for all phases of flight over Indian airspace and the adjoining areas. GAGAN will increase safety by providing a three-dimensional approach with course guidance to the runways, helping to reduce the risk of controlled flight into terrain, i.e., an accident whereby an airworthy aircraft, under pilot control, inadvertently flies into terrain, an obstacle, or water.

VACCINE AND ITS TYPES


A vaccine is a biological preparation that improves immunity to a particular disease. A vaccine typically contains an agent that resembles a disease-causing microorganism, and is often made from weakened or killed forms of the microbe or its toxins. The agent stimulates the body's immune system to recognize the agent as foreign, destroy it, and "remember" it, so that the immune system can more easily recognize and destroy any of these microorganisms if it encounters them in the future.

Types of vaccines:
a) Inactivated vaccines: When inactivated vaccines are made, the bacteria are completely killed using a chemical, usually formaldehyde. Dead pieces of disease-causing microorganisms (usually bacteria) are put into the vaccine. Because the antigens are dead, the strength of these vaccines tends to wear off over time, resulting in less long-lasting immunity. So, multiple doses of inactivated vaccines are usually necessary to provide the best protection. The benefit of inactivated vaccines is that there is zero chance of developing any disease-related symptoms -- allergic reactions are possible but extremely rare.
Examples of inactivated vaccines are hepatitis A, hepatitis B, poliovirus, Haemophilus influenzae type b, meningococcal, pneumococcal and the injected form of influenza.

b) Live-attenuated vaccines: Live-attenuated basically means alive, but very weak. These vaccines are made when the virus is weakened to such a level that it reproduces only about 20 times in the body. 
When the vaccine is made, the virus or bacteria is weakened in a laboratory to the point where it's alive and able to reproduce, but can't cause serious illness. Its presence is enough to cause the immune system to produce antibodies to fight off the particular disease in the future. They typically provoke more durable immunological responses and are preferred for healthy adults.
Examples include the viral diseases yellow fever, measles, rubella, and mumps and the bacterial disease typhoid. 

c) Recombinant Vector vaccine – by combining the physiology of one micro-organism and the DNA of the other, immunity can be created against diseases that have complex infection processes.

d) DNA vaccination – in recent years a new type of vaccine, the DNA vaccine, has been created from an infectious agent's DNA. In complex diseases the DNA of the infectious agent keeps changing, so no conventional vaccine works against it. A DNA vaccine works by inserting viral or bacterial DNA into human or animal cells and expressing it there, triggering immune system recognition. Some cells of the immune system that recognise the proteins expressed will mount an attack against these proteins and the cells expressing them. 

Wednesday, April 17, 2013

Satellite Communication


 I. Introduction
An artificial satellite is a manufactured object that continuously orbits the Earth or some other body in space. Most artificial satellites orbit the Earth. People use them to study the universe, help forecast weather, transfer telephone calls over the oceans, assist in the navigation of ships and aircraft, monitor crops and other resources, and support military activities.

Artificial satellites also have orbited the moon, the sun, asteroids, and the planets Venus, Mars, and Jupiter. Such satellites mainly gather information about the bodies they orbit.
Piloted spacecraft in orbit, such as space capsules, space shuttle orbiters, and space stations, are also considered artificial satellites. Artificial satellites differ from natural satellites, natural objects that orbit a planet. Earth's moon is a natural satellite.

The Soviet Union launched the first artificial satellite, Sputnik 1, in 1957. Since then, the United States and about 40 other countries have developed, launched, and operated satellites. Today, about 3,000 useful satellites and 6,000 pieces of space junk are orbiting Earth.

II. The early birds of the satellite story
  
The first artificial satellite, Sputnik 1, was launched by the Soviet Union on 4 October 1957. Sputnik 1 helped to identify the density of high atmospheric layers through measurement of its orbital change and provided data on radio-signal distribution in the ionosphere. The success of Sputnik ignited the so-called Space Race within the Cold War.

Sputnik 2 was launched on November 3, 1957 and carried the first living passenger into orbit, a dog named Laika.

Explorer 1 became the United States' first satellite on January 31, 1958.

In June 1961, three-and-a-half years after the launch of Sputnik 1, the Air Force used resources of the United States Space Surveillance Network to catalog 115 Earth-orbiting satellites.

The largest artificial satellite currently orbiting the Earth is the International Space Station.

III. Types of Satellite


  1. Killer satellites are satellites that are armed and designed to take out enemy warheads, satellites and other space assets. They may have particle weapons, energy weapons, kinetic weapons, nuclear and/or conventional missiles, or a combination of these weapons. Anti-satellite weapons (ASATs) are space weapons designed to incapacitate or destroy satellites for strategic military purposes. Currently, only the USA, the former USSR and the People's Republic of China are known to have developed these weapons.
  2. Astronomical satellites are satellites used for observation of distant planets, galaxies, and other outer space objects.
  3. Biosatellites are satellites designed to carry living organisms, generally for scientific experimentation.
  4. Communication satellites are satellites stationed in space for the purpose of telecommunications. Modern communication satellites typically use geosynchronous orbits, Molniya orbits or Low Earth orbits (polar and non-polar Earth orbits). For fixed (point-to-point) services, communication satellites provide a microwave radio relay technology complementary to that of submarine communication cables. They are also used for mobile applications such as communications to ships, vehicles, planes and hand-held terminals, and for TV and radio broadcasting, for which application of other technologies, such as cable, is impractical or impossible.
  5. Miniaturized satellites are satellites of unusually low weights and small sizes. New classifications are used to categorize these satellites: minisatellite (200–500 kg), microsatellite (below 200 kg), nanosatellite (below 10 kg).
  6. Navigational satellites are satellites which use radio time signals transmitted to enable mobile receivers on the ground to determine their exact location. The relatively clear line of sight between the satellites and receivers on the ground, combined with ever-improving electronics, allows satellite navigation systems to measure location to accuracies on the order of a few metres in real time.
  7. Reconnaissance satellites are Earth observation satellite or communications satellite deployed for military or intelligence applications. Little is known about the full power of these satellites, as governments who operate them usually keep information pertaining to their reconnaissance satellites classified.
  8. Earth observation satellites are satellites intended for non-military uses such as environmental monitoring, meteorology, map making etc.
  9. Space stations are man-made structures that are designed for human beings to live in outer space. A space station is distinguished from other manned spacecraft by its lack of major propulsion or landing facilities — instead, other vehicles are used as transport to and from the station. Space stations are designed for medium-term living in orbit, for periods of weeks, months, or even years.
  10. Tether satellites are satellites which are connected to another satellite by a thin cable called a tether.
  11. Weather satellites are primarily used to monitor Earth's weather and climate. 

IV. Communication Satellite
A communications satellite (comsat) is an artificial satellite stationed in space for the purposes of telecommunications. Modern communications satellites use a variety of orbits including geostationary orbits, Molniya orbits, other elliptical orbits and low (polar and non-polar) Earth orbits.
For fixed (point-to-point) services, communications satellites provide a microwave radio relay technology complementary to that of submarine communication cables. They are also used for mobile applications such as communications to ships, vehicles, planes and hand-held terminals, and for TV and radio broadcasting, for which application of other technologies, such as cable, is impractical or impossible.

Monday, April 15, 2013

Microsoft Surface

It is a multi-touch product from Microsoft which has been developed as software and hardware combination technology that allows a user, or multiple users, to manipulate digital content by the use of natural motions, hand gestures, or physical objects. The product provides effortless interaction with digital content through natural gestures, touch and physical objects.

The system is composed of a horizontal touch screen under a coffee table-like surface, with cameras mounted below to detect user interaction activities. All interface components such as dialogs, mouse pointer, and windows, are replaced with circles and rectangles outlining "objects" that are manipulated via drag and drop. The "objects" in question can be either virtual objects displayed on the screen, or physical objects such as cell phones, digital cameras, and PDAs placed on the screen. Physical objects are automatically identified and connected to the Surface computer upon their placement on the screen. With no interface text, the Surface computer can be used by speakers of any language and any competency level.

What is Surface Computing?

Surface computing breaks down traditional barriers between people and technology, changing the way people interact with all kinds of everyday content, from photos to maps to menus. The intuitive user interface works without a traditional mouse or keyboard, allowing people to interact with content and information by using their hands and natural movements.

Surface computing features four key attributes:

•    Direct interaction:

Users can actually "grab" digital information with their hands, interacting with content by touch and gesture, without the use of a mouse or keyboard.

•    Multi-touch contact:

Surface computing recognizes many points of contact simultaneously, not just from one finger as with a typical touch screen, but up to dozens and dozens of items at once.

•    Multi-user experience:

The horizontal form factor makes it easy for several people to gather around surface computers together, providing a collaborative, face-to-face computing experience.

•    Object recognition:

Users can place physical objects on the surface to trigger different types of digital responses, including the transfer of digital content.

How does Microsoft Surface work?

At a high level, Surface uses cameras to sense objects, hand gestures and touch. This user input is then processed and the result is displayed on the surface using rear projection.

1.    Screen: A diffuser turns the Surface's acrylic tabletop into a large horizontal "multitouch" screen, capable of processing multiple inputs from multiple users. The Surface can also recognize objects by their shapes or by reading coded "domino" tags.

2.    Infrared: Surface's "machine vision" operates in the near-infrared spectrum, using an 850-nanometer-wavelength LED light source aimed at the screen. When objects touch the tabletop, the light reflects back and is picked up by multiple infrared cameras with a net resolution of 1280 x 960.

3.    CPU: Surface uses many of the same components found in everyday desktop computers — a Core 2 Duo processor, 2GB of RAM and a 256MB graphics card. Wireless communication with devices on the surface is handled using Wi-Fi and Bluetooth antennas (future versions may incorporate RFID or Near Field Communications). The underlying operating system is a modified version of Microsoft Vista.

4.    Projector: Microsoft's Surface uses the same DLP light engine found in many rear-projection HDTVs. The footprint of the visible light screen, at 1024 x 768 pixels, is actually smaller than the invisible overlapping infrared projection to allow for better recognition at the edges of the screen.

Carbon Footprint

A ‘carbon footprint’ is a measure of the greenhouse gas emissions associated with an activity, group of activities or a product.  Basically Carbon footprint is a measure of the impact of our activities on the environment, and in particular on climate change. It relates to the amount of greenhouse gases we are producing in our day-to-day lives through burning fossil fuels for electricity, heating, transportation, etc.

Nearly everything that we do produces greenhouse gas (GHG) emissions either directly or indirectly; whether it is getting to work, watching TV or buying our lunch. The most important greenhouse gas produced by human activities is carbon dioxide.  Direct GHG emissions sources are often easy to identify – for example burning fossil fuels for electricity generation, heating and transport.   It is sometimes less obvious that products and services also cause indirect emissions throughout their life-cycles.  Energy is required for production and transport of products, and greenhouse gases are also released when products are disposed of at the end of their useful lives.  
Thus a carbon footprint is the total set of greenhouse gases (GHG) emissions caused by an organization, event or product. It is expressed in terms of the amount of carbon dioxide, or its equivalent of other GHGs, emitted.

Today humanity uses the equivalent of 1.5 Earths to provide the resources we use and to absorb the waste we produce. This means it now takes the Earth one year and six months to regenerate what we use in a year.

Moderate UN scenarios suggest that if current population and consumption trends continue, by the 2030s, we will need the equivalent of two Earths to support us. And of course, we only have one.

We are turning resources into waste faster than waste can be turned back into resources. This puts us in global ecological overshoot, depleting the very resources on which human life and biodiversity depend.

The result is collapsing fisheries, diminishing forest cover, depletion of fresh water systems, and the buildup of carbon dioxide emissions, which creates problems like global climate change. Overshoot also contributes to resource conflicts and wars, mass migrations, famine, disease and other human tragedies—and tends to have a disproportionate impact on the poor, who cannot buy their way out of the problem by getting resources from somewhere else.

A carbon footprint is made up of the sum of two parts: the primary footprint and the secondary footprint.

1. The primary footprint is a measure of the direct emissions of CO2 from the burning of fossil fuels, including domestic energy consumption and transportation (e.g. car and plane).

2. The secondary footprint is a measure of the indirect CO2 emissions from the whole lifecycle of products we use - those associated with their manufacture and eventual breakdown. 
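
A minimal sketch of that primary + secondary split; all categories and figures below are made-up illustrative numbers in kilograms of CO2-equivalent per year:

```python
primary = {                    # direct emissions from fuels we burn ourselves
    "car_fuel":       1200.0,
    "home_heating":    900.0,
    "electricity":    1500.0,
}
secondary = {                  # indirect emissions embodied in products and services
    "food":           1800.0,
    "consumer_goods":  700.0,
    "waste_disposal":  150.0,
}

footprint_kg = sum(primary.values()) + sum(secondary.values())
print(f"Total footprint: {footprint_kg / 1000:.2f} tonnes CO2e per year")  # 6.25
```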

Sunday, April 14, 2013

Cell phone batteries

The battery is the lifeblood of a mobile phone. All batteries contain one or more cells. A cell is the working chemical unit inside a battery. A cell has three main parts: a positive electrode (terminal), a negative electrode, and a liquid or solid separating them called the electrolyte.

How do rechargeable batteries work?

Batteries convert stored chemical energy into electrical energy. This is achieved by causing electrons to flow whenever there is a conductive path between the cell's electrodes.  Electrons flow as a result of a chemical reaction between the cell's two electrodes that are separated by an electrolyte. When a battery is connected to an electric circuit, a chemical reaction takes place in the electrolyte causing ions to flow through it one way, with electrons flowing through the outer circuit in the other direction. This movement of electric charge makes an electric current flow through the cell and through the circuit it is connected to.

The cell becomes exhausted when the active materials inside the cell are depleted and the chemical reactions become slow. The voltage provided by a cell depends on the electrode material, their surface area and material between the electrodes (electrolyte). Current flow stops when the connection between the electrodes is removed.

Rechargeable cells operate on the same principle, except that the chemical reaction that occurs is reversed while charging. When connected to an appropriate charger, cells convert electrical energy back into potential chemical energy. The process is repeated every time the cell is discharged and recharged.

The capacity of cells is expressed in amp-hours (Ah) or milliamp-hours (mAh).
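
As a rough illustration of what that rating means, runtime can be estimated by dividing capacity by average current draw; the figures below are example values, not the specification of any particular phone:

```python
capacity_mah = 1500        # battery capacity in milliamp-hours (example value)
avg_current_ma = 120       # average current drawn by the phone in mA (example value)

runtime_hours = capacity_mah / avg_current_ma
print(f"Approximate runtime: {runtime_hours:.1f} hours")   # 12.5 hours
```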


There are four basic types of rechargeable battery used in mobile phones:

•    Nickel Cadmium (NiCd) Batteries


Nickel Cadmium cell phone batteries are based on old technology.  
NiCd batteries suffer from memory effect (happens when rechargeable batteries are not fully discharged between charge cycles; as a result the battery “remembers” the shortened cycle and is thus reduced in capacity). They must be completely discharged before recharging or else damage can occur. The chemicals used in NiCd batteries are not environmentally friendly, and the disposal of cadmium-rich waste is an increasing problem.



They are the cheapest variety of phone batteries. Their affordability helps to bring down the overall cost of mobile phones.


•    Nickel Metal Hydride (NiMH) Batteries


Nickel Metal Hydride (NiMH) batteries claim to be superior to NiCd because they don't contain cadmium. The cell phone batteries are made from non-toxic materials and are environmentally friendly. They also deliver a higher capacity in relation to their size and weight.


NiMH cell phone batteries are a relatively new technology and are prone to the “memory effect”, but only to a very small extent. To maximize performance, it is advised to completely discharge the battery after every 20th recharge.


•    Lithium Ion (Li-Ion) Batteries


This is the current and most popular technology for cell phone batteries. The only real drawback of Lithium Ion cell phone batteries is that they are expensive. As such, they tend to be supplied with only top-of-the-line phones. Lithium Ion batteries are slightly lighter than NiMH batteries, but they also have a longer lifetime.


A Lithium Ion battery may be damaged by extensive overcharging (continuously on a cell phone charger for more than 24 hours).


•    Lithium Polymer (Li-Poly) Batteries


Li-Poly Batteries are the newest and most advanced technology for cell phone batteries. Ultra-lightweight, they do not suffer from memory effect and will deliver up to 40% more battery capacity than a Nickel Metal Hydride (NiMH) of the same size.
