Wednesday, February 27, 2008
Building-integrated photovoltaics (BIPV)
Building-integrated photovoltaics (BIPV) are photovoltaic materials that are used to replace conventional building materials in parts of the building envelope such as the roof, skylights, or facades. They are increasingly being incorporated into the construction of new buildings as a principal or ancillary source of electrical power, although existing buildings may be retrofitted with BIPV modules as well. The advantage of integrated photovoltaics over more common non-integrated systems is that the initial cost can be offset by reducing the amount spent on building materials and labor that would normally be used to construct the part of the building that the BIPV modules replace.
Friday, February 22, 2008
Seismic Intelligence
US Army Field Manual 2-0 defines seismic intelligence as "the passive collection and measurement of seismic waves or vibrations in the earth's surface." In the context of verification, seismic intelligence makes use of the science of seismology to locate and characterize nuclear testing, especially underground testing. Seismic sensors can also characterize large conventional explosions that are used in testing the high-explosive components of nuclear weapons.
Sunday, February 17, 2008
PID control
Apart from sluggish performance to avoid oscillations, another problem with proportional-only control is that power application is always in direct proportion to the error. In the example above we assumed that the set temperature could be maintained with 50% power. What happens if the furnace is required in a different application where a higher set temperature will require 80% power to maintain it? If the gain was ultimately set to a 50° PB, then 80% power will not be applied unless the furnace is 15° below setpoint, so for this other application the operators will have to remember always to set the setpoint temperature 15° higher than actually needed. This 15° figure is not completely constant either: it will depend on the surrounding ambient temperature, as well as other factors that affect heat loss from or absorption within the furnace.
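The offset described above can be sketched in a few lines. This is a minimal illustration rather than a real controller: the 50% bias and 50° proportional band come from the furnace example, while the function name and the setpoint values are invented for the sketch.

```python
def proportional_output(setpoint, measured, bias=50.0, pb=50.0):
    """Proportional-only controller output as a percentage of full power.

    pb is the proportional band in degrees: an error spanning the whole
    band swings the output by 100%, so gain = 100 / pb percent per degree.
    bias is the output applied at zero error (assumed 50%, as above).
    """
    gain = 100.0 / pb          # 50 degree PB -> 2% power per degree
    error = setpoint - measured
    out = bias + gain * error
    return max(0.0, min(100.0, out))   # clamp to 0-100% power

# At setpoint the controller applies only the 50% bias, so a furnace
# that actually needs 80% power settles 15 degrees below setpoint:
at_setpoint = proportional_output(500.0, 500.0)   # 50.0
fifteen_low = proportional_output(500.0, 485.0)   # 80.0
```

This makes the operators' workaround concrete: only by reading 15° low does the proportional term add the missing 30% of power.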
Friday, February 08, 2008
Sociology and Sociocybernetics
Systems theory has also been developed within sociology. An important figure in the sociological systems perspective as developed from GST is Walter Buckley, who drew on Bertalanffy's theory. Niklas Luhmann is also prominent in the literature of sociology and systems theory. Miller's living systems theory was particularly influential in sociology from the time of the early systems movement.
Models for equilibrium in systems analysis that contrasted classical views from Talcott Parsons and George Homans were influential in integrating concepts with the general movement. With renewed interest in systems theory on the rise since the 1990s, Bailey (1994) notes that the concept of systems in sociology dates back to Auguste Comte in the 19th century, and to Herbert Spencer and Vilfredo Pareto, and that sociology was approaching its centennial as the new systems theory was emerging following the World Wars.
Sunday, January 27, 2008
Autonomism
Autonomism is a term applied to a variety of social movements around the world that emphasize the ability to organize in autonomous and horizontal networks, as opposed to hierarchical structures such as unions or parties. Autonomist Marxists, including Harry Cleaver, broaden the definition of the working class to include salaried and unpaid labor, such as skilled professions and housework; the movement focuses on the working class in advanced capitalist states as the primary force of change in the construct of capital. Modern autonomist theorists such as Antonio Negri and Michael Hardt argue that networked power constructs are the most effective method of organization against the neoliberal regime of accumulation, and predict a massive shift in the dynamics of capital into a 21st-century Empire.
Thursday, January 17, 2008
DNA nanotechnology
DNA nanotechnology uses the unique molecular recognition properties of DNA and other nucleic acids to create self-assembling branched DNA complexes with useful properties. DNA is thus used as a structural material rather than as a carrier of biological information. This has led to the creation of two-dimensional periodic lattices (both tile-based as well as using the "DNA origami" method) as well as three-dimensional structures in the shapes of polyhedra. Nanomechanical devices and algorithmic self-assembly have also been demonstrated, and these DNA structures have been used to template the arrangement of other molecules such as gold nanoparticles and streptavidin proteins.
Thursday, January 10, 2008
Personality psychology
Personality psychology studies enduring psychological patterns of behavior, thought and emotion, commonly called an individual's personality. Theories of personality vary between different psychological schools. Trait theories attempt to break personality down into a number of traits by use of factor analysis. The number of traits has varied between theories. One of the first, and smallest, models was that of Hans Eysenck, which had three dimensions: extroversion-introversion, neuroticism-emotional stability, and psychoticism. Raymond Cattell proposed a theory of 16 personality factors. The theory with the most empirical evidence behind it today may be the "Big Five" theory, proposed by Lewis Goldberg and others.
A different, but well known, approach to personality is that of Sigmund Freud, whose structural theory of personality divided personality into the ego, superego, and id. Freud's theory of personality has been criticized by many, including many mainstream psychologists.
Friday, January 04, 2008
Positivism
Positivism is a philosophy that states that the only authentic knowledge is scientific knowledge, and that such knowledge can only come from positive affirmation of theories through strict scientific method. It was developed by Auguste Comte (widely regarded as the first sociologist) in the middle of the 19th century. In the early 20th century, logical positivism, a stricter and more logical version of Comte's basic thesis, sprang up in Vienna and grew to become one of the dominant movements in American and British philosophy. The positivist view is sometimes referred to as a scientistic ideology, and is often shared by technocrats, who believe in the necessity of progress through scientific advance, and by naturalists, who argue that any method for gaining knowledge should be limited to natural, physical, and material approaches. As an approach to the philosophy of science deriving from Enlightenment thinkers like Pierre-Simon Laplace (and many others), positivism was first systematically theorized by Comte, who saw the scientific method as replacing metaphysics in the history of thought, and who observed the circular dependence of theory and observation in science.
Friday, December 28, 2007
AIBO
AIBO (Artificial Intelligence roBOt; also a homonym of aibō, "companion" in Japanese) is one of several types of robotic pets designed and manufactured by Sony; there have been several different models since their introduction in 1999. Able to walk, "see" its environment via camera, and recognize spoken commands, AIBOs are considered autonomous robots, since they are able to learn and mature based on external stimuli from their owner, their environment, or other AIBOs. Artist Hajime Sorayama created the initial designs for the AIBO.
On January 26, 2006 Sony announced that it would discontinue AIBO and several other products as of March 2006, and that it would also stop development of the QRIO robot. AIBO will still be supported until 2013 (the ERS-7 model), however, and AIBO technology will continue to be developed for use in other consumer products. AIBOware (a trademark of Sony Corporation) is the name given to the software the AIBO runs from its pink Memory Stick. The Life AIBOware allows the robot to be raised from pup to fully grown adult while going through various stages of development as its owner interacts with it.
Wednesday, December 19, 2007
Computer networking
Computer networking is the engineering discipline concerned with communication between computer systems or devices. Networking, routers, routing protocols, and networking over the public Internet have their specifications defined in documents called RFCs. Computer networking is sometimes considered a sub-discipline of telecommunications, computer science, information technology and/or computer engineering. Computer networks rely heavily upon the theoretical and practical application of these scientific and engineering disciplines.
A computer network is any set of computers or devices connected to each other with the ability to exchange data.
Wednesday, December 12, 2007
History of nanotechnology
The first use of the concepts in 'nano-technology' (but predating use of that name) was in "There's Plenty of Room at the Bottom," a talk given by physicist Richard Feynman at an American Physical Society meeting at Caltech on December 29, 1959. Feynman described a process by which the ability to manipulate individual atoms and molecules might be developed, using one set of precise tools to build and operate another proportionally smaller set, so on down to the needed scale. In the course of this, he noted, scaling issues would arise from the changing magnitude of various physical phenomena: gravity would become less important, surface tension and Van der Waals attraction would become more important, etc.
Feynman's basic idea appears feasible, and exponential assembly enhances it with parallelism to produce a useful quantity of end products. The term "nanotechnology" was defined by Tokyo Science University Professor Norio Taniguchi in a 1974 paper (N. Taniguchi, "On the Basic Concept of 'Nano-Technology'," Proc. Intl. Conf. Prod. Eng. Tokyo, Part II, Japan Society of Precision Engineering, 1974) as follows: "'Nano-technology' mainly consists of the processing of, separation, consolidation, and deformation of materials by one atom or by one molecule." In the 1980s the basic idea of this definition was explored in much more depth by Dr. K. Eric Drexler, who promoted the technological significance of nano-scale phenomena and devices through speeches and the books Engines of Creation: The Coming Era of Nanotechnology (1986) and Nanosystems: Molecular Machinery, Manufacturing, and Computation (1992), and so the term acquired its current sense.
Thursday, December 06, 2007
Broadcast messages and paging
Practically every cellular system has some kind of broadcast mechanism. This can be used directly to distribute information to multiple mobiles, but in mobile telephony systems, for example, the most important use of broadcast information is to set up channels for one-to-one communication between the mobile transceiver and the base station. This is called paging.
The details of the process of paging vary somewhat from network to network, but normally the network knows only a limited group of cells where the phone is located (this group of cells is called a Location Area in the GSM or UMTS system, or a Routing Area if a data packet session is involved). Paging takes place by sending the broadcast message to all of those cells. Paging messages can also be used for information transfer. This happens in pagers, in CDMA systems for sending SMS messages, and in the UMTS system, where it allows for low downlink latency in packet-based connections.
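The paging procedure described above can be sketched as a toy simulation. This is a hypothetical illustration, not any real protocol implementation: the location-area layout, cell names, and function name are all invented.

```python
# Toy model of paging: the network knows only the phone's location
# area, so it broadcasts the page to every cell in that area; the
# phone answers from the single cell it is actually camped on.

NETWORK = {                      # location area -> cells it contains
    "LA1": ["cell_a", "cell_b", "cell_c"],
    "LA2": ["cell_d", "cell_e"],
}

def page(phone_id, location_area, camped_cell):
    """Broadcast a paging message (carrying phone_id) to all cells of a
    location area; return the cell the phone responds from, or None."""
    for cell in NETWORK[location_area]:   # broadcast step: every cell pages
        if cell == camped_cell:           # only the camped cell gets a reply
            return cell
    return None                           # phone not in this location area
```

The point of the sketch is the trade-off paging embodies: the network need not track the phone per-cell, at the cost of broadcasting to every cell in the area.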
Our taxi network is a good example here. The broadcast capability is often used to announce road conditions and also to announce work that is available to any driver. On the other hand, there is typically a list of taxis waiting for work. When a particular taxi comes up for work, the operator calls its number over the air. The taxi driver acknowledges that they are listening, and the operator then reads out the address where the taxi driver has to go.
Wednesday, November 28, 2007
What is Java bytecode?
Java bytecode is the form of instructions that the Java virtual machine executes. Every bytecode opcode is one byte in length (hence the name), so the number of distinct opcodes is limited to 256. Not all 256 possible opcode values are used. In fact, Sun Microsystems, the original creators of the Java programming language, the Java virtual machine and the other components of the Java Runtime Environment, have set aside a number of values to be permanently unimplemented.
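The one-byte constraint and the reserved values can be illustrated concretely. The opcode values below are a small sample drawn from the Java Virtual Machine Specification; the dictionary names are just for the sketch.

```python
# A few well-known JVM opcode values; each opcode is a single byte,
# so all values fall in the range 0x00-0xff (0-255).
OPCODES = {
    0x00: "nop",
    0x01: "aconst_null",
    0x60: "iadd",
    0xac: "ireturn",
    0xb1: "return",
}

# Three values are reserved by the specification and must never
# appear in a valid class file (the "permanently unimplemented" set):
RESERVED = {0xca: "breakpoint", 0xfe: "impdep1", 0xff: "impdep2"}

assert all(0x00 <= op <= 0xff for op in OPCODES)   # one byte each
```

The reserved trio is for tooling: `breakpoint` is used by debuggers, and the two `impdep` values are back doors for implementation-specific use inside a JVM, which is why they may never be emitted into class files.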
Wednesday, November 21, 2007
A short History of C Language
The C programming language was designed by Dennis Ritchie in the early 1970s at Bell Laboratories. It was first used as the system implementation language for the nascent UNIX operating system. The most important motivation for devising C was to overcome the limitations of B, which was itself derived from the type-less language BCPL (Basic Combined Programming Language). C advanced on B and BCPL by including type checking. It was initially intended for use in writing compilers for other languages.
Wednesday, November 14, 2007
What do you mean by Superbike racing?
Superbike racing is a category of motorcycle road racing that employs modified production motorcycles, in the same way that touring car racing employs production cars. Numerous countries, such as the USA, the United Kingdom, Australia and Canada, operate national superbike championships, and a World Superbike championship has run since 1988.
The Superbike category is extremely popular with manufacturers. Because the race bikes are built from production road bikes, the marketing value of a Superbike victory is significant. The rules governing how extensively the machine may be modified vary significantly between competitions. The AMA Superbike series allows substantial modification of the machine, including modifications to elements of the engine block. By contrast, World Superbike is significantly stricter, and since 2004 this series has also featured a control Pirelli tire. Until then it was common for riders from domestic championships to enter their country's race as a wildcard: Makoto Tamada and Shane Byrne are among many riders to beat the regulars in these one-off races, and both went on to greater success.
Thursday, November 01, 2007
Relay race Game
During a relay race, members of a team take turns running or swimming (generally carrying a baton in running events) parts of a circuit, or performing a certain action. Relay races take the form of both professional races and amateur games. In the Olympic Games, there are several types of relay races that are part of track and field.
Based on the speed of the runners, the generally accepted strategy for setting up a four-person relay team is: second fastest, third fastest, slowest, then fastest (the anchor). Each segment of the relay race is referred to as a leg.
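The ordering rule above is simple enough to state as a function. This is a sketch of the stated convention only; the function name and the example times are invented for illustration.

```python
def relay_order(times):
    """Given each of four runners' times for a leg (lower is faster),
    return their times in the conventional running order:
    second fastest, third fastest, slowest, fastest (anchor)."""
    fastest, second, third, slowest = sorted(times)  # fastest first
    return [second, third, slowest, fastest]

# Example with four hypothetical 100 m leg times (seconds):
order = relay_order([10.2, 10.5, 10.9, 11.3])
# -> [10.5, 10.9, 11.3, 10.2]: the fastest runner anchors the final leg
```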
Saturday, October 27, 2007
Speed limit and the Design speed for vehicle
A speed limit is the maximum speed allowed by law for vehicles on a road. Speed limits are only peripherally connected to the design speed of the road. In the United States, the design speed is "a selected speed used to determine the various geometric design features of the roadway," according to the 2001 AASHTO Green Book highway design manual. This definition was altered from previous versions, which considered it the "maximum safe speed that can be maintained over a specified section of highway when conditions are so favorable that the design features of the highway govern."
The design speed has largely been discredited as a sole basis for establishing a speed limit. Current U.S. standards for design speed derive from outdated, less capable automotive technology. In addition, the design speed of a given roadway is the theoretical maximum safe speed of the roadway's worst feature (e.g., a curve, bottleneck, or hill). The design speed therefore generally underestimates the maximum safe speed for a roadway and is consequently considered only a very conservative "first guess" at a limit.
Thursday, October 18, 2007
Properties and uses of glasses
Glass can be made transparent and flat, or into other shapes and colors, as shown in this ball from the Verrerie of Bréhat in Brittany. One of the most obvious characteristics of ordinary glass is that it is transparent to visible light. The transparency is due to an absence of electronic transition states in the energy range of visible light, and to the fact that such glass is homogeneous on all length scales greater than about a wavelength of visible light. Ordinary glass does not allow light at wavelengths shorter than 400 nm, also known as ultraviolet light or UV, to pass. This is due to the addition of compounds such as soda ash (sodium carbonate).
Pure SiO2 glass (also called fused quartz) does not absorb UV light and is used for applications that require transparency in this region, although it is more costly. This kind of glass can be made so pure that hundreds of kilometers of glass are transparent at infrared wavelengths in fiber optic cables. Individual fibers are given a uniformly transparent cladding of SiO2/GeO2 glass, which has only slightly different optical properties (the germanium contributing to a lower index of refraction). Undersea cables have sections doped with erbium, which amplify transmitted signals by laser emission from within the glass itself.
Amorphous SiO2 is also used as a dielectric material in integrated circuits, owing to the smooth and electrically neutral interface it forms with silicon. Glasses used for making optical devices are commonly categorized by a letter-number code from the Schott Glass catalog. For example, BK7 is a low-dispersion borosilicate crown glass, and SF10 is a high-dispersion dense flint glass. The glasses are characterized by composition, refractive index, and Abbe number.
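The Abbe number mentioned above has a simple definition worth spelling out: V_d = (n_d - 1) / (n_F - n_C), where n_d, n_F, and n_C are the refractive indices at the yellow helium d line and the blue and red hydrogen F and C lines. A higher Abbe number means lower dispersion. The BK7 index values below are approximate catalog figures and should be checked against a current Schott data sheet.

```python
def abbe_number(n_d, n_F, n_C):
    """Abbe number V_d = (n_d - 1) / (n_F - n_C).

    n_d: refractive index at the helium d line (587.6 nm)
    n_F: index at the hydrogen F line (486.1 nm, blue)
    n_C: index at the hydrogen C line (656.3 nm, red)
    """
    return (n_d - 1.0) / (n_F - n_C)

# Approximate indices for Schott BK7 crown glass (assumed values):
v_bk7 = abbe_number(1.5168, 1.52238, 1.51432)   # roughly 64
```

The roughly-64 result places BK7 firmly among the low-dispersion crowns; dense flints such as SF10 come out near half that, which is why crown-flint pairs are used to cancel chromatic aberration in achromatic doublets.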
Glass is sometimes formed naturally from volcanic magma. This glass is called obsidian, and is generally black with impurities. Obsidian is a raw material for flint knappers, who have used it to make extremely sharp knives since the Stone Age. Obsidian collection is prohibited by law in some places (including parts of the United States), but the same tool-making techniques can be applied to industrially made glass.
Thursday, October 11, 2007
The facts about Venus
Venus (Greek: Aphrodite; Babylonian: Ishtar) is the goddess of love and beauty. The planet is so named most likely because it is the brightest of the planets known to the ancients. Venus has been known since prehistoric times. It is the brightest object in the sky except for the Sun and the Moon. Like Mercury, it was commonly thought to be two separate bodies: Eosphorus as the morning star and Hesperus as the evening star, but the Greek astronomers knew better.
Venus' rotation is somewhat unusual in that it is both very slow (243 Earth days per Venus day, slightly longer than Venus' year) and retrograde. Additionally, the periods of Venus' rotation and of its orbit are synchronized such that it always presents the same face toward Earth when the two planets are at their closest approach. Whether this is a resonance effect or merely a coincidence is not known.
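The slow retrograde rotation has a curious consequence worth working out: because the rotation runs opposite to the orbital motion, the two rates add, and a solar day on Venus (noon to noon) is much shorter than the 243-day sidereal rotation. The sketch below computes it from the figures given above; the function name is invented for the illustration.

```python
def solar_day(sidereal_rotation, orbital_period):
    """Solar day length for a retrograde rotator: the spin and orbital
    angular rates add, so 1/day_solar = 1/rotation + 1/year."""
    return 1.0 / (1.0 / sidereal_rotation + 1.0 / orbital_period)

# Venus: 243-day retrograde rotation, 224.7-day year
venus_day = solar_day(243.0, 224.7)   # about 117 Earth days
```

So although Venus takes 243 Earth days to spin once relative to the stars, the Sun crosses its sky roughly every 117 Earth days.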
Venus is at times regarded as Earth's sister planet. In some ways they are especially similar:
* Venus is only somewhat smaller than Earth (95% of Earth's diameter, 80% of Earth's mass).
* Both have a small number of craters indicating relatively young surfaces.
* Their densities and chemical compositions are alike.
Because of these similarities, it was once thought that below its dense clouds Venus might be very Earthlike and might even harbor life. Unfortunately, more detailed study of Venus reveals that in many important ways it is radically different from Earth. It may be the least hospitable place for life in the solar system.
The pressure of Venus' atmosphere at the surface is 90 atmospheres (about the same as the pressure at a depth of 1 km in Earth's oceans). It is composed mostly of carbon dioxide. There are several layers of clouds many kilometers thick composed of sulfuric acid. These clouds completely obscure our view of the surface. This dense atmosphere produces a runaway greenhouse effect that raises Venus' surface temperature by about 400 degrees to over 740 K (hot enough to melt lead). Venus' surface is actually hotter than Mercury's despite being nearly twice as far from the Sun. The oldest terrains on Venus appear to be about 800 million years old; extensive volcanism at that time wiped out the earlier surface, including any large craters from early in Venus' history.