Archive for March 2016

Racing Cars by Nitesh Kumar

 
One of the most exciting and breathtaking spectacles of the modern world is auto racing. Beyond being high-end entertainment, racing has also contributed to research and engineering. Since the first contest in 1887, the race car has seen extensive design improvements motivated by both performance and the regulations of the racing industry. Alongside this, designers have experimented and innovated to meet their clients' style requirements, which has resulted in the various families of race cars.

 

The first step toward understanding and enjoying racing is to form an overview of the types of race cars.

The first type is the sports prototype, used in sports car racing. It is effectively the next design and technological step up from road-going sports cars and, along with open-wheel cars, represents the pinnacle of racing-car design. These cars are purpose-built racing machines with enclosed wheels and either open or closed cockpits. Since the World Sportscar Championship was conceived, there have been various regulations on bodywork, engine style and size, tires and aerodynamics to which these cars must be built. Sports prototypes may be (and often are) one-of-a-kind machines and need bear no relation to any road-going vehicle. The ACO regulations now recognize two categories of sports prototype: P1 and P2. Cars competing in the P1 category must weigh no less than 900 kg and are limited to 6000 cc naturally aspirated or 4000 cc turbocharged engines. 5500 cc turbo-diesel engines are also permitted in P1; Audi scored Le Mans victories with such a car in 2006, 2007 and 2008, and Peugeot returned to racing in 2007 with a similarly powered car (the Peugeot 908). P2 cars can weigh much less (first 675 kg, then 750 kg and now 825 kg) but are restricted to 3400 cc V6 or V8 normally aspirated or 2000 cc turbocharged powerplants.

Another popular type of race car is the Grand Touring (GT) car. Grand Touring (from the Italian Gran Turismo) racing is the most common form of sports car racing and is found all over the world, in both international and national series (the No. 35 Maserati MC12 GT1 that ran at the 2005 Grand Prix of Atlanta is a well-known example). When GT racing revived after the collapse of the World Sports Car Championship at the end of 1992, the lead in defining rules was taken by the ACO. Under the ACO rules, Grand Touring cars are divided into two categories, Grand Touring 1 (GT1, formerly GTS) and Grand Touring 2 (GT2, formerly GT). As the name of the class implies, the exterior of the car closely resembles that of the production version, while the internal fittings may differ greatly. GT2 cars are very similar to the FIA GT2 classification and are ‘pure’ GT cars, that is, production exotic cars with relatively few internal modifications for racing. The Porsche 911 is currently the most popular car in the GT2 class.

 

Admiration for race cars, like the creative design changes themselves, will keep growing and will keep contributing to sport, entertainment and advanced technological study.

 

 

Contextualized Analytics by Boudhayan Bhattacharya

The growth of information generated by big data, digital engagement platforms and Internet of Things (IoT) devices means that contextualized analytics will be a prevailing force in 2016. In literature, context describes the setting, the prelude and the scene where characters play their respective parts. In an organization, contextual data provides the same type of picture. Data points such as device, location, language, social network or influencers help enterprises develop personalized products, improved insights or services, or even suggest specific actions. The increase in context will allow enterprises to create a more integrated and valuable information experience for employees, clients, partners and citizens.

 

Telematics data gathered from vehicles, for example, will help automakers improve the durability of components and identify problems, notifying drivers before trouble occurs. Insurers will be able to manage risks even better and offer drivers personalized, usage-based policies. Taken a step further in insurance, the whole theory of pooling risks may vanish, because the data revolution will enable insurers to underwrite down to the individual level.

 

Contextualized analytics is an excellent way to get the most out of large volumes of data. Facebook and Twitter, for example, are excellent sources of data, but detecting patterns in order to make sense of all the available information is time-consuming and highly complicated because the repository is huge. Making sense of this data requires proper analysis to create valuable insights, and that also means keeping some context in mind; otherwise only a generalized conclusion will be drawn. Applying contextualized analytics can thus help emphasize the individuality of each consumer and his or her behaviour.
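As a minimal sketch of what "keeping some context in mind" can look like in practice, the snippet below contrasts a context-free tally of user actions with the same actions broken down by device and location. The event records and field names are invented for illustration only.

```python
from collections import Counter, defaultdict

# Hypothetical engagement events; the fields (device, location, action)
# mirror the context data points mentioned above and are made up here.
events = [
    {"device": "mobile", "location": "Kolkata", "action": "purchase"},
    {"device": "mobile", "location": "Kolkata", "action": "browse"},
    {"device": "desktop", "location": "Mumbai", "action": "purchase"},
    {"device": "mobile", "location": "Mumbai", "action": "browse"},
    {"device": "desktop", "location": "Kolkata", "action": "browse"},
]

# Context-free view: one generalized conclusion for everybody.
overall = Counter(e["action"] for e in events)
print("Overall:", dict(overall))

# Contextualized view: the same actions broken down by (device, location),
# which is where individual behaviour starts to show.
by_context = defaultdict(Counter)
for e in events:
    by_context[(e["device"], e["location"])][e["action"]] += 1

for context, actions in by_context.items():
    print(context, dict(actions))
```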

 

Almost every organization in the digital industry uses some web analytics software, but that alone does not allow them to fully understand the cultural and psychological factors that influence customers' lifestyles. The realization that data cannot be trusted blindly has been gaining ground for a while. In an article entitled “What Data Can’t Do,” David Brooks of The New York Times points out that the main issue with big data is that it is “pretty bad at narrative and emergent thinking” and cannot match the explanatory suppleness of even a mediocre novel. The best way to deal with this problem is to follow the words of Scott Gnau of Teradata Labs: “big data is a new piece, but it is not the only piece of the data puzzle.” Context-driven analytics can unlock the potential stored within big data; by contextualizing the data at hand, customer insights can be improved and the reasons behind common consumer behaviours can be understood. In this way, businesses can create experiences that genuinely surprise and delight their users.

Green Energy Corridors by Naveen Kumar Gupta

India has huge potential in renewable energy (RE) resources such as wind, solar and hydro. Most of the renewable potential lies in states such as Tamil Nadu, Karnataka, Andhra Pradesh, Gujarat, Maharashtra, Rajasthan and Himachal Pradesh, which account for 80 to 90% of the total renewable capacity installed in the country. Various fiscal incentives and policy initiatives have created interest in developing RE generation, and regulatory initiatives have also been taken to promote the sale of RE power. Until recently, the quantum of RE power was small, and it was considered that connectivity with the nearest grid substation of the State Transmission Utility (STU) would be sufficient for evacuation of the power, since the RE power was consumed locally. Emphasis has now been given to harnessing RE power on a larger scale, to achieve economies of scale as soon as possible, to supplement capacity addition from conventional sources and to support cleaner development. It is envisaged that more than 41,000 MW of renewable energy generation capacity will be added during the 12th Plan period.

Renewable energy resources are generally located in remote locations and are found in only a few states, so the grid infrastructure needs to be sufficient to transport the renewable energy to the load centers. Further, distribution licensees in each state, Captive Power Plants (CPP) and open access consumers must meet a specified percentage of their annual energy consumption from RE generation as part of their Renewable Purchase Obligations (RPO). In future scenarios, it is envisaged that home states will not be able to consume high-cost RE power within the state beyond their RPO requirements, and RE power will therefore have to be transmitted to other states. Development of both intra-state and inter-state transmission systems to meet the needs of renewable energy on a larger scale is therefore extremely necessary; without a significant increase in transmission capacity, not all of the renewable energy generated can be accommodated in the power system. Renewable energy is also characterized by variability and intermittency, and the intermittent or variable supply from RE sources creates complexity in the grid that needs to be addressed. Recognizing the criticality of large-scale development of RE capacity and its integration with the grid, the Ministry of New & Renewable Energy (MNRE) and the Forum of Regulators (FOR) have entrusted POWERGRID to carry out studies to identify the transmission infrastructure and other control requirements for the RE capacity addition program in the 12th Plan and to prepare a comprehensive report along with an estimate of the CAPEX requirement and a financing strategy.
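As a simple illustration of the RPO mechanism mentioned above, the sketch below checks a licensee's achieved renewable share against an obligation target and reports the shortfall. The obligation percentage and the energy figures are hypothetical; actual RPO targets are set by the regulators.

```python
# Minimal sketch of a Renewable Purchase Obligation (RPO) check.
# The obligation percentage and energy figures below are hypothetical.

def rpo_compliance(total_consumption_mwh: float,
                   re_purchased_mwh: float,
                   rpo_target_pct: float) -> dict:
    """Return the achieved RE share and the shortfall against the target."""
    achieved_pct = 100.0 * re_purchased_mwh / total_consumption_mwh
    shortfall_mwh = max(0.0, (rpo_target_pct - achieved_pct) / 100.0 * total_consumption_mwh)
    return {"achieved_pct": achieved_pct, "shortfall_mwh": shortfall_mwh}

# Example: a licensee consuming 1,000,000 MWh a year with a 10% RPO target.
print(rpo_compliance(total_consumption_mwh=1_000_000,
                     re_purchased_mwh=72_000,
                     rpo_target_pct=10.0))
```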

A Superconducting Fault Current Limiter (SFCL) by Subhajit Mukherjee, Dept. of EE

Modern electric power systems are becoming more and more complex in order to meet new needs. Nowadays high power quality is mandatory, and there is a need to integrate increasing amounts of on-site generation. All this translates into more sophisticated electric networks with intrinsically high short-circuit levels. Such networks are vulnerable in case of a fault, and special protection apparatus and procedures need to be developed in order to avoid costly or even irreversible damage.

 

A superconducting fault current limiter (SFCL) is a device with negligible impedance under normal operating conditions that reliably switches to a high-impedance state in case of overcurrent. Such a device can increase the short-circuit power of an electric network while at the same time eliminating the hazard during a fault, and it can be regarded as a key component for future electric power systems. The FCL can be used as a means of allowing more interconnection of MV bus-bars, of increasing immunity to the voltage disturbances induced by critical customers, and of integrating more distributed generation into the distribution grid. Because of these characteristics, an FCL is a fundamental component for future smart grids. Its main advantages, illustrated by the simple numerical sketch after the list below, are that it:

  • allows a high level of network interconnection, making possible the flexible link between producers, distribution operators and consumers that is required in the modern electricity market,
  • favours the grid connection of distributed generation (including non-programmable sources such as renewables),
  • allows higher power (voltage) quality, reducing interruptions, dips, harmonics and flicker, and
  • avoids the need to oversize components and avoids (or delays) the need to replace protection equipment in case of network expansion.
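The current-limiting effect can be pictured with a single-phase calculation: compare the prospective fault current with the limiter in its negligible-impedance state against the current once it has switched to its high-impedance state. The voltage level and impedance values below are invented for illustration and do not describe any particular SFCL design.

```python
# Illustrative single-phase sketch of the current-limiting effect of an SFCL.
# All values (voltage, impedances) are assumed demonstration figures.

V = 11_000 / 3**0.5                 # phase voltage of an 11 kV feeder [V]
Z_SOURCE = complex(0.05, 0.5)       # source plus line impedance [ohm]
Z_SFCL_NORMAL = complex(0.0, 1e-4)  # negligible impedance in the normal state
Z_SFCL_FAULT = complex(2.0, 4.0)    # high impedance after the limiter switches

def fault_current(z_limiter: complex) -> float:
    """RMS current for a bolted fault fed through the limiter."""
    return abs(V / (Z_SOURCE + z_limiter))

print(f"Prospective fault current: {fault_current(Z_SFCL_NORMAL) / 1e3:.1f} kA")
print(f"Limited fault current    : {fault_current(Z_SFCL_FAULT) / 1e3:.1f} kA")
```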

Electro Chemical Arsenic Remediation (ECAR) by Indranil Mookherjee, Department of Civil Engineering

The occurrence of high arsenic levels in ground water was first reported in 1978 in West Bengal, India. In West Bengal, 79 blocks in 8 districts have arsenic beyond the permissible limit of 0.05 mg/l. The most affected areas are on the eastern side of the Bhagirathi River in the districts of Malda, Murshidabad, Nadia, North 24 Parganas and South 24 Parganas, and on the western side in the districts of Howrah, Hugli and Bardhaman. Arsenic in ground water is confined mainly to aquifers up to 100 m deep; the deeper aquifers are free from arsenic contamination. As of 2009, about 162.6 lakh people (35.48% of the total population of the state) live in zones at risk of arsenic-related diseases. Today, removal technologies implemented at community level are well developed but often expensive, so research has focused on developing cheap, easy-to-handle removal technologies, especially for decentralized use in rural areas of developing countries. In light of the continuing prevalence of arsenic in drinking water in India and Bangladesh, researchers at Lawrence Berkeley National Laboratory have recognized the need to be innovative in both technology and implementation. This interdisciplinary approach has led to the development of an efficient, effective and low-cost electricity-based technology known as Electro Chemical Arsenic Remediation (ECAR) for rural India and Bangladesh. This technology has a number of advantages over chemical adsorbents, along with some additional challenges, such as the need for electricity. Electricity-based technologies are less appropriate for household filters in rural areas with limited electricity access; however, when partnered with an appropriate community-scale implementation scheme, electricity-based technologies such as ECAR can be viable and beneficial to rural areas.

 

In Electro Chemical Arsenic Remediation (ECAR), electricity is used to continuously dissolve iron, forming a combination of corrosion products such as ferric hydroxides, oxyhydroxides and oxides (i.e. rust). Together these products form an electrochemically generated adsorbent, or EGA, with a high affinity for arsenic. The EGA is manufactured at the time of use, eliminating the need for a costly supply chain. In addition, the electrochemical process greatly enhances the arsenic removal capacity (i.e. arsenic removed per unit of iron input) of ECAR relative to the chemical addition of ferric salts or metallic iron. This is due to (i) an increase in the rate of rust production (by factors of 10 to 100 over the natural rusting rate of metallic iron), and (ii) the rapid electrochemical oxidation of As(III) to the more favourable As(V), which binds much more readily to iron-based adsorbents. Thus a small amount of electricity allows a large increase in efficiency, lower operating costs and the production of far less arsenic-laden waste than most chemical adsorbents. In addition, the electrodes are self-cleaning if the current is alternated, reducing maintenance and eliminating the need to handle the strong alkalis and corrosive acids required to regenerate activated alumina and other regenerative adsorbents.
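As a rough sketch of the dosing step described above, Faraday's law relates the charge passed through the electrodes to the mass of iron dissolved. The current, electrolysis time, treated volume and the assumption that iron dissolves as Fe(II) are illustrative choices, not ECAR's actual operating parameters.

```python
# Rough sketch of electrochemical iron dosing via Faraday's law.
# Operating values below are hypothetical demonstration figures.

FARADAY = 96485.0       # C per mole of electrons
M_FE = 55.845           # g per mole of iron
N_ELECTRONS = 2         # assuming iron dissolves as Fe(II)

def iron_dose_mg_per_l(current_a: float, time_s: float, volume_l: float) -> float:
    """Iron released into the treated volume, in mg/L, from Faraday's law."""
    moles_fe = (current_a * time_s) / (N_ELECTRONS * FARADAY)
    return moles_fe * M_FE * 1000.0 / volume_l

# Example: 2 A passed for 30 minutes into a 100 L batch.
print(f"{iron_dose_mg_per_l(2.0, 30 * 60, 100.0):.1f} mg/L of iron")
```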

 

ECAR has been successfully lab tested and has undergone a promising first round of field trials. It was found to be highly effective (final arsenic concentrations routinely below 5 μg/L), robust and low-maintenance, and to produce small quantities of sludge that can be successfully stabilized. These qualities, combined with an extremely low operating cost, make ECAR a promising candidate technology for community-scale micro-utilities offering clean water at a locally affordable price. In addition, ECAR does not need an adsorbent to be imported, manufactured or regenerated. This reduces the large upfront capital investment and the need to set up and maintain chemical supply chains or handle hazardous chemicals, making the technology amenable to rapid scale-up.

 

 

Welding of wood by Satyabrata Podder

How do you join two pieces of wood without using glue? More specifically, how do you join two pieces of wood so that they appear to be a single piece straight from nature? The answer lies in the fine craftsmanship of a mechanical engineer who knows the emerging science of “mechanically-induced wood flow welding” well. This promising technology bonds pieces of lumber by pressing them together (at 60 to 330 psi) and rubbing the parts to and fro at very high speed for 4 or 5 seconds. The friction between the pieces heats and melts lignin, one of the primary components of wood, together with the fibres on the exposed surfaces. In the next few seconds the molten lignin of both surfaces intertwines into a matrix and finally solidifies as the rubbing is stopped and the interface cools down, resulting in a full piece of wood of the desired shape and size.
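A back-of-the-envelope estimate of the frictional heat input can be made from the pressure and rubbing time quoted above. The friction coefficient, oscillation frequency and amplitude in this sketch are assumed values for illustration and are not taken from the IBOIS/EPFL studies.

```python
# Back-of-the-envelope sketch of the frictional heat input in wood welding.
# The friction coefficient, frequency and amplitude are assumed values.

PSI_TO_PA = 6894.76

mu = 0.4             # assumed dry friction coefficient, wood on wood
pressure_psi = 200   # within the 60-330 psi range quoted above
frequency_hz = 100   # assumed linear-vibration frequency
amplitude_m = 0.003  # assumed oscillation amplitude (3 mm)
weld_time_s = 4.5    # the 4-5 s rubbing phase

pressure_pa = pressure_psi * PSI_TO_PA
mean_speed = 4 * amplitude_m * frequency_hz   # mean sliding speed [m/s]
heat_flux = mu * pressure_pa * mean_speed     # frictional power per area [W/m^2]
energy_per_area = heat_flux * weld_time_s     # energy input per area [J/m^2]

print(f"Mean sliding speed : {mean_speed:.2f} m/s")
print(f"Heat flux          : {heat_flux / 1e3:.0f} kW/m^2")
print(f"Energy per area    : {energy_per_area / 1e6:.2f} MJ/m^2")
```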

Research at the Laboratory for Timber Construction (IBOIS) of the Ecole Polytechnique Fédérale de Lausanne (EPFL), Switzerland, began applying the principle of friction welding to wood; the University of Tennessee is exploring the same area. This research is mainly focused on the fabrication and application of welded timber panels. Small-scale welded wooden samples have been produced successfully and have passed bending and shear tests with positive results. It should be noted that the structural design of welded timber construction requires a calculation tool for strength prediction before such lumber pieces can be put to practical use. Initially, probabilistic methods were used to determine the load-bearing capacity of welded joints with both parallel and perpendicular natural fibre patterns, and the calculated strengths were found to be in good agreement with the experimentally determined ones. Further research in this area should extend these investigations to more complex systems and to more realistic scenarios closer to full-scale elements.

Now back to basics: let us take a quick tour of the history of welding. The preconceived belief is that welding is possible only for materials that are good conductors of heat and electricity, and that the parent materials should be joined so that the physical and chemical properties of the welded portion remain the same as those of the base material. The finished product should not only look like a single piece but also retain identical chemical qualities. Judged in this light, welding of wood does not seem to have any real-life counterpart, and yet it is possible. As things stand in 2016, this chapter of mechanical engineering has passed the classic tests of what welding is supposed to be. An almost unthinkable phenomenon! Science never fails to surprise us, and always in a very pleasant way.

NUMERICAL SIMULATION OF FLUID FLOW by J. Kumar, Asst. Prof. ME Dept.

 

The central task of the natural sciences is to describe reality as accurately as possible in order to better understand natural phenomena. In the engineering sciences, the purpose of research is to develop new products and optimize existing ones. In the past there were two approaches in science, the experimental and the theoretical; with the invention of the computer a new approach appeared: numerical simulation. Expensive experiments are being replaced by numerical simulations, which are cheaper and faster, and simulation also allows the study of phenomena that cannot be reproduced experimentally (weather, ocean, …).

Mathematical equations that describe the physical world with reasonable accuracy are usually so complex that analytical solutions can no longer be obtained. As noted above, computer simulations increasingly replace expensive experiments and also enable the examination of processes that cannot be tested experimentally. The starting point of any numerical method is the mathematical model: the set of partial differential equations and boundary conditions.

The equations of fluid mechanics, derived about 150 years ago by Navier (1785-1836) and Stokes (1819-1903), are solvable analytically only in special cases. To obtain an approximate solution numerically, we have to use a discretization method, which approximates the differential equations by a system of algebraic equations that can then be solved on a computer. After selecting the mathematical model, one has to choose a suitable discretization method; the most important are the finite difference (FD), finite volume (FV) and finite element (FE) methods. The discrete locations at which the variables are to be calculated are defined by the numerical grid, which is essentially a discrete representation of the geometric domain on which the problem is to be solved. As for accuracy, numerical solutions of fluid flow are only approximate solutions.
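As a minimal illustration of the finite-difference idea, the sketch below discretizes the 1D diffusion equation (a much simpler model than the Navier-Stokes equations) on a uniform grid and marches it forward with an explicit scheme. The grid size, diffusivity and time step are arbitrary demonstration values chosen to satisfy the explicit stability limit.

```python
# Finite-difference (FD) sketch for the 1D diffusion equation u_t = nu * u_xx.
# All parameters are demonstration values, not tied to any specific flow.

nu = 0.1               # diffusivity
nx, L = 41, 2.0        # number of grid points and domain length
dx = L / (nx - 1)
dt = 0.4 * dx**2 / nu  # explicit scheme is stable for dt <= 0.5 * dx^2 / nu
nt = 200               # number of time steps

# Initial condition: a "hat" profile in the middle of the domain.
u = [1.0] * nx
for i in range(nx):
    if 0.5 <= i * dx <= 1.0:
        u[i] = 2.0

for _ in range(nt):
    un = u[:]                      # copy of the previous time level
    for i in range(1, nx - 1):     # interior points; boundary values stay at 1.0
        u[i] = un[i] + nu * dt / dx**2 * (un[i + 1] - 2 * un[i] + un[i - 1])

print(["%.3f" % v for v in u[::5]])  # coarse print of the smoothed profile
```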

In addition to the errors that might be introduced in the development of the solution algorithm, in programming or setting up the boundary conditions, numerical solutions always include three kinds of systematic errors:

• Modeling errors: difference between the actual flow and the exact solution of the mathematical model

• Discretization errors: difference between the exact solution of the conservation equations and the exact solution of the algebraic system of equations obtained by discretizing these equations

• Iteration errors: difference between the iterative and exact solutions of the algebraic equation system.

Numerical simulation is becoming a most important driver for the design and analysis of complex systems. It is now recognized as part of the computer-aided engineering (CAE) spectrum of tools used extensively in all industries, and this approach to modeling fluid flow phenomena gives equipment designers and technical analysts the power of a virtual wind tunnel on their desktop computers.

Controlling Remote Servers and Desktop PCs for IT Admins Should Be More Robust by Debasish Dutta, Computer Science (BCPS)

An IT administrator who can instantly analyse a problem can resolve it quickly once the issue is diagnosed. On the other hand, given user expectations and demand, it is not possible to travel everywhere and give the same level of support in person.

Remote administration, or rather remote management tools, have been around for a long time and have served administrators well; nowadays dozens of such tools are in administrators' hands. They provide remote access and control, support, and management of more complex operations.

From the administrator's viewpoint, however, we need more than the ability to remote into a server or an end user's system. We need more powerful and robust tools that help us diagnose issues before we even connect to the server or the end user's desktop PC. Built-in tools should give us a standard level of troubleshooting and diagnosis for PCs from any corner of the globe.
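As a small example of the kind of pre-connection diagnosis argued for above, the sketch below checks whether each machine in a list answers on its remote-management port before the administrator attempts to remote in. The hostnames and ports are placeholders, not a real inventory.

```python
import socket

# Pre-connection check: before remoting in, verify that each machine answers
# on its remote management port. Hostnames and ports below are placeholders.
HOSTS = {
    "fileserver01.example.local": 3389,  # RDP
    "webserver02.example.local": 5985,   # WinRM over HTTP
}

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for host, port in HOSTS.items():
    status = "reachable" if port_open(host, port) else "NOT reachable"
    print(f"{host}:{port} is {status}")
```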

Microsoft provides a variety of tools for troubleshooting, diagnosis and analysis. Its recent release is RSAT (Remote Server Administration Tools) for Windows 10, which makes it possible to work with next-generation server administration.

These tools include Server Manager and MMC (Microsoft Management Console) snap-ins, and they use PowerShell, the advanced framework for remote administration.

As IT administrators, we have no doubt all worked with Microsoft's built-in tools to manage systems, take control, diagnose problems and resolve issues.

Microsoft releases RSAT separately for each version of Windows Server and Windows client, but the problem is that they do not work in the same manner. It is true that we can use PowerShell, Microsoft's task automation and configuration management framework built around a command-line shell, on the server, but that is different from remote control.

The problem I commonly encounter when remoting into Windows XP is that the user's session gets logged off automatically, which is quite embarrassing, as the person is logged off without any warning. In my view, remote IT support should include advanced features such as the ability to access a system and perform tasks without interrupting the user, to the highest benchmark.

The final word I would like to add is that we need combined, high-end tools so that we can forget about third-party tools. We need to go beyond remote control and troubleshooting and develop more robust management tools for IT administrators.

I think this is the key current issue in systems management.

Free Basics or Free Business Basics? by Aryabhatta Ganguly, Brainware College of Professional Studies (C.S.)

 

Nowadays the most discussed topic in the IT world is Free Basics, or "free net". What is it? Facebook, the most popular and widely used social website, has formed a partnership with six companies, namely Qualcomm, Nokia, Opera Software, Samsung, Ericsson and MediaTek, to provide internet service to less developed countries by facilitating new business models and provisions for internet access.

 

But the project is being criticized for violating net neutrality, as it does not show sites that are not on Facebook's list, including Facebook's rivals.

 

Net neutrality (also network neutrality or internet neutrality) is a term coined by Columbia University law professor Tim Wu in 2003; it means that internet content should be provided to all users by the Internet Service Provider (ISP) without blocking or favouring particular websites or products. This principle is being violated because users under the Internet.org scheme are given a special app through which they can access only the Facebook-listed websites.

 

Facebook is targeting third-world countries for this project and is essentially implementing a zero-rating policy (an offer of free access to popular sites on a particular mobile network).

 

Several major service providers have also signed up to this policy, making special arrangements with the mobile operators of particular countries or regions to offer low-cost data service to users under the scheme. It will be helpful to users in less developed countries: people who do not have internet access can get online at very low cost. For example, a user of Wikipedia Zero gets unlimited access to the online encyclopedia at no cost.

 

On the other hand, Mobile Network Operators (MNOs) typically set low data-volume caps for free internet users and high prices for open internet data. Google has also taken an initiative, named Google Free Zone, under which it makes arrangements with mobile internet providers to waive data charges for access to a list of selected Google products such as Google Search, Gmail and Google+. A number of internet commentators see Google Free Zone as a major competitor of Facebook Zero.

 

In India, Facebook partnered with Reliance Communications in February 2015 to roll out Internet.org, offering a limited portion of internet services comprising a total of 37 websites besides Facebook. India is the fifth country covered by this Facebook mission, after Zambia, Tanzania, Colombia and Ghana.

 

For the time being, Kerala, Tamil Nadu, Maharashtra, Andhra Pradesh and Gujarat have the service.

 

The net neutrality debate has raged around Facebook, and the Telecom Regulatory Authority of India (TRAI) has set out a number of questions on differential data pricing for content services, though it did not specifically raise the issue of net neutrality. TRAI's consultation paper, titled "Consultation Paper on Differential Pricing for Data Services", focuses on zero-rating platforms run by MNOs (Mobile Network Operators). Facebook responded to TRAI stating that Free Basics is open to everybody and offers trusted digital quality. Reliance Communications' commercial launch of Free Basics has been kept in abeyance until all the details are considered by TRAI.

 

The National Association of Software and Service Companies (NASSCOM) stated that the issue of differential pricing needs to be considered on the basis of net neutrality.

 

Now TRAI and Facebook are both fighting over the responses to Free Basics. TRAI claims that only 1.89 million people supported Free Basics, but on January 6 Facebook claimed that 11 million supported its plan.

TRAI promises to take a stand on differential data pricing by the end of January 2016.

 

So Free Basics has fuelled the net neutrality debate, and we will have to see how it ends.

 

Augmented Reality (AR) vs Virtual Reality (VR) by Sudipto Chattopadhyay, B.C.P.S., Dept. of C.S.

Technology is changing at a rapid pace: many things are possible nowadays that were not possible a decade ago, and some of those once-impossible things are rising to the occasion in the form of Augmented Reality (AR) and Virtual Reality (VR).

Back in the 1990s, VR was on everyone's lips as multiple companies tried and failed to make it happen. The Nintendo Virtual Boy was the most notable device, though it failed miserably and was discontinued a year after going on sale. Since then, Nintendo has never attempted to improve on the technology, which could set the company behind its competition as virtual reality slowly creeps back into our lives.

When it comes to AR, we are looking at a technology that has found more success in the consumer space than VR. We have seen several AR applications, along with video games and hardware devices such as Google Glass.

 

Augmented Reality (AR)-

AR is the blending of virtual content and real life: developers can create images within applications that blend in with content in the real world. With the help of AR, users can interact with virtual content in the real world while still being able to distinguish between the two.

 

Virtual Reality (VR)-

VR is all about the creation of a virtual world that users can interact with. This virtual world needs to be designed in such a way that users find it difficult to tell the difference between what is real and what is not. VR is usually achieved by wearing a VR headset or goggles such as the Oculus Rift.

 

Future prospects

People are witnessing the rise of AR hardware from Google in the form of Glass, and there are also plans from Microsoft to launch wearable computing devices. As for VR, the technology is just stepping up to the plate. It is still far from being great for social encounters in a virtual world, such as Second Life or even PlayStation Home, but with the rise of the Oculus Rift it is getting there. Both VR and AR will succeed; however, AR may have more commercial success because it does not completely take people out of the real world.

 

Conclusion-

Although VR and AR have existed in some form for decades, only recently have they garnered mainstream attention. VR is in vogue right now, and its content and hardware advances have been exciting to watch. In a short span of time, content creators have made some mind-blowing advances in storytelling with this new technology. Brands, movie studios, gaming companies and news organizations are all tinkering with this tool and channel. VR will gain ascendancy throughout 2016, but I still think AR is going to become the dominant technology in our daily lives.

VR and AR both tinker with our reality: AR enhances it, while VR diverts us from it.