Saturday, April 4, 2009

Using Tracert

Introduction

Tracert is a Windows command-line tool that you can use to trace the path an Internet Protocol (IP) packet takes from a source to its destination. It determines the path by sending Internet Control Message Protocol (ICMP) Echo Request messages to the destination with incrementally increasing Time to Live (TTL) field values; the replies that come back from each router along the way outline the path taken to that destination address.

Using the following illustration, let’s take a look at how tracert would function in a production network.


How to Use Tracert

As you saw in the last illustration, we will be sending traffic from a test workstation at Site B to a server at another site (Site A). The packets will traverse the wide area network (WAN) that separates the two sites over a T1, with a backup link via Integrated Services Digital Network (ISDN). To use the tracert utility, you simply need to know your destination IP address, how to run the utility correctly, and what to look for within the results.

Tracert works by manipulating the Time to Live (TTL). It sends packets with steadily increasing TTL values; each router decrements the TTL by one as it forwards the packet to the next router, and each router-to-router handoff counts as one hop. When the TTL on a packet reaches zero (0), the router that decremented it sends an ICMP "Time Exceeded" message back to the source computer, which is how each hop is identified. You can see our sample network in the next illustration, with a source and destination IP address; we will be using the workstation at Site B and a server at Site A for our test.

From this illustration you can see that the source IP will be 10.1.2.4 and the destination (for this example) will be 10.1.1.6. The normal route the packets should take would be from Site B to Site A over the higher capacity link, the T1 (1.544 Mbps). The ISDN link is 128 Kbps and is used as a backup if the primary link fails. Once run, tracert will show you that the packets start from the PC at Site B (10.1.2.4) and then traverse the T1 to 10.1.1.1. That router knows how to send the packets to its local LAN (10.1.1.0) and ultimately to 10.1.1.6.
As the packets are sent, tracert reports each hop using the first router interface it sees, so let's take a look at our complete path before we send the test packets.

The Tracert Test
Now, to use tracert, you simply need to open a command prompt. To do this, go to
Start => Run => CMD => tracert
(note – you must type tracert; the traceroute spelling is only used on UNIX/Linux and other systems such as Cisco devices)
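
For example, a clean trace from the Site B workstation to the Site A server over the T1 might look something like the following; the hop addresses match the sample network above, and the response times are purely illustrative:

C:\>tracert 10.1.1.6

Tracing route to 10.1.1.6 over a maximum of 30 hops

  1    <1 ms    <1 ms    <1 ms  10.1.2.1
  2     9 ms     8 ms     9 ms  10.1.1.1
  3     9 ms     9 ms     9 ms  10.1.1.6

Trace complete.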


Using Tracert Options

To use tracert effectively, be aware of a few options you can use with it. The most helpful is the first one, –d, which turns off DNS resolution. Name resolution is handy, but if no name server is available, if DNS is set up incorrectly, or if you simply want the IP address of each hop, use the –d option.
-d Prevents tracert from attempting to resolve the IP addresses of intermediate routers to their names. This can speed up the display of tracert results.

-h Specifies the maximum number of hops in the path to search for the target (destination). The default is 30 hops.

-j You can use this with a host list (HostList). Specifies that Echo Request messages use the Loose Source Route option in the IP header with the set of intermediate destinations specified in HostList. With loose source routing, successive intermediate destinations can be separated by one or multiple routers. The maximum number of addresses or names in the host list is 9. The HostList is a series of IP addresses (in dotted decimal notation) separated by spaces.

-w Specifies the amount of time in milliseconds to wait for the ICMP Time Exceeded or Echo Reply message corresponding to a given Echo Request message to be received. If not received within the time-out, an asterisk (*) is displayed. The default time-out is 4000 (4 seconds).

-? Displays help at the command prompt.

tracert [-d] [-h MaximumHops] [-j HostList] [-w Timeout] [TargetName]
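
For example, to trace to the Site A server with no DNS lookups, a 15-hop limit and a one-second per-hop timeout, you would run:

tracert -d -h 15 -w 1000 10.1.1.6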

How to Use Tracert to Troubleshoot

There may be times when the output you get isn't so clear to you. For example, what if you get an asterisk? As mentioned in the last section, an asterisk simply means no reply arrived within the time-out, and it can be a false positive: the ICMP packet may be getting through, but something is stopping the reply from coming back, most likely a firewall rule or access control list.

You can use tracert to find out where a packet stopped on the network. In the following example, the default gateway has found that there is no valid path to the host. This would mean that both links are down – the T1 and the ISDN – and there is no route to the destination.

C:\>tracert 10.1.1.6

Tracing route to 10.1.1.6 over a maximum of 30 hops
-----------------------------------------------------
  1  10.1.2.1  reports: Destination net unreachable.

Trace complete.

From this example, you can see that when you sent the tracert test to 10.1.1.6, the LAN default gateway reported that it could not find a path; seeing this in graphical format may help you understand it better.


As just mentioned, since there is no path, the closest router to the source informs the source that there is no path.

Important Notes

Here are some important notes that I have compiled to help you learn more about tracert.
Tracert also doesn't help you analyze latency or packet loss. To trace a path and report network latency and packet loss for each router and link in the path, use the pathping command. Visit my author section on this site to learn about pathping.
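
If you just want to try it, pathping takes the same kind of target as tracert; against our sample server that would simply be:

pathping 10.1.1.6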

Tracert is available only if the Internet Protocol (TCP/IP) is installed as a component in the properties of a network adapter in Network Connections. It is a TCP/IP utility that uses ICMP, a protocol within the TCP/IP suite.

On modern Linux distros the traceroute utility (not tracert, although some Linux systems let you use tracert too!) sends UDP datagrams with destination ports starting at 33434. Windows uses ICMP Echo Request (type 8) packets, better known as ping packets.
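
If you find yourself on a Linux box instead, the equivalent test against our sample server (with -n to skip DNS lookups, much like tracert's -d) would look something like this:

traceroute -n 10.1.1.6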

Read RFC 792 for more information about ICMP and its internals.


Summary

In this article we covered the basics of tracert. Tracert (the Windows counterpart of traceroute) is a tool that helps you test your network infrastructure. We looked at how to use tracert while troubleshooting real world problems such as multiple paths or downed links, which should reinforce the tool's usefulness and show you ways to use it when working on your own networks. This TCP/IP utility allows you to determine the route packets take through a network to reach a particular host that you specify. Tracert works by increasing the "time to live" (TTL) value of each successive packet sent. When a packet passes through a host, the host decrements the TTL value by one and forwards the packet to the next host. When a packet with a TTL of one reaches a host, the host discards the packet and sends back an ICMP Time Exceeded message. Tracert, if used properly, can help you find points in your network that are either routed incorrectly or do not exist at all. Tracert (and traceroute) is a tool you must master if you plan on working on networks; together with ping and pathping it can help you map and troubleshoot your network with ease. Stay tuned for more!



Tuesday, March 24, 2009

Photo Highlight

Enjoying the view : Indian couples sit on a sea wall as they enjoy the last rays of sun in Mumbai

A young kangaroo (Macropus fuliginosus) investigates the world from its mother's pouch in the zoo in Basel, Switzerland.


A model presents a creation by French fashion designer Jean-Charles de Castelbajac for his Fall-Winter 2009-2010 ready-to-wear collection, Tuesday, March 10, 2009 in Paris. The face of late American artist Andy Warhol is seen printed on the fabric.


Gerbera flowers are seen in a field for export at a plantation in Llano Grande, 37 miles (60 km) east of San Jose, March 10, 2009. Flower producers have seen their exports to the United States and Canada reduced by 50 percent in the current global economic crisis, according to a producers' cooperative. REUTERS/Juan Carlos Ulate.


This photo released by the Las Vegas News Bureau, shows an Owl butterfly hanging on a plant at the Bellagio Las Vegas Conservatory & Botanical Gardens.

Signs of spring : Daffodil flowers are pictured in front of London's Big Ben.




One-way street : Goldfish swim in an aquarium at a market in the north of Tehran.

MAGENN AIR ROTOR SYSTEM (M.A.R.S.): New Technology to Produce Power

Magenn Power's MARS is a Wind Power Anywhere™ solution with distinct advantages over existing conventional wind turbines and diesel generating systems, including global deployment, lower costs, better operational performance, and greater environmental advantages. MARS is a lighter-than-air tethered wind turbine that rotates about a horizontal axis in response to wind, generating electrical energy. This electrical energy is transferred down the 1000-foot tether for immediate use, to a set of batteries for later use, or to the power grid. Helium sustains MARS and allows it to ascend to a higher altitude than traditional wind turbines. MARS captures the energy available in the 600 to 1000-foot low-level and nocturnal jet streams that exist almost everywhere. MARS rotation also generates the "Magnus effect," which provides additional lift, keeps MARS stabilized, and positions it within a very controlled and restricted location to adhere to FAA (Federal Aviation Administration) and Transport Canada guidelines.

The Advantages of MARS over Conventional Wind Turbines: Wind Power Anywhere™ removes all placement limitations. Coast-line or off-shore locations are not necessary to capture higher speed winds. Reaching winds at 1,000 feet above ground level allows MARS to be installed closer to the grid. MARS is mobile and can be rapidly deployed, deflated, and redeployed without the need for towers or heavy cranes. MARS is bird and bat friendly, with lower noise emissions, and is capable of operating in a wider range of wind speeds - from 4 mph to greater than 60 mph.

The Advantages of a MARS combined Wind and Diesel Solution over a Diesel Generator-only solution: MARS can complement a diesel generator by offering a combined diesel-wind power solution that delivers power below 20 cents per kWh. This compares to a wide range of 25 cents to 99 cents per kWh for diesel alone, reflecting the high fuel and transportation costs in remote areas. The MARS combined solution allows lower pollution and greenhouse gas emissions. It also results in lower handling, transporting, and storage costs.

MARS Target Markets: developing nations where infrastructure is limited or nonexistent; off-grid combined wind and diesel solutions for island nations, farms, remote areas, cell towers, exploration equipment, backup power and water pumps for natural gas mines; rapid-deployment diesel and wind solutions (including airdrop) to disaster areas for power to emergency and medical equipment and water pumps; on-grid applications for farms, factories, and remote communities; and wind farm deployments.





The Magenn Power Air Rotor System (MARS) is an innovative lighter-than-air tethered device that rotates about a horizontal axis in response to wind, efficiently generating clean renewable electrical energy at a lower cost than all competing systems. This electrical energy is transferred down the tether to a transformer at a ground station and then transferred to the electricity power grid. Helium (an inert non-reactive lighter than air gas) sustains the Air Rotor which ascends to an altitude for best winds and its rotation also causes the Magnus effect. This provides additional lift, keeps the device stabilized, keeps it positioned within a very controlled and restricted location, and causes it to pull up overhead rather than drift downwind on its tether.

All competing wind generators use bladed two-dimensional disk-like structures and rigid towers. The Magenn Power Air Rotor system is a closed three-dimensional structure (cylinder). It offers high torque, low starting speeds, and superior overall efficiency thanks to its ability to deploy higher. The closed structure allows Magenn Power to produce wind rotors from very small to very large sizes at a fraction of the cost of current wind generators.


The distinct advantages of the Magenn Air Rotor System design are as follows:

• The Magenn Air Rotor System is less expensive per unit of actual electrical energy output than competing wind power systems.
• The Magenn Power Air Rotor System will deliver time-averaged output much closer to its rated capacity than the capacity factor typical of conventional designs. Magenn efficiency will be 40 to 50 percent. This is hugely important, since doubling capacity factor cuts the cost of each delivered watt in half.
• Wind farms can be placed closer to demand centers, reducing transmission line costs and transmission line losses.
• Conventional wind generators are only operable in wind speeds between 3 meters/sec and 28 meters/sec. Magenn Air Rotors are operable between 1 meter/sec and in excess of 28 meters/sec.
• Magenn Air Rotors can be raised to higher altitudes, thus capitalizing on higher winds aloft. Altitudes from 400 ft to 1,000 ft above ground level are possible, without having to build an expensive tower or use a crane to perform maintenance.
• Magenn Air Rotors are mobile and can be easily moved to different locations to correspond to changing wind patterns. Mobility is also useful in emergency deployment and disaster relief situations.

These points are mutually inclusive: the advantages above combine to make Magenn the most cost-effective wind electrical generation system.

ATG Dynamo Articles - J2EE / JSP in Dynamo

Part 1 : Creating a new J2EE app in ATG Dynamo 5.6.1

Overview

These instructions list the steps to create and deploy a new J2EE application using ATG Dynamo 5.6.1.

This is the first step in converting the Dating application from JHTML to JSP. Even if you haven't built the Dating application, these steps still provide everything you need to create a new J2EE app and then go on to convert an existing JHTML app to JSP and J2EE.

Instructions

Start the Solid database
Start Dynamo (WITHOUT the Dating module, if you've already built it)
bin\startDynamo -m DSS (just with DSS)
Start the ACC
Tools->J2EE Deployment
File-> create new J2EE application

ATG Dynamo Articles - Where do I Get an AccessDeniedListener From?

Overview

AccessDeniedListener is an interface which you must implement to track when a user is denied access to a protected resource. It's simple to write and set up.

Source Code
import atg.userprofiling.AccessDeniedListener;
import atg.userprofiling.AccessDeniedEvent;

public class DeniedListener
extends atg.nucleus.GenericService
implements AccessDeniedListener
{

public DeniedListener()
{
}
public void accessDenied(AccessDeniedEvent evt)
{
if(isLoggingInfo())
{
logInfo("Access Denied: " + evt.getURL());
}
}
}
Configuration
Set up a global Component based on your DeniedListener, in this example we've called it

/training/sf/security/AccessDeniedListener
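A minimal properties file for that component might look like the following sketch (this assumes the DeniedListener class above lives in the default package; adjust $class if you give it one):

/training/sf/security/AccessDeniedListener.properties
$class=DeniedListener
$scope=global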
Set up any old access Controller, in this example we've used

/training/sf/security/MemberAccessController
$scope=global
allowGroups=members
deniedAccessURL=/
groupRegistry=/atg/registry/RepositoryGroups

Add your listener and the access controller to the /atg/userprofiling/AccessControlServlet configuration

accessControllers=/members\=/training/sf/security/MemberAccessController
deniedAccessURL=/
accessDeniedListeners+=/training/sf/security/AccessDeniedListener

Testing the Listener
Don't log in as a member!
Point your browser to http://localhost:8840/members.
You should be redirected to the doc root; check the log for an info message from your listener

Monday, March 23, 2009

ATG Dynamo Articles - Personalization (DPS)


Example of a Droplet Using Query Builder to Find Users by Name

Note: You'd actually use RQLQueryForEach to do this job. This is just an example of the technology!

This sample implements a droplet which takes a search term (match), finds users whose login, firstName or lastName property matches the term and then iterates over them similarly to a ForEach droplet.

Components of this type could be configured at global scope, since the droplet has no state to be set by the page designer. The repository and itemDescriptorName properties are configured on the component, while limit, start, empty, finish, match and person are supplied via page parameters.
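
Assuming the class above and DPS's standard profile repository, a global component for the droplet might be configured along these lines (the component path is just an illustration):

/training/sf/droplets/PeopleFinder.properties
$class=PeopleFinder
$scope=global
repository=/atg/userprofiling/ProfileAdapterRepository
itemDescriptorName=user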


Source Code - PeopleFinder.java
import atg.repository.*;
import javax.servlet.*; // ServletException
import atg.servlet.*; // DynamoServlet
import java.io.*; // IOException
public class PeopleFinder extends DynamoServlet
{
Repository mRepository = null;
public Repository getRepository()
{
return mRepository;
}
public void setRepository(Repository r)
{
mRepository = r;
}
String mstrItemDescriptorName = "user";
public void setItemDescriptorName(String s)
{
mstrItemDescriptorName = s;
}
public String getItemDescriptorName()
{
return mstrItemDescriptorName;
}
void show(RepositoryItem[] ris,
int nMaxLimit,
DynamoHttpServletRequest req,
DynamoHttpServletResponse res)
{
try
{
int nMax = ris.length;
if((nMaxLimit > 0)&&(ris.length > nMaxLimit))
{
nMax = nMaxLimit;
}
req.setParameter("count", new Integer(ris.length));
req.setParameter("limit", new Integer(nMaxLimit));
if(nMax > 0)
{
req.serviceLocalParameter("start",req,res);
}
for(int i=0; i < nMax; i++)
{
// set the output params described in the BeanInfo below,
// then render the "person" oparam once for this match
req.setParameter("index", new Integer(i));
req.setParameter("element", ris[i]);
req.serviceLocalParameter("person", req, res);
}
if(nMax > 0)
{
req.serviceLocalParameter("finish",req,res);
}
}
catch(Exception ex)
{
logError(ex);
}
}
public void service(DynamoHttpServletRequest req,
DynamoHttpServletResponse res)
throws IOException, ServletException
{
try
{
String strMatch = req.getParameter("match");
String strMaxLimit = req.getParameter("limit");
Integer nMaxLimit = new Integer((strMaxLimit != null) ? strMaxLimit : "0");
if(strMatch != null)
{
if(getRepository() != null)
{
if(getItemDescriptorName() != null)
{
RepositoryView view = getRepository().getView(getItemDescriptorName());
if(view != null)
{
// Get the builder
QueryBuilder userBuilder = view.getQueryBuilder();
// Expressions
QueryExpression
firstName = userBuilder.createPropertyQueryExpression("firstName");
QueryExpression
lastName = userBuilder.createPropertyQueryExpression("lastName");
QueryExpression
login = userBuilder.createPropertyQueryExpression("login");
QueryExpression
match = userBuilder.createConstantQueryExpression(strMatch);
// Build queries for OR operation
Query [] queries = new Query[3];
queries[0] = userBuilder.createPatternMatchQuery(firstName, match,
QueryBuilder.CONTAINS);
queries[1] = userBuilder.createPatternMatchQuery(login, match,
QueryBuilder.CONTAINS);
queries[2] = userBuilder.createPatternMatchQuery(lastName, match,
QueryBuilder.CONTAINS);
// Create the main query
Query query = userBuilder.createOrQuery(queries);
// finally, execute the query and get the results
//
RepositoryItem[] people = view.executeQuery(query);
if(people == null)
{
req.serviceLocalParameter("empty",req,res);
}
else
{
show(people, nMaxLimit.intValue(), req, res);
}
}
else
{
throw new ServletException("No view found called for '"
+ getItemDescriptorName() + "'");
}
}
else
{
throw new ServletException("'itemDescriptorName' property not configured");
}
}
else
{
throw new ServletException("'repository' property not configured");
}
}
else
{
throw new ServletException("'match' is a required parameter for this droplet");
}
}
catch(Exception e)
{
logError(e);
}
}
public PeopleFinder()
{
}
}
Droplet Bean Info Source Code - PeopleFinderBeanInfo.java
import java.beans.*;
import atg.servlet.DynamoServlet;
import atg.droplet.DropletBeanInfo;
import atg.droplet.ParamDescriptor;
public class PeopleFinderBeanInfo extends DropletBeanInfo
{
//-------------------------------------
// CONSTANTS
//-------------------------------------
public static final String CLASS_VERSION = "0.9";
//-------------------------------------
// FIELDS
//-------------------------------------
private final static ParamDescriptor[] sWrapperDescriptors = {
new ParamDescriptor("count", "The number of people in the result list",
Integer.class, false, true)
};
private final static ParamDescriptor[] sOutputDescriptors = {
new ParamDescriptor("element",
"The repository item for this person matching the search term",
atg.repository.RepositoryItem.class, false, true),
new ParamDescriptor("index", "The index of this person in the result list",
Integer.class, false, true)
};
private final static ParamDescriptor[] sParamDescriptors = {
new ParamDescriptor("limit",
"Specify > 0 to limit the maximum number of people returned",
Integer.class, false, true),
new ParamDescriptor("match", "Search Term for First and Last Name",
String.class, false, true),
new ParamDescriptor("person", "Rendered once between local and stack parameters",
DynamoServlet.class, false, true, sOutputDescriptors),
new ParamDescriptor("start", "Rendered before found matches",
DynamoServlet.class, false, true, sWrapperDescriptors),
new ParamDescriptor("finish", "Rendered after found matches",
DynamoServlet.class, false, true, sWrapperDescriptors),
new ParamDescriptor("empty", "Rendered when no matches are found",
DynamoServlet.class, false, true),
};
private final static BeanDescriptor sBeanDescriptor =
createBeanDescriptor(PeopleFinder.class,
null,
"This servlet iterates over the matching people and"
+ " calls person once for each match, setting element to"
+ " the repository item for that person. If no matches"
+ " are found the empty oparam is rendered",
sParamDescriptors,
"YourDropletCategoryName");
//-------------------------------------
// METHODS
//-------------------------------------
//-------------------------------------
/**
* Returns the BeanDescriptor for this bean, which will in turn
* contain ParamDescriptors for the droplet.
**/
public BeanDescriptor getBeanDescriptor() {
return sBeanDescriptor;
}
//----------------------------------------
}
JHTML Sample

[The People Finder JHTML sample page markup did not survive rendering; it consisted of a "People Finder" page with a "Match Term" input form, a "No Matches Found" message for the empty oparam, and a bulleted entry per matching person.]






    What's the Deal with java tags in JSP and JHTML Pages?

    Note: The information in this article is for Dynamo 5.1 only

    Example Code

    Let's look at an example. The server code below uses both <java></java> and <% %> tags; we can save the same page with both a .jsp and a .jhtml extension and get different results

    Java Server Page
    iJavaTag++;

    As a JHTML page, you can effectively use the JSP tags in a JHTML page for Java code; the result of firing up our java_server_page.jhtml page is as follows:

    Java Server Page

    Test JSP tag: 1

    Test JSP < % % > tag: 1
    We see that both the <% %> and <java></java> delimited code is part of the page servlet. As a JSP page you cannot effectively use the JHTML tags in a JSP page for Java code; the result of firing up our java_server_page.jsp page is as follows:

    Java Server Page

    iJavaTag++;

    Test JSP tag: 0

    Test JSP < % % > tag: 1
    We see that the <java></java> tags are not processed on the server and are passed down untouched to the client. This is because these tags are not part of the JSP specification.

    Summary

    Files ending in .JHTML are processed with the JHTML tags enabled as a superset of the JSP standard.

    This means:
    You can save JSP pages as .jhtml files and they will work fine
    JHTML tags are not processed on the server for .jsp files

    Using Bean Info for Custom Droplets

    Overview

    Writing a BeanInfo class for a custom droplet has several advantages for your droplet's users:
    Components of your droplet will appear in the by-module view in the DCC browser
    The DCC will display the input, output and oparam parameters for your users in the JHTML editor
    The DCC will enforce type checking for your parameters
    The custom description from your droplet will appear in the component browser

    Example 1 : TimeOfDay

    TimeOfDay is a droplet which renders one of three parameters based on the current time: morning, afternoon or evening. The code for TimeOfDay.java is given below. A BeanInfo file ALWAYS has the same name as the bean file, with the addition of 'BeanInfo'. So in our case the bean info java file is TimeOfDayBeanInfo.java.
    This example has an optional calendar parameter and the three oparams. There are no output parameters.
    Notice that the calendar parameter is typed to java.util.Calendar while the oparams are defined by atg.servlet.DynamoServlet.

    TimeOfDay.java
    package imagescript.atg.droplets;
    import javax.servlet.*; // ServletException
    import atg.servlet.*; // DynamoServlet
    import java.io.*; // IOException
    import java.util.Date;
    import java.util.GregorianCalendar;
    import java.util.Calendar;
    public class TimeOfDay extends DynamoServlet
    {
    public TimeOfDay() {}
    public void service(DynamoHttpServletRequest req,
    DynamoHttpServletResponse res)
    throws IOException, ServletException
    {
    Calendar c = null;
    try
    {
    c = (Calendar)req.getObjectParameter("calendar");
    }
    catch(ClassCastException e)
    {
    throw new ServletException(e);
    }
    if(c == null)
    {
    c = new GregorianCalendar();
    }
    int nHour = c.get(Calendar.HOUR_OF_DAY);
    if(nHour < 12)
    {
    req.serviceLocalParameter("Morning", req, res);
    }
    else if(nHour < 18)
    {
    req.serviceLocalParameter("Afternoon", req, res);
    }
    else
    {
    req.serviceLocalParameter("Evening", req, res);
    }
    }
    }
    TimeOfDayBeanInfo.java
    package imagescript.atg.droplets;
    import java.beans.*;
    import atg.servlet.DynamoServlet;
    import atg.droplet.DropletBeanInfo;
    import atg.droplet.ParamDescriptor;
    public class TimeOfDayBeanInfo extends DropletBeanInfo
    {
    private final static ParamDescriptor[] sParamDescriptors =
    {
    new ParamDescriptor("calendar", "Optional: calendar to base the choice on",
    java.util.Calendar.class, true, true),
    new ParamDescriptor("Morning", "Rendered before noon",
    DynamoServlet.class, false, true),
    new ParamDescriptor("Afternoon", "Rendered after noon but before 6pm",
    DynamoServlet.class, false, true),
    new ParamDescriptor("Evening", "Rendered after 6pm",
    DynamoServlet.class, false, true)
    };
    private final static BeanDescriptor sBeanDescriptor =
    createBeanDescriptor(TimeOfDay.class, null,
    "This servlet renders one of the three "
    +"oparams depending on the time of day",
    sParamDescriptors,
    "ATGSFDroplets");
    public BeanDescriptor getBeanDescriptor() {
    return sBeanDescriptor;
    }
    }
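
    To use the droplet you just register a component for it; a properties file like this sketch (the component path is chosen arbitrarily for illustration) is all that's needed:

    /imagescript/droplets/TimeOfDay.properties
    $class=imagescript.atg.droplets.TimeOfDay
    $scope=global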

    Example 2 : output parameters

    To define output parameters for an oparam, pass an array of ParamDescriptors representing the output parameters into the ParamDescriptor constructor for that oparam.
    The BannerMenu class implements a simple menu which is configured via the DCC. Some of the required code is missing from this page for readability, but the essence of the droplet can be gleaned from the source below. For each menu item an oparam is rendered and the href and label output parameters are set.

    In the example below, the array sOutputDescriptors defines the href and label output parameters which are available in the currentItem and item oparams. This relationship is expressed by supplying the sOutputDescriptors array to the ParamDescriptor constructor for each of the oparams. You can create different arrays for individual oparams in the same manner.

    BannerMenu.java
    package imagescript.atg.droplets;
    import atg.servlet.DynamoServlet;
    import atg.servlet.*;
    import atg.nucleus.naming.ParameterName;
    import javax.servlet.*;
    import java.io.*;
    public class BannerMenu extends DynamoServlet
    {
    // input
    public final static ParameterName URL
    = ParameterName.getParameterName("currentPage");
    public final static ParameterName TRANSIENT
    = ParameterName.getParameterName("isTransient");
    // output
    public final static ParameterName HREF
    = ParameterName.getParameterName("href");
    public final static ParameterName LABEL
    = ParameterName.getParameterName("label");
    // Oparams
    public final static ParameterName ITEM
    = ParameterName.getParameterName("item");
    public final static ParameterName CURRENTITEM
    = ParameterName.getParameterName("currentItem");
    public void service(DynamoHttpServletRequest pReq,
    DynamoHttpServletResponse pRes)
    throws ServletException, IOException
    {
    try
    {
    if((mItems != null) && (mItems.length > 0))
    {
    String strURL = pReq.getParameter(URL);
    if(strURL != null)
    {
    int i = strURL.lastIndexOf("/");
    strURL = strURL.substring(i);
    }
    if(isLoggingDebug())
    {
    logDebug("Base URL:" + strURL);
    }
    String s = pReq.getParameter(TRANSIENT);
    if(s == null)
    {
    throw new ServletException(
    "TRANSIENT must be specified (Profile.transient)");
    }
    boolean bMember = !(new Boolean(s)).booleanValue();
    if(isLoggingDebug())
    {
    logDebug("Member?:" + bMember);
    }
    for(int i=0; i < mItems.length; i++)
    {
    boolean bRender = false;
    if(bMember && mItems[i].getForMembers().booleanValue())
    {
    bRender = true;
    }
    else if(!bMember && mItems[i].getForNonMembers().booleanValue())
    {
    bRender = true;
    }
    if(bRender)
    {
    pReq.setParameter(HREF.getName(), new String(mItems[i].getURL()));
    pReq.setParameter(LABEL.getName(), new String(mItems[i].getLabel()));
    if((strURL != null) && strURL.endsWith(mItems[i].getURL()))
    {
    pReq.serviceLocalParameter(CURRENTITEM,pReq,pRes);
    }
    else
    {
    pReq.serviceLocalParameter(ITEM,pReq,pRes);
    }
    }
    else
    {
    if(isLoggingDebug())
    {
    logDebug("Skipping " + mItems[i].getLabel());
    }
    }
    }
    }
    else
    {
    if(isLoggingInfo())
    {
    logInfo("No items in BannerMenu");
    }
    }
    }
    catch (Exception e)
    {
    if(isLoggingError())
    {
    logError(e);
    }
    }
    }
    BannerMenuItem[] mItems;
    public void setItems(BannerMenuItem[] p) { mItems = p; }
    public BannerMenuItem[] getItems(){ return mItems; }
    public BannerMenu() {
    }
    }
    BannerMenuBeanInfo.java
    package imagescript.atg.droplets;
    import java.beans.*;
    import atg.servlet.DynamoServlet;
    import atg.droplet.DropletBeanInfo;
    import atg.droplet.ParamDescriptor;
    public class BannerMenuBeanInfo extends DropletBeanInfo
    {
    private final static ParamDescriptor[] sOutputDescriptors =
    {
    new ParamDescriptor("href", "The URL link for the menu item",
    String.class, false, false),
    new ParamDescriptor("label",
    "The text associated with the menu item "
    +"(label or image link it's your choice)",
    String.class, false, false)
    };
    private final static ParamDescriptor[] sParamDescriptors =
    {
    new ParamDescriptor("currentPage",
    "The current page URL; usually from `request.getRequestURI()`",
    String.class, false, true),
    new ParamDescriptor("isTransient",
    "The isTransient value from the Profile; bean:Profile.transient",
    Boolean.class, false, true),
    new ParamDescriptor("item", "Rendered for a menu item",
    DynamoServlet.class, false, true, sOutputDescriptors),
    new ParamDescriptor("currentItem",
    "Rendered for a menu item matching "
    +"the supplied currentPage param",
    DynamoServlet.class, false, true, sOutputDescriptors)
    };
    private final static BeanDescriptor sBeanDescriptor =
    createBeanDescriptor(BannerMenu.class,
    null,
    "This servlet renders menu items with a special case "
    +"for the menu item of the current page",
    sParamDescriptors,
    "ATGSFDroplets");
    public BeanDescriptor getBeanDescriptor() {
    return sBeanDescriptor;
    }
    }
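
    BannerMenuItem itself isn't shown above, but assuming it is a simple bean with URL, label, forMembers and forNonMembers properties (matching the getters the droplet calls), the menu could be wired up with Nucleus properties files along these lines; all paths and values are purely illustrative:

    /training/menus/HomeItem.properties
    $class=imagescript.atg.droplets.BannerMenuItem
    URL=index.jhtml
    label=Home
    forMembers=true
    forNonMembers=true

    /training/menus/BannerMenu.properties
    $class=imagescript.atg.droplets.BannerMenu
    $scope=global
    items+=/training/menus/HomeItem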

    Setting up a Dynamo Module

    Here's how to get your module to appear in the Dynamo DCC. We'll use the name MyModule in this example.

    Create the module folders

    Create a sub folder for your module in your Dynamo install directory; on my machine that would be C:\ATG\Dynamo5.1.
    So our new folder is C:\ATG\Dynamo5.1\MyModule. This folder will contain all the files for your module

    Create the following sub folders

    classes - your java classes will go here
    config - your global module configuration files/layer will go here
    doc - your web pages will go here
    localconfig - I think this is for machine specific config files/layer
    META-INF - your module manifest will go here
    bin - your OS specific configuration files will go here
    Create a MANIFEST.MF file
    Create a file called MANIFEST.MF in the META-INF directory
    ATG-Product: MyModule
    Manifest-Version: 1.0
    ATG-Config-Path: config/
    ATG-Class-Path: classes/
    ATG-Required: DAS
    This has many effects.
    ATG-Config-Path: config/
    This appends the specified directory to the CONFIGPATH at startup, enabling your modules customizations
    ATG-Class-Path: classes/
    This prepends the classes sub folder to the CLASSPATH, enabling the Java classes that are placed there
    ATG-Required: DAS
    This forces the DAS module to be loaded before your module. This is your dependency list; for example, if your module relies on DPS as well you would specify..
    ATG-Required: DAS DPS
    Similarly, ATG-Required: DAS DPS DSS DCS CSR Fulfillment would force all these modules to load before MyModule
    Your module is now set up; we just need to tell Dynamo to use it.
    Starting Dynamo with your Module
    The module list to load is specified by using the -m flag on the startDynamo command line.
    To set this up, create a copy of your standard shortcut to Dynamo, then modify the new shortcut by adding MyModule to the end of the -m portion. Your link might look like this:
    C:\WINNT\System32\cmd.exe /k C:\ATG\Dynamo5.1\home\bin\startDynamo.bat -m MyModule
    This tells Dynamo to look for your module at start up.
    Optional: Displaying Web Content for your Module
    Dynamo does not support multiple document roots; any content to be displayed must be positioned somewhere under the system's doc root. This leaves you with two options:
    Copy your content into the document root hierarchy
    Set the system doc root to your own doc directory
    The first of these two doesn't really need to be explained here, so we'll deal with how to set Dynamo's doc root to your doc directory when your module is active. This is useful for sample applications and allows you to easily test and demonstrate the functionality of your module without polluting the shared file system with sample files.
    To get the web server to display our module's content we need to do two things:
    In your doc directory create an index.html file - some kind of hello world
    In your config\atg folder create a new folder called dynamo; inside it create a file called
    Configuration.properties with this line in it:
    documentRoot=D:\\ATG\\Dynamo5.1\\MyModule\\doc
    This configuration layer will tell the web server to use your doc directory as the website root. This will only work for you if MyModule is the last configuration layer which specifies documentRoot.
    Optional: Making your configuration layer the default for updates
    When you modify the configuration in the DCC the changes are written back to the properties files in the default update layer. You can make your layer the default by specifying the following line in the CONFIG.properties file in your config folder.
    defaultForUpdates=true
    Optional: Registering your Module with the System
    This step is not required for packaging purposes but does allow you to use some advanced module features
    Create or modify the CONFIG.properties file in your config directory, adding the following 2 lines followed by an empty line (do not delete the defaultForUpdates property if you added it in the previous step)
    name=MyModule
    module=MyModule
    In your config directory create a sub folder 'atg' then inside that another subfolder 'modules'. In that directory (e.g. C:\ATG\Dynamo5.1\MyModule\config\atg\modules)...
    Create a file MyModule.properties with the following, followed by an empty line:
    $class=atg.service.modules.Module
    moduleName=MyModule
    (This creates an component MyModule of class atg.service.modules.Module with the moduleName parameter set to MyModule)
    Create a file ModuleManager.properties with the following, followed by an empty line:
    modules+=MyModule
    This adds the MyModule component to the ModuleManager modules list and presents your module in the DCC module list
    Optional: Environment setup files
    If you need to set some environment variables every time your module is started you can do this via the manifest file.
    Add this line to your MANIFEST.MF file:
    ATG-Config-Scripts: bin/setup
    This causes the file bin/setup.bat to be executed at startup, which can be very handy.
    So you'll also need to add that file to the bin directory, including your environment setup
    rem Add your start time configuration settings here

    Getting ready to deliver super-fast broadband

    It is almost nine months since British Telecom set out its plan to bring super-fast broadband to up to 10 million homes and businesses in the UK by 2012.

    This is a bold plan for the future. BT intends to invest £1.5bn in the next few years – a huge sum of money, especially in today's economic environment. BT needs to make a good commercial return on that investment, which is why it is working closely with industry and with local and regional authorities to make sure that we take super-fast broadband to areas where there is genuine customer demand.

    So, what’s the latest news?

    Today, Openreach has announced the areas where, from January 2010, up to 500,000 customers will have access to fibre-based, super-fast broadband via our network. This fibre to the cabinet (FTTC)* technology will be deployed at 29 exchanges including parts of Belfast, Cardiff, Edinburgh, Glasgow, London and Greater Manchester; also at rural exchanges near Halifax and Cardiff. Areas serving a further million homes and businesses will be announced in the Autumn.

    Whilst Openreach will deploy the technology, it is up to the UK's communications providers to develop the super-fast, innovative broadband services for their end-users. The same is true for our downstream businesses, like BT Retail, as super-fast broadband is an outstanding opportunity to retain existing customers as well as to win back customers from our competitors. Our upstream speeds of up to 10Mb/s will be the fastest in the country, enabling all sorts of new and exciting applications and services to be developed.

    Other things to get ready:

    Ofcom has recently said that it will play its part in making sure that regulation does not stand in the way of companies that are prepared to invest in creating the UK's broadband future. It will allow pricing flexibility at the retail and wholesale levels to enable returns appropriate to the considerable risks of building new networks. This is positive news, though there is still more to come on the regulatory front. BT is counting down to July, when we run FTTC pilots in both Muswell Hill, London and Whitchurch, Cardiff. Five communications providers are taking part in these pilots, with up to 15,000 customer premises involved in each area.

    Many parts of the company are involved in making sure that the technical and operational delivery of super-fast broadband is flawless, and there has been some great team work in bringing this together. A premium fibre service has to come with a superb customer experience, so we are building this in from the start, working through all the various touch points a customer would have so that the service works right first time.

    The UK already leads the world in terms of broadband access and penetration. Our plans for super-fast broadband will help the UK climb the broadband speed tables as well.

    ATG Dynamo


    In the late 90's, ATG Dynamo was a pioneering platform that helped organizations build robust, scalable and innovative Web applications, long before 'J2EE' or 'application server' were mainstream terms. What made Dynamo special is that it solved many of the common problems developers faced in building Web applications. Even today, the J2EE specification doesn't fully address some of these problems. An active Open Source community is now striving to plug these gaps, to help simplify Web application development. Dynamo ultimately became a certified J2EE application server, but continued to stand apart from the crowd.

    The ATG Dynamo name was not only applied to the 'application server' layer. Many will remember DPS (Dynamo Personalization Server), DCS (Dynamo Commerce Server), and a little later, DSS (Dynamo Scenario Server). Dynamo was not just about the technology platform; it became a suite of building blocks to help organizations create and manage user-centric, personalized Web experiences.


    All of this technology is very much alive and well at ATG. In fact, it's the cornerstone of all our current products. Much of ATG's early technology remains as innovative today as it was several years ago. So what happened to 'Dynamo?'
    • Dynamo Commerce Server evolved into ATG Commerce, which is the industry's most complete and flexible commerce solution.
    • Dynamo Personalization Server and Dynamo Scenario Server are today packaged together as the ATG Adaptive Scenario Engine.
    • The Dynamo platform itself has undergone the most significant metamorphosis. As J2EE made its mark on the industry, ATG made the decision to leverage J2EE application servers from other vendors, and so the 'special' features of Dynamo that were not J2EE-specific, but were still very valuable to developers, were separated out and packaged as the Dynamo Application Framework or 'DAF'. DAF runs as a J2EE Web application, and ATG supports its execution on Dynamo Application Server (DAS), IBM WebSphere, BEA WebLogic and JBoss Application Server. DAF is not sold as a product, but it is included in the ATG Adaptive Scenario Engine. DAF contains many of the important and innovative technologies that are used by ATG applications. Two of the most important are Nucleus and the Data Anywhere Architecture. Nucleus is a fast, lightweight component model which implements the Inversion of Control and Dependency Injection patterns, and the Data Anywhere Architecture is an object/relational mapping, data persistence and caching technology; both help to build scalable, flexible Web applications.

    So, while the Dynamo name has taken a back seat, the technology behind it continues to evolve and to be used extensively throughout ATG's applications. Importantly, as the industry has adopted the J2EE standards, the innovative technology created by ATG continues to complement and enhance J2EE to help organizations build exciting, user-centric, personalized, scalable Web applications. And it's all proven technology.


    The ATG Dynamo e-Business Platform is a flexible, Java-based platform for building personalized applications for the Web and other communication channels (e-mail messaging, wireless devices, etc.). This section introduces the e-Business Platform and its related products, including Dynamo Application Server, the ATG Portal and ATG Commerce applications, and the ATG Control Center.

    [Diagram: the ATG Dynamo e-Business Platform, showing the application server layer with the ATG Portal application and ATG Commerce applications running on top of it.]

    As shown above, the Dynamo e-Business Platform consists of three core components:


    • The Dynamo Application Framework (DAF) runs on top of your application server and supplies essential facilities for application development and deployment (Nucleus, Repositories, tag libraries, security, etc.). This portable framework is designed to run on the industry's leading J2EE application servers, including ATG's own Dynamo Application Server (DAS), BEA's WebLogic Server and IBM's WebSphere Application Server.

    • Dynamo Personalization Server (DPS) adds visitor profiling, content management, and content targeting capabilities to the e-Business Platform, enabling you to deliver personalized content to customers according to their characteristics and preferences.

    • Dynamo Scenario Server (DSS) offers advanced scenario-based personalization features for customer relationship management. Using visual layout tools, business managers can implement, test, and fine-tune their customer management initiatives by designing sequences of targeted interactions that track and respond to customer behavior. DSS also includes powerful data analysis and reporting tools, including ready-made business chart templates, for charting the data collected from scenarios.


    The following ATG products run on top of the Dynamo e-Business Platform and provide tools for developing personalized e-commerce and portal applications:

    • The ATG Portal application provides a customizable framework that combines content and services in a unified desktop for your end users. You can create a series of portals for your various visitors. A B2B commerce site, for example, might have different portals for first time visitors, returning customers, and internal employees. Each of these portals can share content and access, while maintaining high levels of security, ensuring that sensitive information and access is not shared with the wrong audience.

    • The ATG Commerce applications include everything you need to build and manage personalized e-commerce sites. ATG Consumer Commerce provides business-to-consumer (B2C) storefront development features -- product catalog management, pricing, inventory, customer service, etc. ATG Business Commerce adds support for business-to-business (B2B) transactions, including B2B payment methods (purchase orders, requisitions), account-specific product catalogs and price lists, multiple shipping/billing addresses, etc.

    Finally, the ATG Control Center (ACC) is the all-in-one user interface for the entire Dynamo e-Business Platform. The ACC gives application developers, page designers, site administrators, business managers, and other members of your project team a central control point for building and maintaining Dynamo applications.

    The Dynamo Application Framework supplies additional facilities for application development and deployment, including:

    Nucleus is the central registry for the JavaBeans, Servlets, and Enterprise JavaBeans (EJBs) that contain your application-level logic. It creates and configures Dynamo components and organizes them into a hierarchical namespace (for example, /atg/dynamo/service/Scheduler) so they can be referenced by other components. By reading the various configuration files associated with each component, Nucleus figures out what components to use in an application, what their initial properties are, and how they connect to each other.
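
    As a quick sketch of what those configuration files look like, a hypothetical component that depends on the standard Scheduler might be defined like this (the class name and component path are invented for illustration):

    /mycompany/service/CacheFlusher.properties
    $class=mycompany.service.CacheFlusher
    $scope=global
    # Nucleus resolves this path and injects that component into the 'scheduler' property
    scheduler=/atg/dynamo/service/Scheduler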

    Dynamo Server Pages and Dynamo Servlet Beans. Dynamo Server Pages (DSP) is based on ATG's JHTML markup language. While structurally similar to ordinary HTML documents, Dynamo Server Pages support specialized markup tags that render content dynamically by linking directly to Nucleus components. The <droplet> tag, for instance, enables page developers to embed dynamic elements called Dynamo Servlet Beans. Using Dynamo Servlet Beans minimizes the amount of Java code that has to appear in HTML and the amount of HTML that has to be hard-coded into Java classes. As a result, Java programmers and page designers can work independently and applications are easier to maintain.

    DSP Tag Library. If your application is based on Java Server Pages (JSPs), you can use the DSP tag library to access Nucleus components and render dynamic content from your JSPs. These tags are the JSP equivalents to the JHTML tags used in Dynamo Server Pages.

    Repository API. The Repository API (atg.repository.*) is a data access layer that defines a generic representation of a data store. Whenever Dynamo needs to load, store, or query a data store, it makes the appropriate calls through this API. The API then accesses the underlying data storage device, translating the API calls into whatever calls are needed to access that particular data store. Implementations of the Repository API exist to access data stores such as relational databases, LDAP systems, and content management systems.

    Security API. The Security API (atg.security.*) provides authentication and access control support for applications. It enables you to access and manage user security information in XML files, SQL databases, and LDAP directory services.

    Session Backup. Dynamo’s session backup feature enables you to save and restore important session-scoped Nucleus components and component properties. For example, backing up the user profile ID property allows the server to access persistent information in the user’s profile without requiring the user to log in again.

    Logging and Data Collection. Dynamo includes three different systems for sending, receiving, and recording messages generated by components: Logging (which logs system messages to a flat file), Data Collection (which can record data contained in any JavaBean), and Recorders (which collect data through scenarios).

    Supported Application Servers
    The portability of the Dynamo Application Framework enables you to run your e-Business applications on ATG's own Dynamo Application Server (DAS) or on the other leading J2EE application servers mentioned above. DAS itself provides the following features:

    Full J2EE 1.2 support. DAS fully supports the Java 2, Enterprise Edition (J2EE) 1.2 standard (EJBs, JSPs, tag libraries, servlets, JavaMail, JDBC, JNDI, etc.). Depending on the needs of your site, you can write applications based on Dynamo Server Pages, or J2EE applications based on JavaServer Pages. The ATG Control Center includes a deployment Editor that guides you through the process of developing and deploying J2EE applications on DAS.

    Multi-channel support. DAS includes full support for Wireless Application Protocol (WAP) standards, including Wireless Markup Language (WML), compact HTML (CHTML), and Short Message Service (SMS). DAS is configured out of the box to handle requests from WAP-enabled devices just like it handles requests from conventional browsers.

    Extensible Markup Language (XML) support. DAS incorporates the Apache Xerces XML parser and the Apache Xalan XSLT engine, and provides several Dynamo servlet beans for transforming XML documents into Java objects. You can display the transformed XML document in a Dynamo Server Page, using templates defined either according to the Extensible Stylesheet Language (XSL) specification or in a Dynamo Server Page.

    Session management. DAS provides advanced session management facilities for web sites running multiple Dynamo servers. Session Failover lets you redirect user requests to another Dynamo server running a copy of the original application. Thus, if a Dynamo server fails, session data is moved to another Dynamo server running the same application. Session Migration is like Session Failover, except that it is a planned porting of sessions. Session migration lets you transfer sessions even to servers at a different site with a different domain, while preserving session information. Session federation lets users have multiple active sessions on multiple associated Dynamo applications. As users move from one Dynamo application to another, their sessions can be maintained separately within each Dynamo application, while sharing session information.

    Dynamic load balancing. DAS is equipped with its own dynamic load balancing facility, the Dynamo Load Manager, which automatically allocates the load across available servers. If a site has multiple HTTP servers, it can use HTTP load distribution to allocate user requests among the HTTP servers. If the site has multiple Dynamo servers, it can use Dynamo Load Managers to distribute the load among the Dynamo servers. With both multiple HTTP servers and multiple Dynamo servers, the site could use both forms of load distribution. Dynamo's flexible architecture allows it to work with one or both of these load distribution models simultaneously and allows DAS to scale to support large, high-volume Web sites.

    Transaction management. DAS includes a transaction manager that monitors transactions across resources (i.e., databases, messaging systems, etc.). It provides developers with a standard framework for implementing high-performance database applications and ensures the integrity of transactions across resources using a two-phase commit protocol.

    Dynamo Message System (DMS). DAS provides a comprehensive messaging architecture called the Dynamo Message System (DMS). DMS is based on the Java Message Service API (JMS), but also provides several features and subsystems that make JMS more accessible to applications. DMS can be used with J2EE and non-J2EE applications. For non-J2EE applications, DAS provides a system called Patch Bay, which is designed to ease the development of messaging applications in Dynamo.

    Diagnostic tools. DAS includes a variety of diagnostic and administrative tools to help keep your applications running smoothly. For example, the ATG Control Center includes an SNMP-based Dynamo System Console that monitors your running Dynamo servers and provides readings in an easy-to-use graphical interface.

    Wednesday, March 18, 2009

    Relational Database Concept




    Relational Model


    The principles of the relational model were first outlined by Dr. E. F. Codd in a June 1970 paper called “A Relational Model of Data for Large Shared Data Banks.” In this paper, Dr. Codd proposed the relational model for database systems.

    The more popular models used at that time were hierarchical and network, or even simple flat file data structures. Relational database management systems (RDBMS) soon became very popular, especially for their ease of use and flexibility in structure. In addition, a number of innovative vendors, such as Oracle, supplemented the RDBMS with a suite of powerful application development and user products, providing a total solution.

    Components of the Relational Model

    • Collections of objects or relations that store the data
    • A set of operators that can act on the relations to produce other relations
    • Data integrity for accuracy and consistency

    Definition of a Relational Database

    A relational database uses relations or two-dimensional tables to store information.
    For example, you might want to store information about all the employees in your company. In a
    relational database, you create several tables to store different pieces of information about your
    employees, such as an employee table, a department table, and a salary table.

    Data Models

    Models are a cornerstone of design. Engineers build a model of a car to work out any details before putting it into production. In the same manner, system designers develop models to explore ideas and improve the understanding of the database design.

    Purpose of Models

    Models help communicate the concepts in people’s minds. They can be used to do the following:
    • Communicate
    • Categorize
    • Describe
    • Specify
    • Investigate
    • Evolve
    • Analyze
    • Imitate

    The objective is to produce a model that fits a multitude of these uses, can be understood by an end user, and contains sufficient detail for a developer to build a database system.


    Relational Database Properties

    A relational database:
    • Can be accessed and modified by executing structured query language (SQL) statements
    • Contains a collection of tables with no physical pointers
    • Uses a set of operators
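
    For instance, using the employee and department tables mentioned earlier, a single SQL statement can join the two relations to produce a new relation; the table and column names here are purely illustrative:

    SELECT e.last_name, d.department_name
    FROM employee e
    JOIN department d ON e.department_id = d.department_id;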

    VOICEMAIL PIN RESET FOR CISCO PHONE SET THROUGH UTILITY

    1) Disable proxy in browser and type IP address 155.126.196.6
    2) Open Unity Connection
    3) For example, if you need to reset the password for SEOUL, go to the Excel sheet (Voice sites pbx-passwords compiled SP revision v2.0.xls) and look up the username and password for that site.
    4) For Seoul, for example, the username is seouladmin and the password is ciscovoice.
    5) Enter that username and password.
    6) Then go to USERS -> USERS -> FIND EXTN.
    7) If you find the extension, verify it against the username in the ticket.
    8) Click on it and go to EDIT -> CHANGE PASSWORD.
    9) Set the password you want.


    HOW TO CHANGE PROFILE

    1) PELPP:DIR = EXTN NO.
    2) In the output, check the second number; it is the extension for the voicemail.
    3) TO CREATE A NEW PROFILE, ENTER THE FOLLOWING COMMAND:
    a. PELPI:DIR=EXTN NO, LIST=NO.OF.LIST,CHO=1(INDICATES EXTN NO),ANSPOS=EXTN,TIME=X SECONDS;
    4) Check all profiles again by entering the PELPP command.