

Omniture | History and Definition of Omniture

Omniture is an online marketing and web analytics business unit owned by Adobe Systems. The Omniture Business Unit is based in Orem, Utah, with offices worldwide. It serves customers in 75 countries worldwide.

The company was founded in 1996 and was backed by venture capitalists including Hummer Winblad Venture Partners, University Venture Fund, and Scale Venture Partners. During a period of rapid growth, the company was one of Inc. Magazine's 500 fastest-growing private companies. Omniture was listed on the NASDAQ in 2006.

Omniture bought behavioral targeting company Touch Clarity for $51.5 million. In late 2007 the company acquired web analytics company Visual Sciences, Inc. (formerly WebSideStory) for $394 million, and also purchased Offermatica for $65 million. In October 2008 it agreed to acquire the Israeli e-commerce search solution provider Mercado for $6.5 million.

On September 15, 2009, Omniture, Inc. and Adobe Systems announced that Adobe would acquire Omniture for roughly $1.8 billion. The deal was completed on October 23, 2009, and Omniture now operates as the Omniture Business Unit within Adobe.

Products
  1. SiteCatalyst, Omniture's software as a service application, offers Web analytics (client-side analytics).
  2. SearchCenter+ assists with paid search and content network optimization in systems such as Google's AdWords, Yahoo! Search Marketing, Microsoft Ad Center, and Facebook Ads.
  3. DataWarehouse, data warehousing of SiteCatalyst data.
  4. Test&Target, A/B and multivariate testing (MVT), derived from Offermatica.
  5. Test&Target 1:1, Omniture's main behavioral targeting solution, derived in part from Touch Clarity, drills down to the individual level of testing.
  6. Discover, an advanced segmentation tool.
  7. Insight, a multichannel segmentation tool (both client-side and server-side analytics). Formerly called Discover on Premise, it was derived from Omniture's Visual Sciences acquisition in 2007.
  8. Insight for Retail, an Insight offering geared toward multiple online and offline retail channels.
  9. Genesis, a third-party data integration tool (the majority of integrations work with SiteCatalyst).
  10. Recommendations offers automated product and content recommendations.
  11. SiteSearch, an on-demand enterprise search product.
  12. Merchandising, a search and navigation offering for online stores.
  13. Publish, for web content management.
  14. Survey, to gather visitor sentiment.
  15. DigitalPulse, a Web analytics code configuration monitoring tool.
  16. VISTA, server-side analytics.
Critics have accused Omniture of attempting to hide the fact that it is collecting data. They claim it does this by sending the information to a domain name that looks and sounds like a private IP address of the kind used to reach devices on a local network rather than the Internet. This has led to speculation that the domain name is meant to mislead users or slip past firewall rules. Omniture's SiteCatalyst and SearchCenter products use the 2o7.net domain name.

Omniture collects data for companies such as Apple and Adobe, which use it to gather usage statistics across their products. It is possible to opt out of the Omniture data-collection system and to block the tracking.
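
One common blocking technique is to null-route the tracking domain locally. The sketch below is a minimal illustration in Python, assuming a hosts-file style blocklist; the subdomain shown is hypothetical, since real 2o7.net beacon hostnames are assigned per customer, and this is only one of several ways such tracking can be blocked.

```python
# Minimal sketch of hosts-file style blocking for the 2o7.net tracking
# domain mentioned above. The subdomain used here is hypothetical.
BLOCKED_SUFFIXES = (".2o7.net",)

def hosts_entries(hostnames):
    """Return hosts-file lines that redirect blocked tracking hosts to loopback."""
    return [f"127.0.0.1 {host}" for host in hostnames
            if any(host.endswith(suffix) for suffix in BLOCKED_SUFFIXES)]

print(hosts_entries(["example.112.2o7.net", "www.example.com"]))
# -> ['127.0.0.1 example.112.2o7.net']
```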

comScore | History and Definition of comScore

comScore is an Internet marketing research company providing marketing data and services to many of the Internet's largest businesses. comScore tracks the Internet activity of the computers in its survey panel in order to study online behavior.

comScore Networks was founded in August 1999 in Reston, Virginia. The company was co-founded by Gian Fulgoni, who was for many years the CEO of Information Resources, Inc. (IRI), one of the world's largest market research companies, and by Magid Abraham, another former IRI employee, who had served as president of IRI in the mid-1990s.

Magid and Gian came up with the idea while working with one of the original investors in the company, Mike Santer, who conceived of creating a very large online consumer panel to track online commerce. The problem was that the traditional methods companies used to track online behavior would not work for tracking commerce, because of the low incidence of buying online. Typical panels for tracking site visitation numbered around 20,000-30,000 people; with less than 2-3% of the population buying online, the panel needed to be at least 1-2 million. They decided to build a very large panel using a more aggressive recruiting methodology and to manage the resulting error with advanced statistical methods and controls. Years and tens of millions of dollars went into finding the best ways to measure online buying and other behaviors, and into reaching the level of accuracy required for Fortune 1000 companies to buy the data.

In 2000, comScore bought certain assets and the customer agreements of PCData of Reston, Virginia. PCData was among the earliest Web measurement firms, but increasing competitive challenges (including the threat of a patent infringement lawsuit by industry pioneer Media Metrix) put PCData's future in doubt. The acquisition of PCData's large customer base helped accelerate the growth of comScore's syndicated measurement service, which was widely considered to be more accurate than the service PCData's technology had previously delivered.

By 2001, Media Metrix had built a market share lead but had been unable to create a sustainable financial structure. NetRatings, its closest competitor, was armed with strong capital reserves and announced its intention to acquire and integrate Media Metrix. However, after several months, the FTC announced its intention to block the acquisition and accordingly, NetRatings canceled the transaction. comScore was subsequently able to acquire Media Metrix in a deal announced in June 2002.

Media Metrix originated as PC Meter, a business unit of market research company NPD and began publishing statistics in January 1996. In July 1997, it changed its name to Media Metrix, citing the desire to track a wider variety of interactive traffic. In October 1998, Media Metrix merged with its nearest rival, Relevant Knowledge. The company went public as NASDAQ:MMXI in May 1999, reaching a market cap of $135 million on its first day of trading. In June 2000, the company acquired Jupiter Communications for $414 million in stock and changed its name to Jupiter Media Metrix. In the aftermath of the dot-com bubble collapse and associated downturn in internet marketing spending, Jupiter sold the Media Metrix service to rival comScore for $1.5 million in June 2002.

On March 30, 2007, comScore announced its intent to sell shares in an initial public offering and be traded on the Nasdaq using the symbol "SCOR".

In May 2008, comScore announced its acquisition of M:Metrics, a company measuring mobile content consumption. The transaction involved a cash payment of $44.3 million and the issue of approximately 50,000 options to purchase shares of comScore common stock to certain M:Metrics unvested option holders.

comScore announced in October 2009 the acquisition of Certifica, a provider of real-time web measurement and digital marketing technology solutions in Latin America. Based in Santiago, Chile, Certifica has offices throughout Latin America, including Mexico, Brazil, Argentina, Colombia and Peru. The acquisition enhanced comScore’s presence in the rapidly-developing Latin American market.

In February 2010, comScore announced it had signed a definitive agreement to acquire the ARSgroup in an all-cash acquisition. Headquartered in Evansville, Indiana, ARSgroup’s areas of expertise include: brand strategy, all stages of creative development, campaign evaluation across all marketing and media channels, media planning and strategy, return-on-investment, and forecasting.

On July 1, 2010, comScore announced that it had acquired the products division of Nexius, Inc.

comScore then acquired web analytics and video measurement solutions provider Nedstat for approximately $36.7 million. Announced on September 1, 2010, the acquisition helps comScore accelerate its global expansion strategy, particularly in European markets.

comScore maintains a group of users who have monitoring software (with brands including PermissionResearch, OpinionSquare and VoiceFive Networks) installed on their computers. In exchange for joining the comScore research panels, users are presented with various benefits, including computer security software, Internet data storage, virus scanning and chances to win cash or prizes.

comScore is up-front about collecting user data and the software's ability to track all of a user's internet traffic, including normally secure (https://) connections used to communicate banking and other confidential information.

comScore estimates that two million users are part of the monitoring program. However, self-selected populations, no matter how large, may not be representative of the population as a whole. To obtain the most accurate data, comScore adjusts the statistics using weights to make sure that each population segment is adequately represented. To calculate these weights, comScore regularly recruits panelists using random digit dialing and other offline recruiting methods to accurately determine how many users are online, aggregated by geography, income and age. Correcting the comScore data requires having accurate demographics about the larger pool of users. However, some comScore users are recruited without being asked to give demographic information and, in other cases, users may not be truthful about their demographics. To ensure the accuracy of the data, comScore verifies its users' demographics during the course of measuring statistical data.
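
As a generic illustration of this kind of correction (not comScore's proprietary methodology), the sketch below computes post-stratification weights from assumed segment shares: each panel observation is weighted by how under- or over-represented its segment is in the panel relative to an offline-recruited estimate of the online population.

```python
# Assumed segment shares for illustration only: the "population" shares stand
# in for estimates obtained offline (e.g. via random digit dialing), the
# "panel" shares for the self-selected measurement panel.
population_share = {"18-34": 0.30, "35-54": 0.40, "55+": 0.30}
panel_share      = {"18-34": 0.45, "35-54": 0.35, "55+": 0.20}

# Weight = population proportion / panel proportion for each segment.
weights = {seg: population_share[seg] / panel_share[seg] for seg in population_share}

def weighted_total(observations):
    """Sum panel observations, re-weighted so each segment counts in proportion
    to its share of the online population. observations: (segment, value) pairs."""
    return sum(weights[seg] * value for seg, value in observations)

print({seg: round(w, 2) for seg, w in weights.items()})
# -> {'18-34': 0.67, '35-54': 1.14, '55+': 1.5}
```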

The corrected data is used to generate reports on topics ranging from web traffic to video streaming activity and consumer buying power.

With the introduction of Unified Digital Measurement (UDM) in May 2009, comScore implemented an approach to digital audience measurement that blends panel-based and census-based measurement into a single unified methodology. comScore developed this proprietary methodology to calculate audience reach in a manner not affected by variables such as cookie deletion and cookie blocking/rejection, helping reconcile longstanding differences between the two measurement approaches.

Magid Abraham, President, CEO, and Co-Founder of comScore, Inc., received the 2009 Charles Coolidge Parlin Marketing Research Award at the 2009 AMA Marketing Research Conference.

comScore was selected as a winner of the 2009 Chicago Innovation Awards for its creative development of AdEffx in October 2009.

In June 2009 comScore and the GSM Association won the M.E.F. Award for Business Intelligence in Mobile Media.

comScore was rated as the preferred audience measurement service by 50.4 percent of respondents to the William Blair & Company 6th semiannual survey of the members of the Chicago Interactive Marketing Association (CIMA).

comScore was ranked as the 15th largest U.S. market research firm based on 2008 domestic revenues, growing faster than each of the largest 25 research firms, according to the 2008 Honomichl Top 50 report.

comScore was selected by World Economic Forum as one of 47 innovative companies in 2007.

Magid Abraham, Ph.D., co-founder and CEO of comScore, Inc., was honored with the Eighth Annual Buck Weaver Award for Marketing. The award recognizes individuals who have made important contributions to the advancement of theory and practice in marketing science.

comScore Trees for Knowledge is a partnership with the non-profit organization, Trees for the Future, which provides resources to developing countries to improve rural livelihoods through the introduction of environmentally sustainable land management.

In 2008, comScore pledged an initial donation, which allowed for the planting of one million trees in developing communities throughout the world. Under this initiative, comScore has also pledged to continue to make donations when new panelists join and remain active in the comScore panel of global Internet users.

Quantcast | About Quantcast

Quantcast is a media measurement and web analytics service that allows users to view audience statistics for millions of websites. Quantcast Corporation's prime focus is analyzing websites in order to obtain accurate usage statistics for visitors from the United States. Like Alexa, Quantcast ranks websites. Because Quantcast statistics always refer to usage from the United States, Alexa data and Quantcast data do not always show the same results. Quantcast does not require a toolbar to be installed in one's web browser to obtain statistics.

Instead, participating websites voluntarily insert Quantcast HTML code into the Web pages they wish to have included in its statistics. This code allows Quantcast to keep track of the traffic directed towards those websites. Using this mechanism, Quantcast can provide detailed information about Web pages created by participating publishers, including, for example, whether the average viewer is male or female, whether the average viewer earns $30,000 or $100,000 annually, the age group of the viewer, and the number of U.S. homes the website reaches.

This information is derived by inference: comparing and correlating the information received from one participating publisher with another. The inferences are possible because the Quantcast code causes the user's browser to request resources from Quantcast's servers, which lets Quantcast log the user's IP address along with information it stores in cookies in the user's browser. The cookies significantly aid in making inferences. Quantcast also provides affinities revealing other popular sites that the average viewer browses. This is possible by tracking the "referrer" information normally included as part of every HTTP request made by the user's browser. For instance, a person browsing a music-oriented page measured by Quantcast might also browse music-download sites.
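
To make the mechanism concrete, here is a minimal sketch of the kind of beacon endpoint such embedded code points at; it is not Quantcast's actual tag or server. Each request exposes the visitor's IP address, the Referer header of the embedding page, and any cookie set on an earlier visit, and a persistent identifier is assigned on first contact. The cookie name and port are invented for illustration.

```python
import uuid
from http.server import BaseHTTPRequestHandler, HTTPServer

class BeaconHandler(BaseHTTPRequestHandler):
    """Logs the signals a measurement tag can observe on every page view."""

    def do_GET(self):
        ip = self.client_address[0]                 # visitor's IP address
        referer = self.headers.get("Referer", "-")  # page that embedded the tag
        cookie = self.headers.get("Cookie", "")     # identifier from earlier visits
        print(f"ip={ip} referer={referer} cookie={cookie or '-'}")

        self.send_response(204)                     # a real tag would return a 1x1 GIF
        if "panel_id=" not in cookie:
            # First time this browser is seen: set a persistent identifier so
            # later page views on other participating sites can be linked.
            self.send_header(
                "Set-Cookie", f"panel_id={uuid.uuid4().hex}; Max-Age=31536000")
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), BeaconHandler).serve_forever()
```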

In February 2010, Quantcast was ranked 46th on Fast Company's 2010 list of the World's Most Innovative Companies and placed third in the Top 10 Most Innovative Companies on the Web, behind Facebook and Google.

In July 2010, BBC News reported that a legal challenge had been launched in the US against a number of websites amid claims that they were engaged in "covert surveillance" of users through the use of a Quantcast Flash application to restore deleted cookies.

NetBIOS | Understanding and Definition of NetBIOS

NetBIOS is an acronym for Network Basic Input/Output System. It provides services related to the session layer of the OSI model allowing applications on separate computers to communicate over a local area network. As strictly an API, NetBIOS is not a networking protocol. Older operating systems ran NetBIOS over IEEE 802.2 and IPX/SPX using the NetBIOS Frames (NBF) and NetBIOS over IPX/SPX (NBX) protocols, respectively. In modern networks, NetBIOS normally runs over TCP/IP via the NetBIOS over TCP/IP (NBT) protocol. This results in each computer in the network having both an IP address and a NetBIOS name corresponding to a (possibly different) host name.

NetBIOS was developed in 1983 by Sytek Inc. as an API for software communication over IBM's PC-Network LAN technology. On PC-Network, as an API alone, NetBIOS relied on proprietary Sytek networking protocols for communication over the wire. Because PC-Network only supported up to 80 devices in its most accommodating mode (baseband), NetBIOS was itself designed with limited nodes in mind.

In 1985, IBM went forward with the token ring network scheme and a NetBIOS emulator was produced to allow NetBIOS-aware applications from the PC-Network era to work over this new design. This emulator, named NetBIOS Extended User Interface (NetBEUI), expanded the base NetBIOS API with, among other things, the ability to deal with the greater node capacity of token ring. A new networking protocol, NBF, was simultaneously produced to allow NetBEUI (NetBIOS) to provide its services over token ring — specifically, at the IEEE 802.2 Logical Link Control layer.

Also in 1985, Microsoft created a NetBIOS implementation for its MS-NET networking technology. As in the case of IBM's token ring, the services of Microsoft's NetBIOS implementation were provided over the IEEE 802.2 Logical Link Control layer by the NBF protocol.

In 1986, Novell released Advanced Novell NetWare 2.0 featuring the company's own NetBIOS emulator. Its services were encapsulated within NetWare's IPX/SPX protocol using the NetBIOS over IPX/SPX (NBX) protocol.

In 1987, a method of encapsulating NetBIOS in TCP and UDP packets, NetBIOS over TCP/IP (NBT), was published. It was described in RFC 1001 ("Protocol Standard for a NetBIOS Service on a TCP/UDP Transport: Concepts and Methods") and RFC 1002 ("Protocol Standard for a NetBIOS Service on a TCP/UDP Transport: Detailed Specifications"). The NBT protocol was developed in order to "allow an implementation [of NetBIOS applications] to be built on virtually any type of system where the TCP/IP protocol suite is available," and to "allow NetBIOS interoperation in the Internet."

After the PS/2 computer hit the market in 1987, IBM released the PC LAN Support Program, which included a driver for NetBIOS.

Worth noting is the popular confusion between the names NetBIOS and NetBEUI. NetBEUI originated strictly as the moniker for IBM's enhanced 1985 NetBIOS emulator for token ring. The name NetBEUI should have died there, considering that at the time, the NetBIOS implementations by other companies were known simply as NetBIOS regardless of whether they incorporated the API extensions found in that emulator. For MS-NET, however, Microsoft elected to name its implementation of the NBF protocol "NetBEUI" — literally naming its implementation of the transport protocol after IBM's second version of the API. Consequently, even today, Microsoft file and printer sharing over Ethernet continues to be called NetBEUI, with the name NetBIOS commonly used only in reference to file and printer sharing over TCP/IP. In truth, the former is NetBIOS over NBF, and the latter is NetBIOS over NBT.

Since its original publishing in a technical reference book from IBM, the NetBIOS API specification has become a de facto standard.

NetBIOS provides three distinct services:
  1. Name service for name registration and resolution.
  2. Session service for connection-oriented communication.
  3. Datagram distribution service for connectionless communication.
(Note: SMB, an upper layer, is a service that runs on top of the Session Service and the Datagram service, and should not be mistaken for a necessary and integral part of NetBIOS itself. It can now run atop TCP with a small adaptation layer that adds a packet length to each SMB message; this is necessary because TCP only provides a byte-stream service with no notion of packet boundaries.)

In order to start sessions or distribute datagrams, an application must register its NetBIOS name using the name service. NetBIOS names are 16 octets in length and vary based on the particular implementation. Frequently, the 16th octet is used to designate a "type" similar to the use of ports in TCP/IP. It is called the NetBIOS Suffix (read below) or "resource type", and is used to tell other applications what type of services the system offers. In NBT, the name service operates on UDP port 137 (TCP port 137 can also be used, but it is rarely, if ever, used).
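
As a concrete illustration of how these 16-octet names look on the wire, the sketch below implements the first-level encoding defined in RFC 1001: the name is upper-cased and space-padded to 15 bytes, the suffix becomes the 16th byte, and each byte is split into two nibbles written as the letters 'A' through 'P'. (In the actual packet, this 32-character string is carried as a DNS-style label preceded by its length byte.)

```python
def encode_netbios_name(name: str, suffix: int = 0x00) -> str:
    """First-level encoding of a NetBIOS name as defined in RFC 1001.

    The name is upper-cased and space-padded to 15 bytes, the service suffix
    becomes the 16th byte, and each byte is split into two nibbles, each
    written as a letter in 'A'..'P' (nibble value + ord('A')).
    """
    raw = name.upper().ljust(15)[:15].encode("ascii") + bytes([suffix])
    out = []
    for byte in raw:
        out.append(chr(ord("A") + (byte >> 4)))    # high nibble
        out.append(chr(ord("A") + (byte & 0x0F)))  # low nibble
    return "".join(out)

# "FRED" registered with the Workstation Service suffix 0x00:
print(encode_netbios_name("FRED", 0x00))
# -> EGFCEFEECACACACACACACACACACACAAA (32 characters)
```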

The name service primitives offered by NetBIOS are:
  1. Add name — registers a NetBIOS name.
  2. Add group name — registers a NetBIOS "group" name.
  3. Delete name — un-registers a NetBIOS name or group name.
  4. Find name — looks up a NetBIOS name on the network.
NetBIOS name resolution is not supported by Microsoft for Internet Protocol Version 6 (IPv6).

Session mode lets two computers establish a connection for a "conversation", allows larger messages to be handled, and provides error detection and recovery. In NBT, the session service runs on TCP port 139.

The session service primitives offered by NetBIOS are:
  1. Call — opens a session to a remote NetBIOS name.
  2. Listen — listen for attempts to open a session to a NetBIOS name.
  3. Hang Up — close a session.
  4. Send — sends a packet to the computer on the other end of a session.
  5. Send No Ack — like Send, but doesn't require an acknowledgment.
  6. Receive — wait for a packet to arrive from a Send on the other end of a session.
In the original protocol used to implement NetBIOS services on PC-Network, to establish a session, the computer establishing the session sends an Open request which is responded to by an Open acknowledgment. The computer that started the session will then send a Session Request packet which will prompt either a Session Accept or Session Reject packet. Data is transmitted during an established session by data packets which are responded to with either acknowledgment packets (ACK) or negative acknowledgment packets (NACK). Since NetBIOS is handling the error recovery, NACK packets will prompt retransmission of the data packet. Sessions are closed by the non-initiating computer by sending a close request. The computer that started the session will reply with a close response which prompts the final session closed packet.

Datagram mode is "connectionless". Since each message is sent independently, they must be smaller; the application becomes responsible for error detection and recovery. In NBT, the datagram service runs on UDP port 138.

The datagram service primitives offered by NetBIOS are:
  1. Send Datagram — send a datagram to a remote NetBIOS name.
  2. Send Broadcast Datagram — send a datagram to all NetBIOS names on the network.
  3. Receive Datagram — wait for a packet to arrive from a Send Datagram operation.
  4. Receive Broadcast Datagram — wait for a packet to arrive from a Send Broadcast Datagram operation.
The NetBIOS name is 16 ASCII characters; however, Microsoft limits the host name to 15 characters and reserves the 16th character as a NetBIOS Suffix. This suffix describes the service or name record type, such as host record, master browser record, or domain controller record. The host name (or short host name) is specified when Windows networking is installed/configured, while the suffixes registered are determined by the individual services supplied by the host. In order to connect to a computer running TCP/IP via its NetBIOS name, the name must be resolved to a network address. Today this is usually an IP address (the NetBIOS name-to-IP-address resolution is often done by either broadcasts or a WINS server, i.e. a NetBIOS Name Server). A computer's NetBIOS name is often the same as that computer's host name (see below), although truncated to 15 characters, but it may also be completely different. NetBIOS names can include almost any combination of alphanumeric characters except for spaces and the following characters: [ \ / : * ? " ; | + ]

The Windows LMHOSTS file provides a NetBIOS name resolution method that can be used for small networks that do not use a WINS server.

A Windows machine's NetBIOS name is not to be confused with the computer's host name. Generally a computer running TCP/IP (whether it's a Windows machine or not) has a host name (also sometimes called a machine name or a DNS name). Generally the host name of a Windows computer is based on the NetBIOS name plus the Primary DNS Suffix, which are both set in the System Properties dialog box.

There may also be "connection specific suffixes" which can be viewed or changed on the DNS tab in Control Panel → Network → TCP/IP → Advanced Properties. Host names are used by applications such as telnet, ftp, web browsers, etc. In order to connect to a computer running the TCP/IP protocol using its HOST name, the host name must be resolved into an IP Address. Host name- or Fully Qualified Domain Name (FQDN)-IP address resolution is typically done by a Domain Name System (DNS) server.

The node type of a networked computer relates to the way it resolves NetBIOS names to IP addresses. There are four node types.
  1. B-node: 0x01 Broadcast
  2. P-node: 0x02 Peer (WINS only)
  3. M-node: 0x04 Mixed (broadcast, then WINS)
  4. H-node: 0x08 Hybrid (WINS, then broadcast)
The node type in use is displayed by opening a command line and typing ipconfig /all. A Windows computer registry may also be configured in such a way as to display "unknown" for the node type.

The NetBIOS suffix, also known as the NetBIOS End Character (endchar), is the 16th character of a NetBIOS name. This character specifies the record or service type for the registered name record. The number of record types is limited to 255; however, in actual use the number of commonly used NetBIOS suffixes is substantially smaller. The most common NetBIOS suffixes are:

ASCII Values of 16th characters of NetBIOS "names"
  1. 00: Workstation Service
  2. 03: Messenger Service
  3. 20: File Service (also called Host Record)
  4. 1B: Domain Master Browser - Primary Domain Controller for a domain
  5. 1C: Domain Controllers for a domain (group record with up to 25 IP addresses)
  6. 1D: Master Browser
  7. 1E: Browser Service Elections
NetBEUI is the Microsoft adaptation of the IBM NetBIOS protocol. It expands on NetBIOS by adding a transport-layer component. NetBEUI is a fast and efficient protocol that consumes few network resources, provides excellent error correction, and requires little configuration.

Cross-browser | Understanding and Definition of Cross-browser

Cross-browser refers to the ability of a website, web application, HTML construct, or client-side script to work in all major web browsers. The term cross-browser is often confused with multi-browser. Multi-browser is a newer paradigm in web development in which a website or web application provides additional functionality across several web browsers, while ensuring that it remains accessible to the largest possible audience without any loss in performance. Cross-browser capability allows a website or web application to be rendered properly by all browsers. The term cross-browser has existed since web development began.

The term is still in use, but to a lesser extent. The main reasons for this are:
  1. Later versions of both Internet Explorer and Netscape included support for HTML 4.0 and CSS1, so proprietary extensions were no longer required to accomplish many commonly desired designs.
  2. Somewhat more compatible DOM manipulation techniques became the preferred method for writing client-side scripts.
  3. The browser market has broadened, and to claim cross-browser compatibility a website is nowadays expected to support browsers such as Mozilla Firefox, Opera, Chrome, and Safari in addition to Internet Explorer and Netscape.
  4. There has been an attitude shift towards more compatibility in general. Thus, some degree of cross-browser support is expected and only its absence needs to be noted.
The history of cross-browser development is bound up with the history of the "browser wars" of the late 1990s between Netscape Navigator and Microsoft Internet Explorer, as well as with that of JavaScript and JScript, the first scripting languages to be implemented in web browsers. Netscape Navigator was the most widely used web browser at that time, and Microsoft had licensed Mosaic to create Internet Explorer 1.0. New versions of Netscape Navigator and Internet Explorer were released at a rapid pace over the following few years. Due to the intense competition in the web browser market, the development of these browsers was fast-paced, and new features were added without any coordination between vendors. The introduction of new features often took priority over bug fixes, resulting in unstable browsers, fickle web standards compliance, frequent crashes, and many security holes.

The World Wide Web Consortium (W3C), founded in 1994 to promote open standards for the World Wide Web, pulled Netscape and Microsoft together with other companies to develop a standard for browser scripting languages called ECMAScript. The first version of the standard was published in 1997. Subsequent releases of JavaScript and JScript implemented the ECMAScript standard for greater cross-browser compatibility. After the standardization of ECMAScript, the W3C began work on standardizing the Document Object Model (DOM), a way of representing and interacting with objects in HTML, XHTML and XML documents. DOM Level 0 and DOM Level 1 were introduced in 1996 and 1997. Browsers implemented only limited support for these, and as a result non-conformant browsers such as Internet Explorer 4.x and Netscape 4.x remained widely used as late as 2000. DOM standardization became widespread after the introduction of DOM Level 2, published in 2000, which introduced the getElementById function as well as an event model and support for XML namespaces and CSS. DOM Level 3, the current release of the DOM specification, published in April 2004, added support for XPath and keyboard event handling, as well as an interface for serializing documents as XML. By 2005, large parts of the W3C DOM were well supported by common ECMAScript-enabled browsers, including Microsoft Internet Explorer, Opera, Safari and Gecko-based browsers (such as Firefox, SeaMonkey and Camino).

Web hosting service | Understanding and Definition of Web Hosting Services

A web hosting service is a type of Internet hosting service that allows individuals and organizations to make their own websites accessible via the World Wide Web. Web hosts are companies that provide space on a server they own or lease for use by their clients, as well as providing Internet connectivity, typically in a data center. Web hosts can also provide data center space and Internet connectivity for servers they do not own that are located in their data center, a service called colocation (or "housing", as it is commonly known in Latin America and France).

The scope of web hosting services varies greatly. The most basic is web page and small-scale file hosting, where files can be uploaded via File Transfer Protocol (FTP) or a Web interface. The files are usually delivered to the Web "as is" or with little processing. Many Internet service providers (ISPs) offer this service free to their subscribers. People can also obtain Web page hosting from other, alternative service providers. Personal web site hosting is typically free, advertisement-sponsored, or inexpensive. Business web site hosting often has a higher expense.

Single page hosting is generally sufficient only for personal web pages. A complex site calls for a more comprehensive package that provides database support and application development platforms (e.g. PHP, Java, Ruby on Rails, ColdFusion, and ASP.NET). These facilities allow the customers to write or install scripts for applications like forums and content management. For e-commerce, SSL is also highly recommended.

The host may also provide an interface or control panel for managing the Web server and installing scripts, as well as other modules and service applications like e-mail. Some hosts specialize in certain software or services (e.g. e-commerce); such hosts are commonly used by larger companies to outsource network infrastructure to a hosting company.

Hosting uptime refers to the percentage of time the host is accessible via the internet. Many providers state that they aim for at least 99.9% uptime (roughly equivalent to 45 minutes of downtime a month, or less), but there may be server restarts and planned (or unplanned) maintenance in any hosting environment and this may or may not be considered part of the official uptime promise.
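
The downtime arithmetic is simple to check; the sketch below computes the monthly allowance for a few common uptime targets (about 43 minutes at 99.9% for a 30-day month, in line with the figure above).

```python
def allowed_downtime_minutes(uptime_pct: float, days: int = 30) -> float:
    """Minutes of downtime permitted per month at a given uptime percentage."""
    return (1 - uptime_pct / 100) * days * 24 * 60

for pct in (99.0, 99.9, 99.99):
    print(f"{pct}% uptime -> {allowed_downtime_minutes(pct):.1f} minutes/month")
# 99.0% uptime -> 432.0 minutes/month
# 99.9% uptime -> 43.2 minutes/month
# 99.99% uptime -> 4.3 minutes/month
```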

Many providers tie uptime and accessibility into their own service level agreement (SLA). SLAs sometimes include refunds or reduced costs if performance goals are not met.

More general Internet hosting services can also run Web servers.

Many large companies that are not Internet service providers also need a computer permanently connected to the web so they can send email, files, etc. to other sites. They may also use the computer as a website host to provide details of their goods and services to anyone interested, and those visitors may additionally decide to place online orders.
  1. Free web hosting service: offered by different companies with limited services, sometimes supported by advertisements, and often limited when compared to paid hosting.
  2. Shared web hosting service: one's website is placed on the same server as many other sites, ranging from a few to hundreds or thousands. Typically, all domains may share a common pool of server resources, such as RAM and the CPU. The features available with this type of service can be quite extensive. A shared website may be hosted with a reseller.
  3. Reseller web hosting: allows clients to become web hosts themselves. Resellers could function, for individual domains, under any combination of these listed types of hosting, depending on who they are affiliated with as a reseller. Resellers' accounts may vary tremendously in size: they may range from having their own virtual dedicated server to a colocated server. Many resellers provide a nearly identical service to their provider's shared hosting plan and provide the technical support themselves.
  4. Virtual Dedicated Server: also known as a Virtual Private Server (VPS), divides server resources into virtual servers, where resources can be allocated in a way that does not directly reflect the underlying hardware. VPS will often be allocated resources based on a one server to many VPSs relationship, however virtualisation may be done for a number of reasons, including the ability to move a VPS container between servers. The users may have root access to their own virtual space. Customers are sometimes responsible for patching and maintaining the server.
  5. Dedicated hosting service: the user gets his or her own Web server and gains full control over it (user has root access for Linux/administrator access for Windows); however, the user typically does not own the server. Another type of Dedicated hosting is Self-Managed or Unmanaged. This is usually the least expensive for Dedicated plans. The user has full administrative access to the server, which means the client is responsible for the security and maintenance of his own dedicated server.
  6. Managed hosting service: the user gets his or her own Web server but is not allowed full control over it (user is denied root access for Linux/administrator access for Windows); however, they are allowed to manage their data via FTP or other remote management tools. The user is disallowed full control so that the provider can guarantee quality of service by not allowing the user to modify the server or potentially create configuration problems. The user typically does not own the server. The server is leased to the client.
  7. Colocation web hosting service: similar to the dedicated web hosting service, but the user owns the colo server; the hosting company provides physical space that the server takes up and takes care of the server. This is the most powerful and expensive type of web hosting service. In most cases, the colocation provider may provide little to no support directly for their client's machine, providing only the electrical, Internet access, and storage facilities for the server. In most cases for colo, the client would have his own administrator visit the data center on site to do any hardware upgrades or changes.
  8. Cloud hosting: a newer type of hosting platform that offers customers powerful, scalable and reliable hosting based on clustered, load-balanced servers and utility billing. A cloud-hosted website may be more reliable than alternatives since other computers in the cloud can compensate when a single piece of hardware goes down. Local power disruptions or even natural disasters are also less problematic for cloud-hosted sites, as cloud hosting is decentralized. Cloud hosting allows providers (such as Amazon) to charge users only for the resources they consume, rather than a flat fee for the amount the user expects to use or a fixed up-front hardware investment. On the other hand, the lack of centralization may give users less control over where their data is located, which could be a problem for users with data security or privacy concerns.
  9. Clustered hosting: having multiple servers hosting the same content for better resource utilization. Clustered Servers are a perfect solution for high-availability dedicated hosting, or creating a scalable web hosting solution. A cluster may separate web serving from database hosting capability. (Usually Web hosts use Clustered Hosting for their Shared hosting plans, as there are multiple benefits to the mass managing of clients)
  10. Grid hosting: this form of distributed hosting is when a server cluster acts like a grid and is composed of multiple nodes.
  11. Home server: usually a single machine placed in a private residence, used to host one or more websites over a (usually consumer-grade) broadband connection. These can be purpose-built machines or, more commonly, old PCs. Some ISPs actively attempt to block home servers by disallowing incoming requests to TCP port 80 of the user's connection and by refusing to provide static IP addresses. A common way to attain a reliable DNS hostname is by creating an account with a dynamic DNS service, which will automatically change the IP address that a URL points to when the IP address changes.
Some specific types of hosting provided by web host service providers:
  1. File hosting service: hosts files, not web pages
  2. Image hosting service
  3. Video hosting service
  4. Blog hosting service
  5. Pastebin: hosts text snippets
  6. Shopping cart software
  7. E-mail hosting service
Web hosting is often provided as part of a general Internet access plan; there are many free and paid providers offering these types of web hosting.

A customer needs to evaluate the requirements of the application to choose what kind of hosting to use. Such considerations include database server software, scripting software, and operating system. Most hosting providers offer Linux-based web hosting, which supports a wide range of software. A typical configuration for a Linux server is the LAMP platform: Linux, Apache, MySQL, and PHP/Perl/Python. The web hosting client may want other services, such as email for their business domain, databases, or multimedia services for streaming media. A customer may also choose Windows as the hosting platform; the customer can still choose from PHP, Perl, and Python but may also use ASP.NET or Classic ASP. Web hosting packages often include a Web Content Management System, so the end user does not have to worry about the more technical aspects.

Bizrate | About Bizrate

Since 1996, Bizrate® has been the leading resource for shoppers and retailers. Bizrate began as a business school assignment in the mind of founder Farhad Mohit, who felt that customers needed a way to navigate the growing landscape of online retailers. To provide customers with a way to assess the quality of different online stores, Bizrate launched the very first online customer feedback and ratings platform. With Bizrate, customers could rate their store experiences and retailers could learn about how they were performing. In October 1999, Bizrate integrated product search into the site to provide a holistic shopping experience.

Today, Bizrate continues to help shoppers find the best value for all the products they are looking to buy. Bizrate also remains true to its roots, allowing shoppers to provide candid opinions on their experiences with different online retailers and allowing retailers to find shoppers as well as discover insights on how to provide the best customer service to those shoppers.
Find the best value with Bizrate

Bizrate is the trusted shopping resource, linking shoppers with over a million products, brands, and stores with one click. Bizrate enables shoppers to search for virtually every product, store, brand, and deal on the web. Shop the biggest names you know to the small but trustworthy stores just waiting to be found. Compare across products, prices, and store information. Quickly access all the ratings and reviews you need to make a confident buying decision. With Bizrate, shoppers can find the right product, at the right price, every time.

Great deals await. Shop www.bizrate.com now!
Grow customer loyalty and satisfaction with Bizrate Insights

Bizrate Insights is the customer feedback and ratings platform of Bizrate, providing tools and reports to over 6,000 retailers worldwide and empowering retailers to achieve their end goal of growing sales and customer loyalty. Bizrate Insights assists retailers in listening to their customers in a way that is fast and measurable, resulting in insights, action, conversation, and customer loyalty.

Visit www.bizrateinsights.com to learn more about our free and premium buyer and non-buyer survey and reporting products.

Bizrate is part of a portfolio of shopping sites owned and operated by Shopzilla, Inc. To read more about how Bizrate came to be and our journey with parent company Shopzilla, Inc., see our history.

Craigslist | History and Definition of Craigslist

Craigslist is a centralized network of online communities, featuring free online classified advertisements – with sections devoted to jobs, housing, personals, for sale, services, community, gigs, résumés, and discussion forums.

Craig Newmark began the service in 1995 as an email distribution list of friends, featuring local events in the San Francisco Bay Area, before it became a web-based service in 1996 and expanded into other classified categories. It started expanding to other U.S. cities in 2000 and now covers many countries around the world.

Having observed people helping one another in friendly, social and trusting communal ways on the Internet via the WELL, MindVox and Usenet, and feeling isolated as a relative newcomer to San Francisco, Craigslist founder Craig Newmark decided to create something similar for local events. In early 1995, he began an email distribution list for friends, featuring local events in the San Francisco Bay Area.

The initial technology encountered some limits, so by June 1995 majordomo had been installed and the mailing list "Craigslist" resumed operations. Most of the early postings were submitted by Newmark and were notices of social events of interest to software and Internet developers living and working in San Francisco.

Soon, word of mouth led to rapid growth. The number of subscribers and postings grew rapidly. There was no moderation and Newmark was surprised when people started using the mailing list for non-event postings. People trying to get technical positions filled found that the list was a good way to reach people with the skills they were looking for. This led to the addition of a category for "jobs". User demand for more categories caused the list of categories to grow. Community members started asking for a web interface. Craig registered "craigslist.org", and the web site went live in 1996.

By early 1998, Newmark still thought of his career as that of a software engineer ("hardcore Java programmer") and of Craigslist as a cool hobby that was getting him invited to the best parties for geeks and nerds. In the fall of 1998, the name "List Foundation" was introduced and Craigslist started transitioning to the use of this name. In April 1999, when Newmark learned of other organizations called "List Foundation", the use of this name was dropped. Craigslist incorporated as a private for-profit company in 1999. Around the time of these events, Newmark realized that the site was growing so fast that he could stop working as a software engineer and work full-time running Craigslist. By April 2000, there were nine employees working out of Newmark's apartment in San Francisco.

In January 2000, current CEO Jim Buckmaster joined the company as lead programmer and CTO. Buckmaster contributed the site's multi-city architecture, search engine, discussion forums, flagging system, self-posting process, homepage design, personals categories, and best-of-Craigslist feature. He was promoted to CEO in November 2000.

The web site expanded into nine more U.S. cities in 2000, four in 2001 and 2002 each, and 14 in 2003. On August 1, 2004, Craigslist began charging $25 to post job openings on the New York and Los Angeles pages. On the same day, a new section called "Gigs" was added, where low-cost and unpaid jobs and internships can be posted free.

The site serves over 20 billion page views per month, putting it in 37th place overall among web sites worldwide and 10th place overall among web sites in the United States (per Alexa.com on March 24, 2011), with over 49.4 million unique monthly visitors in the United States alone (per Compete.com on January 8, 2010). With over 80 million new classified advertisements each month, Craigslist is the leading classifieds service in any medium. The site receives over 2 million new job listings each month, making it one of the top job boards in the world. The classified advertisements range from traditional buy/sell ads and community announcements to personal ads.

Craigslist's main source of revenue is paid job ads in select cities – $75 per ad for the San Francisco Bay Area; $25 per ad for New York City, Los Angeles, San Diego, Boston, Seattle, Washington, D.C., Chicago, Philadelphia, Orange County (California) and Portland, Oregon – and paid broker apartment listings in New York City ($10 per ad).

The company does not formally disclose financial or ownership information. Analysts and commentators have reported varying figures for its annual revenue, ranging from $10 million in 2004, $20 million in 2005, and $25 million in 2006 to possibly $150 million in 2007.

On August 13, 2004, Newmark announced on his blog that auction giant eBay had purchased a 25% stake in the company from a former principal. Some fans of Craigslist expressed concern that this development would affect the site's longtime non-commercial nature, but it remains to be seen what ramifications the change will actually have. As of May 2011, there had been no substantive changes to the usefulness or non-advertising nature of the site: no banner ads, and charges only for a few services provided to businesses.

The company is believed to be owned principally by Newmark, Buckmaster, and eBay (the three board members). eBay owns approximately 25%, and Newmark is believed to own the largest stake.

In April 2008, eBay announced it was suing Craigslist to "safeguard its four-year financial investment." eBay claimed that in January 2008, Craigslist executives took actions that "unfairly diluted eBay's economic interest by more than 10%." In response, Craigslist filed a counter-suit against eBay in May 2008 "to remedy the substantial and ongoing harm to fair competition" that Craigslist claims is constituted by eBay's actions as a Craigslist shareholder.

The site is notable for having undergone only minor design changes since its inception; even by 1996 standards, the design is very simple. Since 2001, the site design has remained virtually unchanged, and as of April 2010, Craigslist continues to avoid using images and uses only minimal CSS and JavaScript, a design philosophy common in the late 1990s but almost unheard of today for a major website.

Newmark says that Craigslist works because it gives people a voice, a sense of community trust and even intimacy. Other factors he cites are consistency of down-to-earth values, customer service and simplicity. Newmark was approached with an offer for running banner ads on Craigslist, but he decided to decline. In 2002, Craigslist staff posted mock-banner ads throughout the site as an April Fools' Day joke.

In March 2008, Spanish, French, Italian, German, and Portuguese became the first non-English languages supported.

Craigslist has a user flagging system to quickly identify illegal and inappropriate postings. Classified ad flagging does not require account log-in or registration and can be done anonymously by any visitor. When a certain number of users flag a posting, it is removed. The number of flaggings required for a posting's removal is variable and remains unknown to all but craigslist.org. Items are flagged in three categories: misplaced, prohibited, or spam/overpost. Although users are given a short description of each flagging category, users ultimately flag based on their preference, prejudice, or misunderstanding of the Craigslist Terms of Use. Flaggings can also occur as acts of disruptive vandalism or to remove competitors' postings. To better understand and clarify flagging, it is up to users to define rules themselves in places such as the Unofficial Flagging FAQ and the flag help forum. The Flag Help Forum is an unmoderated volunteer community; it is not staffed by Craigslist employees and is not affiliated with craigslist.org. The forum volunteers have no access to information about craigslist.org user accounts or ads, and must rely upon information supplied by the ad poster to try to piece together the reason an ad was flagged and removed. The Flag Help Forum's unmoderated format allows anyone to post anonymously and without accountability, and the forum's usefulness and effectiveness can be compromised by people who post malicious replies to help threads.

Over the years, Craigslist has become a very popular online destination for arranging dates and sex. The personals section allows for postings that are for "strictly platonic", "dating/romance", and "sex".

The site has been found to be particularly appealing for connecting lesbians and gay men with one another because of its free and open nature, and because some users otherwise find it hard to meet gay people in their area.

In 2005, San Francisco Craigslist's men-seeking-men section was cited as facilitating sexual encounters and was the second most common correlate of syphilis infections. The company was pressured by San Francisco Department of Public Health officials, leading Jim Buckmaster to state that the site has a very small staff and that the public must police itself. The company has, however, added links to the San Francisco City Clinic and STD forums.

Advertisements for "adult" (previously "erotic") services were initially given special treatment, then closed entirely on September 4, 2010, following a controversy over claims by state attorneys general that the advertisements promoted prostitution.

In 2002, a disclaimer was put on the "men seeking men", "casual encounters", "erotic services", and "rants and raves" boards to ensure that those who clicked on these sections were over the age of 18, but no disclaimer was put on the "men seeking women", "women seeking men" or "women seeking women" boards. As a response to charges of discrimination and negative stereotyping, Buckmaster explained that the company's policy is a response to user feedback requesting the warning on the more sexually explicit sections, including "men seeking men." Today, all of the above listed boards (as well as some others) have a disclaimer.

On May 13, 2009, Craigslist announced that it would close the erotic services section, replacing it with an adult services section to be reviewed by Craigslist employees. This decision came after allegations by several U.S. states that the erotic services ads were being used for prostitution. Postings to the new category cost $10 and could be renewed for $5.

On September 4, 2010, Craigslist closed the adult services section of its website in the United States. The site initially replaced the adult services page link with the word "censored" in white-on-black text. The site received criticism and complaints from attorneys general that the section's ads were facilitating prostitution and child sex trafficking.

The adult services section link was still active in countries outside of the U.S. Matt Zimmerman, senior staff attorney for the Electronic Frontier Foundation, said, "Craigslist isn't legally culpable for these posts, but the public pressure has increased and Craigslist is a small company." Brian Carver, attorney and assistant professor at UC Berkeley, said that legal threats could have a chilling effect on online expression: "If you impose liability on Craigslist, YouTube and Facebook for anything their users do, then they're not going to take chances. It would likely result in the takedown of what might otherwise be perfectly legitimate free expression."

On September 8, 2010, the "censored" label and its dead link to adult services were completely removed.

Craigslist announced on September 15, 2010, that it had closed its adult services section in the United States for good. However, it defended its right to carry such ads and its efforts to fight prostitution and sex trafficking. Free speech and some sex crime victim advocates criticized the removal of the section, saying that it threatened free speech and that it diminished law enforcement's ability to track criminals. However, the removal was applauded by many state attorneys general and some other groups fighting sex crimes. Craigslist said that there is some indication that those who posted ads in the adult services section are posting elsewhere. Sex ads cost $10 initially, and it was estimated they would have brought in $44 million that year had they continued. In the four months following the closure, monthly revenue from sex ads on six other sites (primarily backpage.com) increased from $2.1 million to $3.1 million, partly due to price increases.

On December 19, 2010, after pressure from Ottawa and several provinces, Craigslist removed the "Erotic Services" and "Adult Gigs" sections from its Canadian website.

In 2001, the company started the Craigslist Foundation, a 501(c)(3) nonprofit organization that offers free and low-cost events and online resources to promote community building at all levels. It accepts charitable donations and, rather than directly funding organizations, produces "face-to-face events and offers online resources to help grassroots organizations get off the ground and contribute real value to the community".

Since 2004, the Craigslist Foundation has hosted an annual conference called Boot Camp, an in-person event that focuses on skills for connecting, motivating and inspiring greater community involvement and impact. Boot Camp has drawn more than 10,000 people since its inception. The latest Boot Camp event was held on Saturday, August 14, 2010.

The Craigslist Foundation is also the fiscal sponsor for Our Good Works, the organization that manages AllforGood.org, an application that distributes volunteer opportunities across the web and helps people get involved in their communities.

Joomla | Understanding and Definition of Joomla

Joomla! is a free and open source content management system (CMS) for publishing content on the World Wide Web and intranets. It comprises a model–view–controller (MVC) Web application framework that can also be used independently.

Joomla! is written in PHP, uses object-oriented programming (OOP) techniques and software design patterns, stores data in a MySQL database, and includes features such as page caching, RSS feeds, printable versions of pages, news flashes, blogs, polls, search, and support for language internationalization.

Within its first year of release, Joomla had been downloaded 2.5 million times. Between March 2007 and February 2011 there had been more than 21 million downloads. There are over 7,400 free and commercial extensions available from the official Joomla! Extension Directory and more available from other sources.

Joomla! was the result of a fork of Mambo on August 17, 2005. At that time, the Mambo name was trademarked by Miro International Pvt Ltd, which formed a non-profit foundation with the stated purpose of funding the project and protecting it from lawsuits. The Joomla development team claimed that many of the provisions of the foundation structure went against previous agreements made by the elected Mambo Steering Committee, lacked the necessary consultation with key stakeholders, and included provisions that violated core open source values.

The Joomla development team created a web site called OpenSourceMatters.org to distribute information to users, developers, web designers and the community in general. Project leader Andrew Eddie wrote a letter which appeared on the announcements section of the public forum at mamboserver.com.

A little more than one thousand people joined the OpenSourceMatters.org web site within a day, most posting words of encouragement and support, and the web site experienced the Slashdot effect as a result. Miro CEO Peter Lamont gave a public response to the development team in an article titled "The Mambo Open Source Controversy - 20 Questions With Miro". This event created controversy within the free software community about the definition of "open source". Forums at many other open source projects were active with postings for and against the actions of both sides.

In the two weeks following Eddie's announcement, teams were re-organized and the community continued to grow. Eben Moglen and the Software Freedom Law Center (SFLC) assisted the Joomla! core team beginning in August 2005, as indicated by Moglen's blog entry from that date and a related OSM announcement. The SFLC continues to provide legal guidance to the Joomla! project.

On August 18, 2005, Andrew Eddie called for community input on suggested names for the project. The core team indicated that it would make the final decision for the project name based on community input. The core team eventually chose a name that was not on the list of suggested names provided by the community.

On September 1, 2005 the new name, “Joomla!,” was announced. It is the anglicised spelling of the Swahili word jumla meaning “all together” or “as a whole.”

On September 6, 2005, the development team called for logo submissions from the community and invited the community to vote on the logo preferred; the team announced the community's decision on September 22, 2005. Following the logo selection, brand guidelines, a brand manual, and a set of logo resources were then published on October 2, 2005 for the community's use.

Joomla won the Packt Publishing Open Source Content Management System Award in both 2006 and 2007.

On October 27, 2008, Packt Publishing named Johan Janssens the "Most Valued Person" (MVP) for his work as one of the lead developers of the Joomla! 1.5 Framework and Architecture. In 2009, Louis Landry received the "Most Valued Person" award for his role as Joomla! architect and development coordinator.

Joomla! 1.0.0 was released on September 16, 2005. It was a rebranded release of Mambo 4.5.2.3 combined with bug fixes and moderate-level security fixes.

Joomla! version 1.5 was released on January 22, 2008, the most recent release (April 4, 2011) being 1.5.23.

Joomla 1.6.0 was released on January 10, 2011. This version adds full access control list functionality, user-defined category hierarchies, and admin interface improvements.

Joomla 1.7.0 is planned for release in July 2011, six months after 1.6.0.

Joomla can be installed manually from source code on a system running a web server which supports PHP applications, from a package management system or using a TurnKey Joomla appliance which comprises the application and its dependencies as a ready-to-use system.

There are numerous web hosting companies that provide a control panel which automates the deployment of a basic Joomla web site.

Joomla can also be installed via the Microsoft Web Platform Installer which installs the software on Windows and IIS. The Web PI will automatically detect any missing dependencies such as PHP or MySQL then install and configure them before installing Joomla.
