Thursday, November 27, 2008

mIRC MP3

EFNET
#indomp3z

Friday, November 14, 2008

Diana Krall "Temptation"

Rusted brandy in a diamond glass
Everything is made from dreams
Time is made from honey slow and sweet
Only the fools know what it means

Temptation, temptation, temptation

Oh, temptation, temptation, I can't resist
Well I know that she is made of smoke
But I've lost my way
He knows that I am broke
But I must play

Temptation, oh temptation, temptation, I can't resist

Dutch pink and Italian blue
He is there waiting for you
My will has disappeared
Now confusion is so clear

Temptation, temptation, temptation I can't resist
Temptation, temptation, temptation I can't resist

Thursday, November 13, 2008

Thomas Beatie, a married man who used to be a woman, is pregnant with a baby girl


A married man who used to be a woman says that he is pregnant and will give birth to a baby girl in July.

“How does it feel to be a pregnant man? Incredible,” wrote Thomas Beatie, 34, from the Pacific North West of the United States, in the latest issue of the gay magazine The Advocate.

“Despite the fact that my belly is growing with a new life inside me, I am stable and confident being the man that I am.”

Mr Beatie was born female, named Tracy Lagondino, but had gender reassignment surgery and is now legally male and married to a woman.

He decided to carry a baby for his wife, Nancy, because she had a hysterectomy years ago. He was able to get pregnant because he kept his female organs when he switched genders.

“Sterilisation is not a requirement for sex reassignment, so I decided to have chest reconstruction and testosterone therapy but kept my reproductive rights,” he writes. “Wanting to have a biological child is neither a male nor female desire but a human desire.” The couple, who have been together for ten years, run a custom screenprinting business in Bend, Oregon, where neighbours do not know that Mr Beatie was once a woman.

“Our desire to work hard, buy our first home and start a family was nothing out of the ordinary. That is, until we decided that I would carry our child,” he wrote.

Before becoming pregnant, Mr Beatie stopped the testosterone injections he was receiving as part of his gender reassignment. “It had been roughly eight years since I had my last menstrual cycle so this wasn’t a decision that I took lightly. My body regulated itself after about four months and I didn’t have to take any exogenous oestrogen, progesterone or fertility drugs to aid my pregnancy,” he wrote.

The couple bought donor vials from a cryogenic sperm bank and, facing resistance and prejudice from doctors, resorted to home insemination. “Doctors have discriminated against us, turning us away due to their religious beliefs. Healthcare professionals have refused to call me by a male pronoun or recognise Nancy as my wife. Receptionists have laughed at us. Friends and family have been unsupportive; most of Nancy’s family doesn’t even know I’m transgender,” he said.

Mr Beatie’s first successful insemination ended in a life-threatening ectopic pregnancy with triplets that required surgery, resulting in the loss of all his embryos and his right Fallopian tube. “When my brother found out about my loss, he said, ‘It’s a good thing that happened. Who knows what kind of monster it would have been?’,” he wrote.

The second pregnancy resulted in a baby girl who is due to be born on July 3. “I will be my daughter’s father, and Nancy will be her mother. We will be a family,” he wrote.

Mr Beatie would not be the first transgender man to give birth, according to Lisa Masterson, an obstetrician at Cedars-Sinai Medical Centre in Los Angeles.

“A transgender man can be pregnant because he has the same organs as a woman,” Dr Masterson said on the ABC Good Morning America show.

Dr Masterson said, however, that transgendered men face special health risks resulting from their sex change. “It’s really important that he doesn’t take any testosterone early on in the pregnancy and later on,” she said. “That can cause male-type characteristics in the female baby.”

Some of the Beaties’ neighbours in Bend voiced scepticism about the pregnancy claim. One resident, Josh Love, told ABC: “I couldn’t say that he looks pregnant. I can stick my stomach out and almost make it look like that. I think it’s kind of bizarre. I don’t know if I believe it or not.”

The Advocate said it had confirmed the story with Mr Beatie’s doctor.

http://www.timesonline.co.uk/tol/news/world/us_and_americas/article3628860.ece

Monday, November 10, 2008

If You Go Away

download here

Words & Music by Jacques Brel (English translation by Rod McKuen)
Recorded by Frank Sinatra, 1969


Am
If you go away on this summer day

Dm
Then you might as well take the sun away;

G7
All the birds that flew in the summer sky

C9 C C9 C
When our love was new and our hearts were high;

Dm6 E7
When the day was young and the night was long

Am
And the moon stood still for the nightbird's song,

Dm7 E7 Dm6 Am
If you go away, if you go away, if you go away.

Am7 Am6
But if you stay, I'll make you a day

E7-9 Am
Like no day has been or will be again;

Am7 Am6
We'll sail on the sun, we'll ride on the rain,

G7 C
We'll talk to the trees and worship the wind.

E7 Am Bb
Then if you go, I'll understand,

Dm6 E7 Am
Leave me just enough love to fill up my hand,

C Dm Dm6 E7 D6 E7
If you go away, if you go away, if you go away.



Am
If you go away, as I know you will,

Dm Dm+7
You must tell the world to stop turning til

G7
You return again, if you ever do,

C
For what good is love without loving you?

Dm6 E7
Can I tell you now, as you turn to go,

Am
I'll be dying slow til your next hello,

Dm7 E7 Dm6 Am
If you go away, if you go away, if you go away.

Am7 Am6
But if you stay, I'll make you a night

E7-9 Am
Like no night has been, or will be again;

Am7 Am6
I'll sail on your smile, I'll ride on your touch,

G7 C
I'll talk to your eyes that I love so much.

E7 Am Bb
But if you go, no, I won't cry,

Dm6 E7 Am
Though the good is gone from the word good-bye.

C Dm Dm6 E7 D6 E7
If you go away, if you go away, if you go away.


Am
If you go away, as I know you must,

Dm
There'll be nothing left in the world to trust,

G7
Just an empty room, full of empty space,

C
Like the empty look I see on your face;

Dm6 E7
I'd have been your shadow if I thought it might

Am
Have kept me here, by your side;

C Dm E7-9 Am
If you go away, if you go away . . . please don't go away.

Sunday, November 09, 2008

Web 2.0 Tools Demand A Cautious Approach

As companies look to cut costs and manage projects involving far-flung staff, many are investigating wikis, file-sharing services, and other consumer technologies to deliver Web-based collaboration inexpensively.
Bringing these tools into a corporate environment presents thorny issues, however. Chief among them is security: IT is justifiably wary of giving users privileges such as directly editing Web content or uploading files when their companies lack the technology and policies to enable safe Web 2.0 use.
But are there cost-effective collaboration tools that you don't have to just say no to? In this Rolling Review, we'll find out. To get a good look at a cross section of available apps, we invited a range of vendors, from major players like Microsoft, Novell, and Google to smaller companies such as Central Desktop, Socialtext, and CallWave. As we test these tools at our Phil Hippensteel Associates partner labs, we'll evaluate how well they address these key considerations:

• Who has the data?
Trade secrets, customer lists, and competitive intelligence must be carefully guarded. Violations of regulations and privacy laws are always a concern when data is in the hands of others. Whoever controls the data will be responsible for it and will be held accountable for any data that might be evidence in court cases.

• How secure is it?
Collaboration services track the progress of projects while integrating e-mail, schedules, and new contacts with local databases. These tools require users to send and receive files, images, and JavaScript on corporate computers, possibly from peer devices that are outside IT's control.
Antivirus and intrusion-detection software can spot inappropriate e-mail attachments, but finding malware embedded in HTTP traffic is more difficult.

• What are the costs?
Web 2.0 collaboration tools usually have a monthly or annual fee and relatively low up-front costs. Additional expenses are likely to occur once the tool is in use. For example, if you configure the components of a large project only to discover that your design is bulky and difficult to use, you'll incur the cost of the time needed to redesign the collaboration. It can be difficult to correct these problems once data has been stored. Our Rolling Review will look at the products' flexibility to avoid this pitfall. Training costs are a related issue. As we test these tools we'll consider whether they are easy to learn and use.

• How's the support?
Collaboration tools and services have to play well with other applications like ERP systems and desktop office software. Quick responses to integration questions will be crucial. This may be especially true for collaboration products and services because they may not have been developed with integration as a high priority.

Saturday, November 08, 2008

Vulnerability Scanning Web 2.0 Client-Side Components

Introduction

Web 2.0 applications combine several technologies, such as Asynchronous JavaScript and XML (AJAX), Flash, JavaScript Object Notation (JSON), Simple Object Access Protocol (SOAP), and Representational State Transfer (REST). All of these technologies, along with cross-domain information access, contribute to the complexity of the application. We are also seeing a shift toward empowering the end-user's browser by loading script libraries into it.

All these changes mean new scanning challenges for tools and professionals. The key learning objectives of this article are to understand the following concepts and techniques:

  • Scanning complexity and challenges in new generation Web applications
  • Web 2.0 client-side scanning objectives and methodology
  • Web 2.0 vulnerability detection (XSS in RSS feeds)
  • Cross-domain injection with JSON
  • Countermeasures and defense through browser-side filtering

Web 2.0 scanning complexities

The next generation of Web 2.0 applications is very complex in nature and poses new scanning challenges. The complexities can be attributed to the following factors:

  • Rich client interface - AJAX and Flash provide rich interfaces to applications, with complex JavaScript and ActionScript code making it difficult to identify application logic and critical resources buried in these scripts.
  • Information sources - Applications are consuming information from various sources and building up mashups [ref 1] within sites. An application aggregates RSS feeds or blogs from different locations and builds a large repository of information at a single location.
  • Data structures - Exchange of data between applications is done using XML, JSON [ref 2], JavaScript arrays and proprietary structures.
  • Protocols - Aside from the simple HTTP GET and POST, applications can choose from an array of different protocols such as SOAP, REST and XML-RPC.

Our target application may be accessing RSS feeds from multiple sites, exchanging information with blogs using JSON, and communicating with a stock exchange portal's Web service over SOAP. All these services are bundled in the form of Rich Internet Applications (RIA) using AJAX and/or Flash.
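To make the data-structures point concrete, here is a minimal sketch of one record traveling as JSON and as XML and being parsed in the browser. JSON.parse is assumed to be available; older code often fed such strings to eval(), which is exactly what makes untrusted JSON dangerous, as discussed later in this article.

// The same profile record in two of the exchange formats named above.
var asJson = '{"profile":{"name":"John"}}';
var fromJson = JSON.parse(asJson);   // structural parse; the older eval('(' + asJson + ')') idiom executes whatever arrives

var asXml = '<profile><name>John</name></profile>';
var fromXml = new DOMParser().parseFromString(asXml, "text/xml");

console.log(fromJson.profile.name);                                          // "John"
console.log(fromXml.getElementsByTagName("name")[0].firstChild.nodeValue);   // "John"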

Web 2.0 application scanning challenges

Application scanning challenges can be divided into two parts:

  1. Scanning server-side application components - One of the biggest challenges when scanning Web 2.0 applications is to identify buried resources on the server. When scanning traditional applications, a crawler can be run that would look for the string "href" in order to identify and profile Web application assets.

    In the case of Web 2.0 applications, however, one needs to identify backend Web services, third-party mashups, backend proxies, etc. The author has addressed some of these challenges in a previous article [ref 3].
  2. Scanning client-side application components - A Web 2.0 application can load several JavaScript files, Flash components, and widgets in the browser. These scripts and components utilize the XMLHTTPRequest object to communicate with the backend Web server. It is also possible to access cross-domain information from within the browser itself. Cross-site scripting (XSS) attacks [ref 4] are potential threats to the application user. The Web 2.0 framework uses various client-side scripts and consumes information from untrusted third-party sources. AJAX and JSON technologies, cross-domain access and dynamic DOM manipulation techniques are adding new dimensions to old XSS attacks [ref 5]. Client-side component scanning and vulnerability detection in Web 2.0 are new challenges on the horizon. The scope of this article is restricted to this scanning category.

Client-side scanning objectives

To understand these scanning objectives clearly, let us take a sample scenario as illustrated in Figure 1.0. We have a Web application running on example.com. Clients access this application via a Web browser.

Figure 1. Web 2.0 target application layout.

This web application can be divided into the following sections with regard to their usage and logic.

Application resources - These resources are deployed by example.com and can be of any type: HTML, ASP/JSP, or Web services. All of these resources are in a fully trusted domain and are owned by example.com.

Feed proxy - The XMLHTTPRequest object cannot make direct calls to cross-domain backends. To circumvent this restriction, a proxy is set up by example.com that gives access to third-party RSS feeds, for example a daily news feed. Hence, users of example.com can set up any feed on the Internet for daily use.

Blog access - End-users use the same application loaded by example.com to access some of the blogs on the Internet. This is possible because example.com loads certain scripts on the client's browser that allow users to access cross-domain blogs.

Here are four critical scanning objectives to determine client-side vulnerabilities.

  1. Technology and library fingerprinting - Web 2.0 applications can be built with any of several AJAX and Flash libraries. These libraries get loaded in the browser and are used by the application as and when needed. It is important to fingerprint these libraries and map them to publicly known vulnerabilities.
  2. Third-party untrusted information points - In Figure 1, we have divided the Web application layout into "trusted" and "untrusted" areas. Information originating from untrusted sources needs thorough scrutiny prior to loading it in the browser. In our example this information flows via an application server proxy in the case of news feeds, and directly into the DOM in the case of blogs.
  3. DOM access points - The browser runs everything in its DOM context. Loaded JavaScripts manipulate the DOM. If malicious information is passed to any one of these access points the browser can be at risk. DOM access points are therefore essential bits of information.
  4. Functions and variable traces for vulnerability detection - Once DOM access points and third-party information has been identified, it is important to understand execution logic and corresponding traces in the browser in order to expose threats and vulnerabilities.

Scanning client-side applications [news feeds]

In this section we shall adopt a manual approach to the scanning process. This methodology can be automated to some extent but given the complexity of the application it may be difficult to scan for all possible combinations.

The target resource - http://example.com/rss/news.aspx

We get the following page, as shown below in Figure 2.

Figure 2. RSS feed application widget.

The above page serves various RSS feeds configured by the end-user. Now let's walk through the steps we require for scanning.

1. Scanning for technology and fingerprints

All possible JavaScripts consumed by the browser after loading the page can be grabbed from the HTML page itself by viewing the HTML source, as listed in Figure 3, or programmatically using regular expressions.
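As a hedged illustration of the programmatic route, the following sketch pulls external script names out of the page source with a regular expression; pageSource is assumed to hold the fetched HTML:

// Extract external script file names from page source to fingerprint
// libraries such as dojo.js; pageSource holds the target page's HTML.
var tags = pageSource.match(/<script[^>]*src=["'][^"']+["']/gi) || [];
for (var i = 0; i < tags.length; i++) {
    var src = tags[i].replace(/^.*src=["']([^"']+)["'].*$/i, "$1");
    console.log(src);   // e.g. "dojo.js", "rss_xml_parser.js", "XMLHTTPReq.js"
}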

Figure 3. All JavaScripts for the application page.

If you have the Firefox plugin "Web Developer" [ref 6], you can view all scripts in a single page as shown below in Figure 4.

Figure 4. All JavaScripts along with source code.

The following information can be identified by scanning these JavaScripts:

  • One of the AJAX development toolkits dojo.js [ref 7] is being used. File names provide vital clues when fingerprinting these technologies. We can scan the content further to determine the version in use. A similar technique can also be employed to fingerprint Microsoft's Atlas and many other technologies. This information helps in mapping known vulnerabilities to underlying architecture.
  • Files containing functions that are consumed by an RSS feed application can be mapped within the browser. Here is a brief list:
    • The rss_xml_parser.js file contains functions such as processRSS() and GetRSS(). These functions fetch RSS feeds from the server and process them.
    • The XMLHTTPReq.js file contains makeGET() and makePOST() functions to process AJAX requests (see the sketch after this list).
    • The dojo.js file contains several other functions.
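The helper bodies themselves are not reproduced in this article; as a hedged sketch, a makeGET()-style wrapper plausibly looks like this:

// Hypothetical sketch of a makeGET()-style AJAX helper; the actual
// XMLHTTPReq.js implementation is not shown here.
function makeGET(url, callback) {
    var xhr = window.XMLHttpRequest
        ? new XMLHttpRequest()
        : new ActiveXObject("Microsoft.XMLHTTP");   // legacy IE fallback
    xhr.open("GET", url, true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            callback(xhr.responseXML);   // e.g. handed to processRSS()
        }
    };
    xhr.send(null);
}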

All this enumerated information can be organized to obtain a better picture of the process.

2. Third-party untrusted information points

We scan the HTML source for the page and locate the code that triggers the feed fetch.
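A representative sketch of that call, assuming GetRSS() takes a target div id and a proxied feed URL (both values here are hypothetical):

<script type="text/javascript">
// Fetch feeds through the example.com proxy and render them into the "news" div.
GetRSS('news', '/rss/proxy.aspx?feed=http://news.example.net/rss.xml');
</script>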

This code calls the function GetRSS(), which in turn makes a request to the proxy to fetch untrusted RSS feeds from various servers.

3. DOM access points

Having collected all JavaScripts, we can look for certain patterns where the DOM gets accessed. We look for "document.*" usage and then narrow down the search to two potential candidates:

document.getElementById(name).innerHTML - This property is used extensively by applications to change HTML layers dynamically.

document.write() - This function is also used to change HTML layers in the browser.

There may be a few other calls which transform DOM views in the browser. At this point, however, we shall focus on the two preceding functions. In the source, scroll down to the following function where "innerHTML" is used.

function processRSS(divname, response) {
    var html = "";
    var doc = response.documentElement;
    var items = doc.getElementsByTagName('item');
    // Loop reconstructed from the mangled listing; the variable names
    // (title, link, target) are taken from the surviving fragment.
    for (var i = 0; i < items.length; i++) {
        var title = items[i].getElementsByTagName('title')[0];
        var link = items[i].getElementsByTagName('link')[0];
        html += "<a href='" + link.firstChild.nodeValue + "'>" + title.firstChild.nodeValue + "</a><br>";
    }
    var target = document.getElementById(divname);
    target.innerHTML = html;
}
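Locating such sinks across all the collected scripts can itself be automated; a minimal sketch, assuming the gathered .js sources are concatenated into a hypothetical scriptSource string:

// Grep the collected client-side code for the DOM sinks discussed above.
var sinkPattern = /document\.write\s*\(|\.innerHTML\s*=/g;
var hits = scriptSource.match(sinkPattern) || [];
console.log(hits.length + " potential DOM access points found");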

4. Functions and variable traces for vulnerability detection

We can organize all information gathered from the preceding three steps and apply debugging techniques [ref 8] to determine the entire flow of client-side logic. This is illustrated in Figure 5.

Figure 5. Execution logic and data flow for news feed function.


From this logic it is clear that the feed proxy filters out certain characters, such as < and >, effectively making it impossible to inject JavaScript into the DOM. And since execution only assigns to "innerHTML" in a preloaded DOM, injected script tags would not execute either. Observe closely the following line from the loop that builds HTML dynamically:

html += ""

What if an untrusted RSS feed injects a malicious link? As is evident from the code, the link is not validated anywhere. Here is an example of such an RSS node:

<item>
  <title>XYZ news</title>
  <link>javascript:alert("Simple XSS")</link>
  <dc:date>2005-11-16T16:00:00-08:00</dc:date>
</item>

Note that the "href" element of XML contains JavaScript. Hence, when an end-user
clicks the link the script will run in the current DOM context. This process is
illustrated below in Figure 6.

Figure 6. XSS with link.

Information presentation in the DOM from an untrusted source may put an end-user's session at risk. Web 2.0 applications provide information from different sources in a single browser page.


Let's take another example to understand this attack vector clearly.

Cross-domain JSON injection

JSON is a very lightweight structure for information exchange, compared with XML and its sizeable overhead. Many providers, such as Google and Yahoo, have extended their Web services with JSON callbacks. With such a callback in place, cross-domain information can be captured and processed by a specific function.

Let's take an example. Assume that a web site running at http://blog.example.org has extended its services using a JSON callback, allowing anyone to access information using JavaScript. A JSON structure wrapped in a callback name will be provided.

Access the profile for id=10 by pointing the browser at, or sending a GET request to, the following location:

Request:
http://blog.example.org/Getprofile.html?callback=profileCallback&id=10

Response:
profileCallback({"profile":[{"name":"John","email":"john@my.com"}]})

As you can see, we get the JSON wrapped in profileCallback. When this response is evaluated, the profileCallback() function gets executed with the JSON output as its argument.


Similarly, if we send id=11, we get the following response back:

profileCallback({"profile":[{"name":"Smith","email":"smith@cool.com"}]})


Our target example.com has integrated this blog service into its application. If we scan its client-side code and look for document.write, we find the relevant code in one of its pages, showprofile.html.
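The exact snippet is not reproduced in this copy; a representative sketch of its evident behavior, with the callback name and URL as given above and the rest assumed:

<script type="text/javascript">
// Write the fetched blog profile straight into the page -- no validation at all.
function profileCallback(result) {
    document.write(result.profile[0].name + " | " + result.profile[0].email);
}
</script>
<!-- Cross-domain fetch: the JSON response executes as script in this page's context -->
<script src="http://blog.example.org/Getprofile.html?callback=profileCallback&id=10"></script>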

The code makes a call to "blog.example.org", which is a cross-domain request, and processes the output in "profileCallback". By calling the page we get the following output, shown below in Figure 7.


Figure 7. Simple JSON callback.

It is clear that this application running on example.com pulls untrusted third-party JSON into the application without any validation. This can expose an end-user's sensitive data, such as session and cookie information.

This means that if a client looks up the profile of, say, id=101, whose email field has had JavaScript injected into it, along the lines of the representative payload shown below, a remote malicious script will run on the client's machine instead.

profileCallback({"profile":[{"name":"Jack","email":""}]})


Simply put, the victim's machine can be compromised while the victim browses example.com's application, as shown in Figure 8.

Figure 8. Possible XSS with JSON injection.

In this way, untrusted information processed inside an AJAX application's virtual sandbox can be exploited through poorly written client-side scripts or components. This article has covered two ways of passing payloads to the end-user; a few other methods exist as well.

Countermeasures

To protect client-side browsers it is important to follow a re-worded maxim: "trust no third-party information." In the design phase of the application, one needs to clearly define a virtual application sandbox and validate incoming information from all third-party sources appearing in the form of XML, RSS feeds, JSON, JavaScript arrays, etc., as shown below in Figure 9.


Figure 9. Virtual application sandbox to validate third-party information sources.



For example, prior to posting HREFs to the DOM, pass them to a simple function, such as the one shown in the code snippet below, to look for any JavaScript code injection.

function checkLink(link) {
    // Reject links that could smuggle script into the DOM: javascript:
    // pseudo-URLs or raw angle brackets. The /i flag also catches
    // mixed-case evasions such as "JavaScript:".
    if (link.match(/javascript:|<|>/i)) {
        return false;   // suspicious - do not post this href to the DOM
    } else {
        return true;
    }
}
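Inside the processRSS() loop shown earlier, such a check would sit just before the anchor is assembled; a minimal sketch:

// Only emit the anchor when the untrusted feed link passes the filter.
if (checkLink(link.firstChild.nodeValue)) {
    html += "<a href='" + link.firstChild.nodeValue + "'>" + title.firstChild.nodeValue + "</a><br>";
} else {
    html += title.firstChild.nodeValue + "<br>";   // drop the suspicious link
}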

It is important to filter all incoming traffic before it hits the DOM in a browser-side application sandbox - a final defense for the end-user.
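For text that must be written into the page, such as the JSON email field above, escaping markup before it reaches the DOM is the analogous filter. A minimal sketch, assuming a hypothetical "profile" div as the target:

// Escape untrusted text so injected tags render inert instead of executing.
function sanitizeField(value) {
    return String(value)
        .replace(/&/g, "&amp;")
        .replace(/</g, "&lt;")
        .replace(/>/g, "&gt;")
        .replace(/"/g, "&quot;");
}

// A safer profileCallback(): the payload from Figure 8 now displays as inert text.
function profileCallback(result) {
    var p = result.profile[0];
    document.getElementById("profile").innerHTML =
        sanitizeField(p.name) + " | " + sanitizeField(p.email);
}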

Conclusion

In recent times, several incidents [ref 9] have been observed in which Web 2.0 applications were compromised at the client's end due to poorly written scripts. Future automated and manual scanning techniques will have to equip their engines with strong client-side, DOM-related vulnerability detection mechanisms. Complete automated scanning of Web 2.0 applications may remain a challenge, but it is one that can be surmounted by combining automated scanning with human intelligence. Older attack vectors such as Cross-Site Request Forgery (XSRF) [ref 10] are also being looked at afresh in this era of Web 2.0. XSS, XSRF and other client-side attacks feature in several vulnerabilities and advisories for new-generation Web applications.

Web 2.0 application assessment needs special attention devoted to client-side attack vectors, with the purpose of mitigating risks at the client end. This article has sought to throw light on some of the attack vectors and scanning techniques used to identify vulnerable applications. Scratch the surface, and many applications will be revealed that are vulnerable to this range of attack vectors and that can be exploited by application-layer attackers, viruses and worms.

References

[ref 1] Brief on mashup (http://en.wikipedia.org/wiki/Mashup_(web_application_hybrid))
[ref 2] JavaScript Object Notation (JSON) is a lightweight data-interchange format (http://www.json.org/)
[ref 3] Hacking Web 2.0 Applications with Firefox (http://www.securityfocus.com/infocus/1879)
[ref 4] XSS threat classification (http://www.webappsec.org/projects/threat/classes/cross-site_scripting.shtml)
[ref 5] DOM Based Cross Site Scripting or XSS of the Third Kind - By Amit Klein (http://www.webappsec.org/projects/articles/071105.shtml)
[ref 6] Web developer plugin (http://chrispederick.com/work/webdeveloper/)
[ref 7] Dojo toolkit (http://www.dojotoolkit.com/)
[ref 8] JSON callback (http://developer.yahoo.com/common/json.html)
[ref 9] The Web Hacking Incidents Database (http://www.webappsec.org/projects/whid/)
[ref 10] Cross-Site Reference Forgery - By Jesse Burns (http://www.isecpartners.com/documents/XSRF_Paper.pdf)