Don't judge a book by its title (yes, title, not cover). It's a saying that couldn't be further from the truth, at least when you're posting online. A proper headline can dictate your success, so here are a few ideas on how to construct a good title:
The Truth About _________________________
Example1: The Truth About Hearing Devices
Example2: The Truth About Hell
_______ You Should ____________
Example1: 6 Foods You Should Avoid Giving To Your Baby
Example2: 5 Coffee Franchises You Should Invest In Immediately
The Warning Signs of _____________
Example1: The Warning Signs of Gout
Example2: The 6 Warning Signs of a Stroke
The Amazing Secrets of _________________
Example1: The Amazing Secrets of Trout Fishing
Example2: The Amazing Secrets of the Best Disability Lawyers
How To _____ Like The (pros/insiders/experts)
Example1: How To Cook Fish Like a World Class Chef
Example2: How To Feel Like The Incredible Hulk
A _______'s Guide To _________
Example1: A Knitter's Guide To 10-Minute Scarves
Example2: A Man's Guide To Long Term Relationships
The Joy of ___________
Example1: The Joy of a Clutter-Free Home
Example2: The Joy of Painting
Cures for ____________
Example1: Cures for Hot Flashes
Example2: Natural Cures for Depression
The Perfect ________
Example1: The Perfect Way To Shop for Engagement Rings
Example2: The Perfect Gifts for Under $22
Are You ________? Test Yourself
Example1: Are You Ready To Invest in the Hottest Opportunities? Test Yourself.
Example2: Are You Ready For Marriage? Test Yourself
The Basic Elements Found In Every _________
Example1: The Basic Elements Found In Every Stress-Free Work Environment
Example2: The Basic Elements Found in Every Good Boyfriend
Do You Have the Courage to ________?
Example1: Do You Have the Courage to Lead a Debt Free Life?
Example2: Do You Have the Courage To Succeed In Your Adsense Business?
What Every ______ Should Know About ________
Example1: What Every Shrewd Investor Should Know About the Stock Market
Example2: What Every Student Should Know About Online Learning
What ______ Won't Tell You About _______
Example1: What Your Family Doctor Won't Tell You About Depression
Example2: What Politicians Won't Tell You About Future Oil Prices
Don't Read This If _________
Example1: Don't Read This If You're Happy With Your Marriage
Example2: Don't Read This If You're Not Worried About the Impact of Global Warming
Is It Wrong/Bad To ______ ?
Example1: Is It Wrong To Eat After 8 p.m.?
Example2: Is It Wrong To Access Someone Else's Background Information Online?
Protect ______ From _______
Example1: Protect Your Money From the Government
Example2: Protect Your Children from Skin Cancer
Best ______ of ________
Example1: Best Marketing Books Of All Time.
Example2: Best Hip Hop Albums Of 2007
Break The _______ Cycle
Example1: Break The Ear-infection Cycle
Example2: Break The Debt Cycle
________-proofing Checklist
Example1: Baby-proofing Checklist
Example2: Debt-Proofing Checklist
_______ Breakthrough(s)!
Example1: Pancreatic Cancer Breakthroughs!
Example2: Couple Counseling Breakthrough!
Here’s a Quick Way To _________
Example1: Here's a Quick Way To Get Millions of Prospects
Example2: Here's a Quick Way To Spot a Woman Who's Been Abused
The _________ You Should Do When _________
Example1: The 3 Things You Should Do When You Can't Sleep
Example2: The One Thing You Should Do to Protect Your Family From Being Attacked
What Everybody Ought to Know About _________
Example1: What Everybody Ought To Know About Financial Freedom
Example2: What Everybody Ought to Know About Aronia Berries
How the Experts __________
Example1: How the Experts Buy and Sell Silver
Example2: How the Experts Bluff in Texas Hold 'Em
Who Else Wants __________
Example1: Who Else Wants To Climb Like An Orangutan?
Example2: Who Else Wants a PSP2?
Why You Should Never Even Think About ________
Example1: Why You Should Never Even Think About Showering Without Filtering Your Water
Example2: Why You Should Never Even Think About Selling a House Before Reading This
What Never to Do When ______
Example1: What Never to Do When Buying Information Products Online
Example2: What Never to Do When Approaching a Woman
Little Known Ways to _______
Example1: 11 Little Known Ways to Advance Your Career
Example2: Little Known Ways To Save Money When Eating Out
The Dumb Mistakes Most _______ Make When ________
Example1: The Dumb Mistakes Most Business Owners Make When Buying Advertising
Example2: The Dumb Mistakes Most Men Make When Approaching a Woman
_________ to Jump Start your _________
Example1: 6 Ways to Jump-Start Your Body's Detox System
Example2: How To Jump Start Your Car
Monday, November 9, 2009
30 Link Bait Headlines You Should Use When Writing Articles
Posted by paij0's at 1:07 AM
Sunday, November 8, 2009
Free Auto Approved .EDU Links
For maximum results there are a couple of things that you should bear in mind when using these links:
#1. Most of these links are on blogs. Although I link to a specific post, you can search other posts on the same blog. This is good because you can then comment on posts with fewer outbound links, which makes your backlink more powerful. In addition, this will help to spread out your comments – especially if building multiple backlinks – so that the blog owner does not think they are being spammed.
#2. All of these blogs are auto approved, but that doesn't mean they are all unmoderated. Because of this, you should make your comments appropriate so that they will not be removed. Place your link in the URL field when possible and use your anchor text in the Name field. Make your comment relate either to the blog post itself or to another comment. This takes an extra few seconds but ensures that your backlink will stay up rather than being removed as spam down the road.
That's about it. These are powerful links, and you will surely see some gains in search engine rankings once you get cranking away at them, so what are you waiting for!
http://bcnm.berkeley.edu/blog/2009/05/collecting-the-uncollectable-panel-discussion-on-media-art/ – Picky about URL/Name/E-mail if already used on other WP forums frequently (USE FIRST)
———-
Sign Up Required:
———-
http://h2obeta.law.harvard.edu/ – Requires quick sign-up but MANY high PR lists to comment on and drop multiple spammy links per comment – no oversight!
https://alum.mit.edu/discuss/index.jspa – Requires sign-up but allows many locations to drop links on .edu pages (forum like discussion board).
http://scout.wisc.edu/Archives/SPT--FullRecord.php?ResourceId=266 – Requires e-mail validated sign-up, however lets you place as many links as you like in comment body, unmoderated.
———-
Auto Approve Do Follow Blogs:
———-
http://blogs.lynn.edu/afreshlook/2009/09/02/president-ross-and-the-new-blog-squad/
http://cyberlaw.stanford.edu/node/6242
http://bbnews.blog.usf.edu/2009/08/24/blackboard-external-urls-and-internet-explorer-8
http://www.iq.harvard.edu/blog/sss/archives/2009/09/grimmer_on_quan.shtml
http://blog.scad.edu/eco/2009/05/01/sustainability-council-screens-savannah-earth-day-festival-visitors/
http://studentsenate.rpi.edu/blog/show/16 – Comment directly in body, use ‘
http://iris.ebs.edu/accessdb/www/logistikblog.nsf/d6plinks/holm-vereinsgr%FCndung
http://news21.jomc.unc.edu/index.php/powering-a-nation-blog/eating-corn-eating-oil.html
http://blog.brookdalecc.edu/article.php?story=20081005224044757
http://www.stanford.edu/group/ccr/blog/2008/09/ocean_views_1.html
http://shauna.blog.usf.edu/2009/08/16/hello-from-scott
http://www.sca.ucla.edu/blog/2009/2/25/monkey-business.html – Comment directly in body, use ‘
http://www.darden.virginia.edu/html/blog-JimClawson.aspx?id=15836&blogid=294
http://bcnm.berkeley.edu/blog/2009/06/new-media-and-the-crisis-in-iran/
http://interactiondesign.sva.edu/blog/entry/the_human_race_jill_nussbaums_story_from_the_front/
http://library.duke.edu/blogs/scholcomm/2009/08/22/a-model-copyright-law
http://blog.fvsu.edu/2009/08/be-careful-when-social-networking/
http://www.csdhead.cs.cmu.edu/blog/2009/07/30/at-last-useful-social-networking
http://diva.sfsu.edu/blog/04-20-2009/using-content-outside-of-diva
http://www.sft.edu/blog/2009/07/15/so-much-for-type-casting/
http://blog.axehandle.org/2009/01/writing-about-black-panthers.html
http://connect.rhodes.edu/blog/tyler/
http://valis.cs.uiuc.edu/blog/?p=2694
http://blogs.tamu.edu/jmpackard/2009/07/16/technical-writing-help/
http://www.gspm.gwu.edu/545_GSPM-Dean-Talks-Bi-Partisanship
http://id.ome.ksu.edu/blog/2009/aug/12/seemingly-simple-assignment-digital-storytelling/
http://hcil.cs.umd.edu/localphp/hcil/vast/index.php/blog/comments/questions_about_the_challenge/
http://blog.shimer.edu/shimer/2009/09/a-bittersweet-adventure.html
http://www.marymountpv.edu/news-events/intentional-conversation/blog
http://blog.luthersem.edu/library/2009/02/kindle2-and-the-doom-of-libraries.html
Posted by paij0's at 9:29 PM
Instant Way With Auto Approved .gov Backlinks
Someone has shown me the instant way to get backlinks.
There are a couple of things we need to cover before you start here. First of all, I provide specific instructions for each link; follow them for best results. Second, if it is a blog, that means any post on this blog will do, so pick another one for better results (fewer outgoing links). That's about it, dig in and watch your site go nuts in SE rankings!
Please note, links with the ‘**’ after the number are NoFollow, all others are DoFollow!
Auto Approved .GOV Backlinks:
1. ** http://wiki.cio.ny.gov/w/index.php?title=Special:UserLogin&type=signup&returnto=Main_Page Sign up for an account here, then return to the wiki and choose 'random page' to find something to edit. Just drop your URL with anchor text in [http://www.yoursite.com anchor text] format.
2. http://gis.utah.gov/index.php?option=com_comprofiler&task=registers Register here. Be sure to add your link in the 'website' field during registration. Confirm your email and your profile will go live with your backlink at the forum (http://gis.utah.gov/gisforum/).
3. http://www.daculaga.gov/policy.asp Register here. Then return to the forums @ http://www.daculaga.gov/fhome.asp. Make a relevant post and use an anchor text link as a sort of signature at the end.
4. http://townhall.virginia.gov/L/Comments.cfm?stageid=5089 Comment here and link to your anchor text in the comment.
5. http://forums.info.usaid.gov/?boardid=procurement&Register=yes&action=8 Register here. Edit your profile to include your link in the proper field.
6. http://community2.business.gov/t5/What-is-one-effective-tactic-you/order-EVERYTHING-online-to-save-money/idi-p/8285#A4 Register, then post a comment here.
7. http://ibis.wi.gov/CSForums/user/CreateUser.aspx?ReturnUrl=/CSForums/forums/ShowPost.aspx?PostID=59 Register and make a post on the forum with your backlinks and anchor text as a signature.
8. http://web.aging.sc.gov/forums/ucp.php?mode=register Register for these forums. As you can see there is no moderation, so feel free to make a full thread hyping your backlinks (with anchor text).
9. http://irp.idaho.gov/EditModule.aspx?tabid=679&forumid=3&postid=233&view=topic&def=Register Register here and drop links at the end of your post like a signature. In addition, feel free to add a link to your profile!
10. ** http://ptp.hud.gov/forumswww/main.cfm?CFID=6297797&CFTOKEN=47838801&CFApp=58& Register here, then make a post and drop some links.
11. https://isis.astrogeology.usgs.gov/IsisSupport/index.php?PHPSESSID=bfkq69dv0fmjih53u64up2v6v3av1gsg&action=register Register here and add your link to your profile. You may make an on topic post to get your profile link indexed more quickly.
12. http://blogs.nasa.gov/cm/blog/FIRST%20Robotics%20Team%201868.blog/posts/post_1232059585953.html Just drop your links in the comment’s body with anchor text, be as spammy as you like there is no moderation!
13. http://www.performancesolutions.nc.gov/forums/Default.aspx?g=register Register here and add a link to your profile. Again just make a relevant post for faster indexing.
14. http://www.calepa.ca.gov/Forums/registration_rules.asp?FID=0 Register here. This forum has no posts so just add a link to your profile and let it get crawled, you can always build a backlink to it to get it indexed faster but do not post in this forum!
Auto Approved .EDU Backlinks:
- http://www.wcl.american.edu/pijip/go/blog-post/patent-lecture-nyu-professor-rochelle-dreyfuss-lecture-now-posted
- http://blog.shimer.edu/shimer/2009/10/the-fall-macaws.html
- http://www.salle.url.edu:81/cotxes/blog/index.php/component/user/?task=register Register here for forums. (It is in Spanish; the first two fields are username, then email, and the last two are password.) Now simply login and make a post. This forum is not moderated, so feel free to drop lots of text in your post to make your link contextually relevant for some extra juice.
- http://id.ome.ksu.edu/blog/2009/oct/26/transitory-digital-documents/
- http://law.baylor.edu/blog/post/Never-a-Dull-Moment-at-the-Law-School!.aspx
- http://sci.rutgers.edu/forum/register.php?s=0d2fb1b3239c9c257eb5cd0a72a05840 Register here then edit your profile to add your URL in the website field. You may also add links with anchor text in your signature.
- http://forum.buffalostate.edu/index.php?s=2acacdcdf67617b3e87db1978ab81434&act=Reg&CODE=00 Register here. Edit your ‘profile’ page and add your URL in the proper field.
- http://www.math.uaa.alaska.edu/~afkjm/techteach/?q=node/75#comments
- http://corot.dmca.yale.edu/~dmca/forum.cgi?register Register here. Edit your profile and stick your link in the website field.
- http://prometheus.scp.rochester.edu/ursds/user/register?destination=node/70%2523commentform Register here. Then return to the blog (http://prometheus.scp.rochester.edu/ursds/blog) and add a comment (show/hide comments). Use anchor text in comment body with link.
- http://mgl.scripps.edu/forum/index.php?sid=295f86c33d276871df3e5fb49055a131 Register here. Login, then go to 'user control panel' and then 'profile'. Just stick your link in there and it is good to go.
Posted by paij0's at 9:13 PM
Wednesday, October 28, 2009
Using Wget with Proxy Setting
As we already know, Wget is a super-useful utility to download pages and automate all
types of web-related tasks. It works for HTTP as well as FTP URLs.
Here is how to use wget through a proxy server.
To get wget to use a proxy, you must set an environment variable
before using wget. Type this at the command prompt / console:
For Windows:
set http_proxy=http://proxy.example.com:8080
For Linux/Unix:
export http_proxy="http://proxy.example.com:8080"
Replace proxy.example.com with your actual proxy server.
Replace 8080 with your actual proxy server port.
You can similarly use ftp_proxy to proxy FTP requests. An example on Linux would be:
export ftp_proxy="http://proxy.example.com:8080"
Then you should specify the following option in wget command line to turn the proxy behavior on:
--proxy=on
Alternatively you can use the following to turn it off:
--proxy=off
You can use --proxy-user="username" --proxy-passwd="password" to set the proxy user name and password where required.
Replace username with your proxy server user name and password
with your proxy server password. Another alternative is to specify them
in the http_proxy / ftp_proxy environment variable as follows:
export http_proxy="http://username:password@proxy.example.com:8080"
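Putting it all together, here is a minimal sketch (the proxy host, port, and credentials below are placeholders; substitute your own):

```shell
# Placeholder proxy address and credentials; replace with your own.
export http_proxy="http://user:secret@proxy.example.com:8080"
# Reuse the same proxy for FTP URLs.
export ftp_proxy="$http_proxy"

# wget now picks the proxy up from the environment, e.g.:
#   wget http://example.com/file.tar.gz

# Sanity check: confirm the variable is set as expected.
echo "$http_proxy"
```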
Happy using wget!
Posted by paij0's at 6:19 PM
Wednesday, October 21, 2009
Curl Http Scripting
Online: http://curl.haxx.se/docs/httpscripting.html
Date: May 28, 2008
The Art Of Scripting HTTP Requests Using Curl
=============================================
This document will assume that you're familiar with HTML and general
networking.
The possibility to write scripts is essential to make a good computer
system. Unix' capability to be extended by shell scripts and various tools to
run various automated commands and scripts is one reason why it has succeeded
so well.
The increasing amount of applications moving to the web has made "HTTP
Scripting" more frequently requested and wanted. To be able to automatically
extract information from the web, to fake users, to post or upload data to
web servers are all important tasks today.
Curl is a command line tool for doing all sorts of URL manipulations and
transfers, but this particular document will focus on how to use it when
doing HTTP requests for fun and profit. I'll assume that you know how to
invoke 'curl --help' or 'curl --manual' to get basic information about it.
Curl is not written to do everything for you. It makes the requests, it gets
the data, it sends data and it retrieves the information. You probably need
to glue everything together using some kind of script language or repeated
manual invokes.
1. The HTTP Protocol
HTTP is the protocol used to fetch data from web servers. It is a very simple
protocol that is built upon TCP/IP. The protocol also allows information to
get sent to the server from the client using a few different methods, as will
be shown here.
HTTP is plain ASCII text lines being sent by the client to a server to
request a particular action, and then the server replies a few text lines
before the actual requested content is sent to the client.
Using curl's option -v will display what kind of commands curl sends to the
server, as well as a few other informational texts. -v is the single most
useful option when it comes to debug or even understand the curl<->server
interaction.
2. URL
The Uniform Resource Locator format is how you specify the address of a
particular resource on the Internet. You know these, you've seen URLs like
http://curl.haxx.se or https://yourbank.com a million times.
3. GET a page
The simplest and most common request/operation made using HTTP is to get a
URL. The URL could itself refer to a web page, an image or a file. The client
issues a GET request to the server and receives the document it asked for.
If you issue the command line
curl http://curl.haxx.se
you get a web page returned in your terminal window. The entire HTML document
that that URL holds.
All HTTP replies contain a set of headers that are normally hidden, use
curl's -i option to display them as well as the rest of the document. You can
also ask the remote server for ONLY the headers by using the -I option (which
will make curl issue a HEAD request).
4. Forms
Forms are the general way a web site can present a HTML page with fields for
the user to enter data in, and then press some kind of 'OK' or 'submit'
button to get that data sent to the server. The server then typically uses
the posted data to decide how to act. Like using the entered words to search
in a database, or to add the info in a bug track system, display the entered
address on a map or using the info as a login-prompt verifying that the user
is allowed to see what it is about to see.
Of course there has to be some kind of program in the server end to receive
the data you send. You cannot just invent something out of the air.
4.1 GET
A GET-form uses the method GET, as specified in HTML like:
<form method="GET" action="junk.cgi">
<input type=text name="birthyear">
<input type=submit name=press value="OK">
</form>
In your favorite browser, this form will appear with a text box to fill in
and a press-button labeled "OK". If you fill in '1905' and press the OK
button, your browser will then create a new URL to get for you. The URL will
get "junk.cgi?birthyear=1905&press=OK" appended to the path part of the
previous URL.
If the original form was seen on the page "www.hotmail.com/when/birth.html",
the second page you'll get will become
"www.hotmail.com/when/junk.cgi?birthyear=1905&press=OK".
Most search engines work this way.
To make curl do the GET form post for you, just enter the expected created
URL:
curl "www.hotmail.com/when/junk.cgi?birthyear=1905&press=OK"
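The URL construction the browser performs can be sketched in shell; the host and CGI path come from the example above, and the variable names are just for illustration:

```shell
# Build the GET-form URL the way the browser does: take the form's
# action URL and append the field names and values as a query string.
birthyear=1905
press=OK
base="www.hotmail.com/when/junk.cgi"
url="${base}?birthyear=${birthyear}&press=${press}"
echo "$url"
# Passing "$url" to curl fetches the same page the browser would get.
```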
4.2 POST
The GET method makes all input field names get displayed in the URL field of
your browser. That's generally a good thing when you want to be able to
bookmark that page with your given data, but it is an obvious disadvantage
if you entered secret information in one of the fields or if there are a
large amount of fields creating a very long and unreadable URL.
The HTTP protocol then offers the POST method. This way the client sends the
data separated from the URL and thus you won't see any of it in the URL
address field.
The form would look very similar to the previous one:
<form method="POST" action="junk.cgi">
<input type=text name="birthyear">
<input type=submit name=press value=" OK ">
</form>
And to use curl to post this form with the same data filled in as before, we
could do it like:
curl -d "birthyear=1905&press=%20OK%20" www.hotmail.com/when/junk.cgi
This kind of POST will use the Content-Type
application/x-www-form-urlencoded and is the most widely used POST kind.
The data you send to the server MUST already be properly encoded, curl will
not do that for you. For example, if you want the data to contain a space,
you need to replace that space with %20 etc. Failing to comply with this
will most likely cause your data to be received wrongly and messed up.
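To illustrate, here is a small sketch that percent-encodes just the spaces in the example data (a real encoder has to handle every reserved character, not only spaces):

```shell
# Raw form data with literal spaces around OK, as in the form above.
raw='birthyear=1905&press= OK '
# Replace every space with %20 before handing the string to curl -d.
encoded=$(printf '%s' "$raw" | sed 's/ /%20/g')
echo "$encoded"
```

The result is the same string used with curl -d in the example above.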
Recent curl versions can in fact url-encode POST data for you, like this:
curl --data-urlencode "name=I am Daniel" www.example.com
4.3 File Upload POST
Back in late 1995 they defined an additional way to post data over HTTP. It
is documented in the RFC 1867, why this method sometimes is referred to as
RFC1867-posting.
This method is mainly designed to better support file uploads. A form that
allows a user to upload a file could be written like this in HTML:
<form method="POST" enctype='multipart/form-data' action="upload.cgi">
<input type=file name=upload>
<input type=submit name=press value="OK">
</form>
This clearly shows that the Content-Type about to be sent is
multipart/form-data.
To post to a form like this with curl, you enter a command line like:
curl -F upload=@localfilename -F press=OK [URL]
4.4 Hidden Fields
A very common way for HTML based application to pass state information
between pages is to add hidden fields to the forms. Hidden fields are
already filled in, they aren't displayed to the user and they get passed
along just as all the other fields.
A similar example form with one visible field, one hidden field and one
submit button could look like:
<form method="POST" action="foobar.cgi">
<input type=text name="birthyear">
<input type=hidden name="person" value="daniel">
<input type=submit name="press" value="OK">
</form>
To post this with curl, you won't have to think about if the fields are
hidden or not. To curl they're all the same:
curl -d "birthyear=1905&press=OK&person=daniel" [URL]
4.5 Figure Out What A POST Looks Like
When you're about to fill in a form and send it to a server by using curl
instead of a browser, you're of course very interested in sending a POST
exactly the way your browser does.
An easy way to get to see this, is to save the HTML page with the form on
your local disk, modify the 'method' to a GET, and press the submit button
(you could also change the action URL if you want to).
You will then clearly see the data get appended to the URL, separated with a
'?'-letter as GET forms are supposed to.
5. PUT
The perhaps best way to upload data to a HTTP server is to use PUT. Then
again, this of course requires that someone put a program or script on the
server end that knows how to receive a HTTP PUT stream.
Put a file to a HTTP server with curl:
curl -T uploadfile www.uploadhttp.com/receive.cgi
6. HTTP Authentication
HTTP Authentication is the ability to tell the server your username and
password so that it can verify that you're allowed to do the request you're
doing. The Basic authentication used in HTTP (which is the type curl uses by
default) is *plain* *text* based, which means it sends username and password
only slightly obfuscated, but still fully readable by anyone that sniffs on
the network between you and the remote server.
To tell curl to use a user and password for authentication:
curl -u name:password www.secrets.com
The site might require a different authentication method (check the headers
returned by the server), and then --ntlm, --digest, --negotiate or even
--anyauth might be options that suit you.
Sometimes your HTTP access is only available through the use of a HTTP
proxy. This seems to be especially common at various companies. A HTTP proxy
may require its own user and password to allow the client to get through to
the Internet. To specify those with curl, run something like:
curl -U proxyuser:proxypassword curl.haxx.se
If your proxy requires the authentication to be done using the NTLM method,
use --proxy-ntlm, if it requires Digest use --proxy-digest.
If you use any one these user+password options but leave out the password
part, curl will prompt for the password interactively.
Do note that when a program is run, its parameters might be possible to see
when listing the running processes of the system. Thus, other users may be
able to watch your passwords if you pass them as plain command line
options. There are ways to circumvent this.
It is worth noting that while this is how HTTP Authentication works, very
many web sites will not use this concept when they provide logins etc. See
the Web Login chapter further below for more details on that.
7. Referer
A HTTP request may include a 'referer' field (yes it is misspelled), which
can be used to tell from which URL the client got to this particular
resource. Some programs/scripts check the referer field of requests to verify
that this wasn't arriving from an external site or an unknown page. While
this is a stupid way to check something so easily forged, many scripts still
do it. Using curl, you can put anything you want in the referer-field and
thus more easily be able to fool the server into serving your request.
Use curl to set the referer field with:
curl -e http://curl.haxx.se daniel.haxx.se
8. User Agent
Very similar to the referer field, all HTTP requests may set the User-Agent
field. It names what user agent (client) that is being used. Many
applications use this information to decide how to display pages. Silly web
programmers try to make different pages for users of different browsers to
make them look the best possible for their particular browsers. They usually
also do different kinds of javascript, vbscript etc.
At times, you will see that getting a page with curl will not return the same
page that you see when getting the page with your browser. Then you know it
is time to set the User Agent field to fool the server into thinking you're
one of those browsers.
To make curl look like Internet Explorer on a Windows 2000 box:
curl -A "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)" [URL]
Or why not look like you're using Netscape 4.73 on a Linux (PIII) box:
curl -A "Mozilla/4.73 [en] (X11; U; Linux 2.2.15 i686)" [URL]
9. Redirects
When a resource is requested from a server, the reply from the server may
include a hint about where the browser should go next to find this page, or a
new page keeping newly generated output. The header that tells the browser
to redirect is Location:.
Curl does not follow Location: headers by default, but will simply display
such pages in the same manner it displays all HTTP replies. It does however
feature an option that will make it attempt to follow the Location: pointers.
To tell curl to follow a Location:
curl -L www.sitethatredirects.com
If you use curl to POST to a site that immediately redirects you to another
page, you can safely use -L and -d/-F together. Curl will only use POST in
the first request, and then revert to GET in the following operations.
10. Cookies
The way the web browsers do "client side state control" is by using
cookies. Cookies are just names with associated contents. The cookies are
sent to the client by the server. The server tells the client for what path
and host name it wants the cookie sent back, and it also sends an expiration
date and a few more properties.
When a client communicates with a server with a name and path as previously
specified in a received cookie, the client sends back the cookies and their
contents to the server, unless of course they are expired.
Many applications and servers use this method to connect a series of requests
into a single logical session. To be able to use curl on such occasions, we
must be able to record and send back cookies the way the web application
expects them, the same way browsers deal with them.
The simplest way to send a few cookies to the server when getting a page with
curl is to add them on the command line like:
curl -b "name=Daniel" www.cookiesite.com
Cookies are sent as common HTTP headers. This is practical as it allows curl
to record cookies simply by recording headers. Record cookies with curl by
using the -D option like:
curl -D headers_and_cookies www.cookiesite.com
(Take note that the -c option described below is a better way to store
cookies.)
Curl has a full blown cookie parsing engine built-in that comes into use if you
want to reconnect to a server and use cookies that were stored from a
previous connection (or handcrafted manually to fool the server into
believing you had a previous connection). To use previously stored cookies,
you run curl like:
curl -b stored_cookies_in_file www.cookiesite.com
Curl's "cookie engine" gets enabled when you use the -b option. If you only
want curl to understand received cookies, use -b with a file that doesn't
exist. Example, if you want to let curl understand cookies from a page and
follow a location (and thus possibly send back cookies it received), you can
invoke it like:
curl -b nada -L www.cookiesite.com
Curl has the ability to read and write cookie files that use the same file
format that Netscape and Mozilla do. It is a convenient way to share cookies
between browsers and automatic scripts. The -b switch automatically detects
if a given file is such a cookie file and parses it, and by using the
-c/--cookie-jar option you'll make curl write a new cookie file at the end of
an operation:
curl -b cookies.txt -c newcookies.txt www.cookiesite.com
11. HTTPS
There are a few ways to do secure HTTP transfers. By far the most common
protocol for doing this is what is generally known as HTTPS, HTTP over
SSL. SSL encrypts all the data that is sent and received over the network and
thus makes it harder for attackers to spy on sensitive information.
SSL (or TLS, as the latest version of the standard is called) offers a
truckload of advanced features to allow all those encryption and key
infrastructure mechanisms encrypted HTTP requires.
Curl supports encrypted fetches thanks to the freely available OpenSSL
libraries. To get a page from an HTTPS server, simply run curl like:
curl https://that.secure.server.com
11.1 Certificates
In the HTTPS world, you use certificates to validate that you are the one
you claim to be, as an addition to normal passwords. Curl supports
client-side certificates. All certificates are locked with a pass phrase,
which you need to enter before the certificate can be used by curl. The pass
phrase can be specified on the command line or, if not, entered interactively
when curl queries for it. Use a certificate with curl on an HTTPS server like:
curl -E mycert.pem https://that.secure.server.com
curl also tries to verify that the server is who it claims to be, by
verifying the server's certificate against a locally stored CA cert
bundle. Failing the verification will cause curl to deny the connection. You
must then use -k if you want to tell curl to ignore that the server
can't be verified:
curl -k https://that.secure.server.com
More about server certificate verification and CA cert bundles can be read
in the SSLCERTS document, available online here:
12. Custom Request Elements
Doing fancy stuff, you may need to add or change elements of a single curl
request.
For example, you can change the POST request to a PROPFIND and send the data
as "Content-Type: text/xml" (instead of the default Content-Type) like this:
curl -d "<xml>" -H "Content-Type: text/xml" -X PROPFIND url.com
You can delete a default header by providing one without content. For
example, you can ruin the request by chopping off the Host: header:
curl -H "Host:" http://mysite.com
You can add headers the same way. Your server may want a "Destination:"
header, and you can add it:
curl -H "Destination: http://moo.com/nowhere" http://url.com
13. Web Login
While not strictly just HTTP related, it still causes a lot of people problems,
so here's the executive run-down of how the vast majority of all login forms
work and how to log in to them using curl.
It can also be noted that to do this properly in an automated fashion, you
will most certainly need to script things and do multiple curl invocations etc.
First, servers mostly use cookies to track the logged-in status of the
client, so you will need to capture the cookies you receive in the
responses. Then, many sites also set a special cookie on the login page (to
make sure you got there through their login page) so you should make a habit
of first getting the login-form page to capture the cookies set there.
Some web-based login systems feature various amounts of javascript, and
sometimes they use such code to set or modify cookie contents. Possibly they
do that to prevent programmed logins, like the ones this manual describes.
Anyway, if reading the code isn't enough to let you repeat the behavior
manually, capturing the HTTP requests done by your browser and analyzing the
sent cookies is usually a working method to figure out how to shortcut the
javascript.
In the actual <form> tag for the login, lots of sites fill in random,
session-based or otherwise secretly generated hidden fields, and you may need
to first capture the HTML code for the login form and extract all the hidden
fields to be able to do a proper login POST. Remember that the contents need
to be URL encoded when sent in a normal POST.
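As a hedged sketch of the whole dance described above: the field names
(sessid, user, pass) and the site URL below are made up for illustration,
and the network steps are shown as comments so the hidden-field extraction
can be demonstrated on a saved copy of the form.

```shell
# Step 1 (network): fetch the login page, saving the cookies it sets:
#   curl -c cookies.txt -o login.html http://www.cookiesite.com/login

# Here we fake the fetched page to demonstrate the extraction step:
cat > login.html <<'EOF'
<form action="/login" method="post">
<input type="hidden" name="sessid" value="abc123">
<input name="user"><input type="password" name="pass">
</form>
EOF

# Step 2 (local): pull out the hidden field's value with sed.
SESSID=$(sed -n 's/.*name="sessid" value="\([^"]*\)".*/\1/p' login.html)
echo "sessid=$SESSID"   # prints: sessid=abc123

# Step 3 (network): POST the form, sending the captured cookies back
# and recording any new ones:
#   curl -b cookies.txt -c cookies.txt \
#        -d "sessid=$SESSID" -d "user=daniel" -d "pass=secret" \
#        http://www.cookiesite.com/login
```

Real forms differ in field names and in how the token is embedded, so
inspect the HTML you actually receive before hardcoding an extraction.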
14. Debug
Many times when you run curl on a site, you'll notice that the site doesn't
seem to respond the same way to your curl requests as it does to your
browser's.
Then you need to start making your curl requests more similar to your
browser's requests:
* Use the --trace-ascii option to store fully detailed logs of the requests
for easier analyzing and better understanding
* Make sure you check for and use cookies when needed (both reading with -b
and writing with -c)
* Set the user-agent to look like a recent popular browser's
* Set the referer like the browser sets it
* If you use POST, make sure you send all the fields and in the same order as
the browser does it. (See chapter 4.5 above)
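Putting the checklist together, here is a hedged sketch. The URL, referer
and cookie file names are placeholders, and the function only prints the
command line it would run so you can inspect it before letting it loose:

```shell
# Assemble a browser-like curl command line from the checklist above.
# Replace 'echo curl' with plain 'curl' to actually perform the fetch.
browserlike_fetch() {
    url=$1
    set -- \
        --trace-ascii trace.txt \
        -b cookies.txt -c cookies.txt \
        -A "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)" \
        -e "http://www.example.com/" \
        "$url"
    echo curl "$@"
}

browserlike_fetch http://www.example.com/page
```

Keeping the argument list in one place like this makes it easy to toggle
individual options while comparing the trace output against the browser's
requests.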
A very good helper to make sure you do this right is the LiveHTTPHeaders tool
that lets you view all headers you send and receive with Mozilla/Firefox
(even when using HTTPS).
A more raw approach is to capture the HTTP traffic on the network with tools
such as ethereal or tcpdump and check what headers were sent and
received by the browser. (HTTPS makes this technique ineffective.)
15. References
RFC 2616 is a must to read if you want in-depth understanding of the HTTP
protocol.
RFC 2396 explains the URL syntax.
RFC 2109 defines how cookies are supposed to work.
RFC 1867 defines the HTTP post upload format.
http://www.openssl.org is the home of the OpenSSL project
http://curl.haxx.se is the home of the cURL project
Posted by paij0's at 8:20 AM 0 comments
Thursday, October 15, 2009
Unix Grep Command Examples
Let us use the sample file below to demonstrate the grep command.<br>
<pre>$ cat demo_file<br>THIS LINE IS THE 1ST UPPER CASE
LINE IN THIS FILE.<br>this line is the 1st lower case line in
this file.<br>This Line Has All Its First Character Of The Word
With Upper Case.<br><br>Two lines above this line is
empty.<br>And this is the last line.</pre>
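If you want to follow along, demo_file can be recreated from the listing
above with a here-document:

```shell
# Recreate demo_file from the listing above.
cat > demo_file <<'EOF'
THIS LINE IS THE 1ST UPPER CASE LINE IN THIS FILE.
this line is the 1st lower case line in this file.
This Line Has All Its First Character Of The Word With Upper Case.

Two lines above this line is empty.
And this is the last line.
EOF

# Quick sanity check: three lines contain the lowercase string "this".
grep -c "this" demo_file   # prints 3
```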
<h3>1. Search for the given string in a single file</h3>
<p>The basic usage of grep command is to search for a specific string
in the specified file as shown below.</p>
<pre>Syntax:<br>grep "literal_string" filename</pre>
<pre>$ grep "this" demo_file<br>this line is the 1st lower
case line in this file.<br>Two lines above this line is
empty.</pre>
<h3>2. Checking for the given string in multiple files.</h3>
<pre>Syntax:<br>grep "string" FILE_PATTERN</pre>
<p>This is also a basic usage of the grep command. For this example, let us
copy demo_file to demo_file1. The grep output will include the file name in
front of each line that matched the pattern, as shown below. When the shell
sees the meta character (*), it expands it and gives all the matching files
as input to grep.</p>
<pre>$ cp demo_file demo_file1<br><br>$ grep "this"
demo_*<br>demo_file:this line is the 1st lower case line in this
file.<br>demo_file:Two lines above this line is
empty.<br>demo_file:And this is the last
line.<br>demo_file1:this line is the 1st lower case line in this
file.<br>demo_file1:Two lines above this line is
empty.<br>demo_file1:And this is the last line.</pre>
<h3>3. Case insensitive search using grep -i</h3>
<pre>Syntax:<br>grep -i "string" FILE</pre>
<p>This searches for the given string/pattern case insensitively, so it
matches all the words such as “the”, “THE” and “The” as shown below.</p>
<pre>$ grep -i "the" demo_file<br>THIS LINE IS THE 1ST
UPPER CASE LINE IN THIS FILE.<br>this line is the 1st lower case
line in this file.<br>This Line Has All Its First Character Of
The Word With Upper Case.<br>And this is the last
line.</pre>
<h3>4. Match regular expression in files</h3>
<pre>Syntax:<br>grep "REGEX" filename</pre>
<p>This is a very powerful feature if you can use regular expressions
effectively. In the following example, grep searches for any pattern that
starts with “lines” and ends with “empty” with anything in between, i.e. it
searches for “lines[anything in-between]empty” in demo_file.</p>
<pre>$ grep "lines.*empty" demo_file<br>Two lines above this line is empty.</pre>
<p>From the grep documentation: a regular expression may be followed by
one of several repetition operators:</p>
<ul>
<li>? The preceding item is optional and matched at most once.</li>
<li>* The preceding item will be matched zero or more times.</li>
<li>+ The preceding item will be matched one or more times.</li>
<li>{n} The preceding item is matched exactly n times.</li>
<li>{n,} The preceding item is matched n or more times.</li>
<li>{,m} The preceding item is matched at most m times.</li>
<li>{n,m} The preceding item is matched at least n times, but not
more than m times.</li>
</ul>
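The repetition operators above belong to grep's extended regex syntax; with
basic grep the braces must be backslash-escaped, or you can switch to grep
-E. A small self-contained demo (rep_demo.txt is a throwaway file made up
for this example):

```shell
# Build a throwaway file with varying run lengths of "o".
printf 'cl\ncol\ncool\ncoool\ncooool\n' > rep_demo.txt

# {2,3}: between two and three "o"s; matches "cool" and "coool" only.
grep -E 'co{2,3}l' rep_demo.txt

# The same match in basic grep syntax, with escaped braces.
grep 'co\{2,3\}l' rep_demo.txt
```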
<h3>5. Checking for full words, not for sub-strings using grep -w</h3>
<p>If you want to search for a word and avoid matching substrings, use the
-w option. A normal search will show all the lines.<br>
The following example is the regular grep, searching for “is”. When you
search for “is” without any option, it matches “is”, “his”, “this” and
everything else that has the substring “is”.</p>
<pre>$ grep -i "is" demo_file<br>THIS LINE IS THE 1ST UPPER
CASE LINE IN THIS FILE.<br>this line is the 1st lower case line
in this file.<br>This Line Has All Its First Character Of The
Word With Upper Case.<br>Two lines above this line is
empty.<br>And this is the last line.</pre>
<p>The following example is the word grep, searching only for the word “is”.
Note that this output does not contain the line “This Line Has All Its First
Character Of The Word With Upper Case”, even though “is” is there inside
“This”, because -w looks only for the whole word “is” and not for “this”.</p>
<pre>$ grep -iw "is" demo_file<br>THIS LINE IS THE 1ST
UPPER CASE LINE IN THIS FILE.<br>this line is the 1st lower case
line in this file.<br>Two lines above this line is
empty.<br>And this is the last line.</pre>
<h3>6. Displaying lines before/after/around the match using grep -A, -B
and -C</h3>
<p>When doing a grep on a huge file, it may be useful to see some lines
around the match. grep can show you not only the matching lines but also the
lines after/before/around the match.</p>
<p>Please create the following demo_text file for this example.</p>
<pre>$ cat demo_text<br>4. Vim Word
Navigation<br><br>You may want to do several navigation in
relation to the words, such as:<br><br> * e - go to the end
of the current word.<br> * E - go to the end of the current
WORD.<br> * b - go to the previous (before) word.<br> * B -
go to the previous (before) WORD.<br> * w - go to the next
word.<br> * W - go to the next WORD.<br><br>WORD -
WORD consists of a sequence of non-blank characters, separated with
white space.<br>word - word consists of a sequence of letters,
digits and underscores.<br><br>Example to show the
difference between WORD and word<br><br> * 192.168.1.1 -
single WORD<br> * 192.168.1.1 - seven words.</pre>
<h4>6.1 Display N lines after match</h4>
<p>-A is the option which prints the specified N lines after the match,
as shown below.</p>
<pre>Syntax:<br>grep -A <N> "string" FILENAME</pre>
<p>The following example prints the matched line along with the 3 lines
after it.</p>
<pre>$ grep -A 3 -i "example" demo_text<br>Example to show
the difference between WORD and word<br><br>* 192.168.1.1 -
single WORD<br>* 192.168.1.1 - seven words.</pre>
<h4>6.2 Display N lines before match</h4>
<p>-B is the option which prints the specified N lines before the match.</p>
<pre>Syntax:<br>grep -B <N> "string" FILENAME<br></pre>
<p>Just as -A shows the N lines after a match, -B shows the N lines
before it.</p>
<pre>$ grep -B 2 "single WORD" demo_text<br>Example to show
the difference between WORD and word<br><br>* 192.168.1.1 -
single WORD</pre>
<h4>6.3 Display N lines around match</h4>
<p>-C is the option which prints the specified N lines around the
match. On some occasions you might want the match to appear together with
the lines from both sides. This option shows N lines on both sides
(before & after) of the match.</p>
<pre>$ grep -C 2 "Example" demo_text<br>word - word
consists of a sequence of letters, digits and
underscores.<br><br>Example to show the difference between
WORD and word<br><br>* 192.168.1.1 - single WORD</pre>
<h3>7. Highlighting the search using GREP_OPTIONS</h3>
<p>grep prints out the lines from the file that match the pattern/string you
gave it; if you want it to highlight which part of the line matched, set the
GREP_OPTIONS environment variable as shown below.<br>
After the following export, the matched parts are highlighted. In this
example, every occurrence of “this” is highlighted.</p>
<pre>$ export GREP_OPTIONS='--color=auto'
GREP_COLOR='100;8'<br><br>$ grep this
demo_file<br><strong>this</strong> line is the 1st
lower case line in this file.<br>Two lines above
<strong>this</strong> line is empty.<br>And
<strong>this</strong> is the last line.</pre>
<h3>8. Searching in all files recursively using grep -r</h3>
<p>When you want to search in all the files under the current directory and
its subdirectories, -r is the option to use. The following example will look
for the string “ramesh” in all the files in the current directory and all its
subdirectories.</p>
<pre>$ grep -r "ramesh" *</pre>
<h3>9. Invert match using grep -v</h3>
<p>You had options to show the lines matched, to show the lines before and
after a match, and to highlight a match. Naturally there is also an option,
-v, for an inverted match.<br>
When you want to display the lines which do not match the given
string/pattern, use the option -v as shown below. This example will display
all the lines that do not contain the word “go”.</p>
<pre>$ grep -v "go" demo_text<br>4. Vim Word
Navigation<br><br>You may want to do several navigation in
relation to the words, such as:<br><br>WORD - WORD consists
of a sequence of non-blank characters, separated with white
space.<br>word - word consists of a sequence of letters, digits
and underscores.<br><br>Example to show the difference
between WORD and word<br><br>* 192.168.1.1 - single
WORD<br>* 192.168.1.1 - seven words.</pre>
<h3>10. Display the lines which do not match any of the given patterns</h3>
<pre>Syntax:<br>grep -v -e "pattern" -e "pattern"</pre>
<pre>$ cat
test-file.txt<br>a<br>b<br>c<br>d<br><br>$
grep -v -e "a" -e "b" -e "c" test-file.txt<br>d</pre>
<h3>11. Counting the number of matches using grep -c</h3>
<p>When you want to count how many lines match the given pattern/string,
use the option -c.</p>
<pre>Syntax:<br>grep -c "pattern" filename</pre>
<p><code> </code></p>
<pre>$ grep -c "go" demo_text<br>6</pre>
<p>To find out how many lines match the pattern:</p>
<pre>$ grep -c this demo_file<br>3</pre>
<p>To find out how many lines do not match the pattern:</p>
<pre>$ grep -v -c this demo_file<br>4</pre>
<h3>12. Display only the file names which matches the given pattern
using grep -l</h3>
<p>If you want grep to show only the names of the files that match the
given pattern, use the -l (lower-case L) option.<br>
When you give multiple files to grep as input, it displays the names of the
files which contain text matching the pattern. This is very handy when you
try to find some notes somewhere in your whole directory structure.</p>
<pre>$ grep -l this demo_*<br>demo_file<br>demo_file1</pre>
<h3>13. Show only the matched string</h3>
<p>By default grep shows the whole line which matches the given
pattern/string, but if you want grep to show only the matched part of the
line, use the -o option.<br>
It may not be that useful when you search for a plain literal string, but it
becomes very useful when you give a regex pattern and want to see what it
actually matches.</p>
<pre>$ grep -o "is.*line" demo_file<br>is line is the 1st
lower case line<br>is line<br>is is the last
line</pre>
<h3>14. Show the position of match in the line</h3>
<p>When you want grep to show the position where the pattern matches in the
file, use the following options:</p>
<pre>Syntax:<br>grep -o -b "pattern" file</pre>
<pre>$ cat
temp-file.txt<br>12345<br>12345<br><br>$ grep
-o -b "3" temp-file.txt<br>2:3<br>8:3</pre>
<p><strong>Note:</strong> The number in the grep output above is not the
position in the line; it is the byte offset within the whole file.</p>
<h3>15. Show line number while displaying the output using grep -n</h3>
<p>To show the line number of each matched line (1-based, per file), use
the -n option.</p>
<pre>$ grep -n "go" demo_text<br>5: * e - go to the end of
the current word.<br>6: * E - go to the end of the current
WORD.<br>7: * b - go to the previous (before) word.<br>8: *
B - go to the previous (before) WORD.<br>9: * w - go to the next
word.<br>10: * W - go to the next WORD.</pre>
link source
Posted by paij0's at 8:29 AM 0 comments
Labels: linux
Wednesday, October 14, 2009
Delicious....
When I look at it ..... slurp...
CHICKEN POT PIE SOUP OVER MASHED POTATOES
Posted by paij0's at 12:08 PM 1 comments
Saturday, October 3, 2009
Karst Sulawesi
Go Ad-Free
To support the free features you love, we display text ads on your blog to
logged-out users from time to time. And you can make money from your blog
content: any click on those ads generates revenue, and 75% of the revenue
from these ads will be credited to your account. All you need is a Google
AdSense account.
Become a Blogetery Supporter and you can earn 100% of the revenue, or remove
ads completely from your blogs for a small fee.
AdSense ads are blended right into your blog, so your readers will be more
likely to read an ad and subsequently click it. Plus, you can put ads into
your sidebars using the widget system.
Become a Blogetery Supporter today!
Did you know that Blogetery blogs sometimes have ads on them, limited upload
space and somewhat restricted functionality?
However, Blogetery Supporters get no ads, earn 100% of the AdSense revenue
from their blog content, receive 1GB of upload space and get exclusive access
to plugins alongside other extra functionality!
Blogetery Supporters can also easily post to their blog from iPhones, other
mobiles and third-party publishing tools like Windows Live Writer.
Posted by paij0's at 2:01 PM 0 comments
Labels: Copas Forum
Tuesday, September 29, 2009
Twitter Beginner. (it is me!)
In Indonesia, Twitter still seems to lag behind Facebook. Especially with
Facebook Lite around, it looks hard for Twitter to compete in Indonesia, but
who knows.
Posted by paij0's at 11:04 AM 0 comments
About Hamza Perez, Mecca2medina.
Hamza Perez was originally a rap singer deeply involved in the drug world.
In American rap, drugs and sex are almost always a big part of the scene.
Hamza seemed to feel hopeless about ending his dependence on those two
things, but when he embraced Islam, he stopped immediately. Something
strange, according to Hamza: "However, I could not stop the hip-hop!"
he said.
After converting to Islam, Hamza learned of many events that harmed Islam.
For example, the FBI often destroyed mosques without any reason. He was
thrown into prison, but there Hamza taught Islam to the other prisoners.
"Now, for me, since I cannot stop making music, my music is entirely my path
of dawah," he said.
Currently, most Muslim rap and hip-hop singers in America live in the Bay
Area. According to Tyson Amir-Mustafa, 29, a San Jose resident who has
released four Islamic rap albums, Islam is still very new in the area, but
it is growing very fast.
According to Mustafa, it is very difficult to create an Islamic rap song.
As is well known, Islam is not a religion that condones dirty deeds and
words, while rap is practically synonymous with curses and insults. However,
one thing keeps Mustafa and the other rappers open: the range of themes that
can be raised within Islam is very broad. "If you want to know how Islam
differs from other religions, listen to our music. Our rap music has
integrity," he explained.
Apparently, in keeping with the times, Islamic rap music will become an
alternative chosen by more and more Americans, while also introducing and
inviting people to Islam, a religion far removed from the word terror.
(http://eramuslim.com)
Posted by paij0's at 10:25 AM 0 comments
Mecca2medina feat Raihan
i found nice comment
Music is never wrong, my friend. Even Muhammad himself asked Aisyah to
bring a group of men that played drums and music as part of a celebration.
Secondly, Umar once asked why there were singers in Rasulullah's house and
asked the singers to get out. But Muhammad said it was OK, it was Eid Fitr.
And Aishah really liked music, so he did not chase the singers out of his
home.
Music and entertainment are like salt in a dish: needed, but not too much =)
Music is not bad, nor is it haram. Music and entertainment are like salt in
a dish, needed, but just not too much... hm... a nice metaphor...!
Posted by paij0's at 9:56 AM 0 comments
Monday, September 28, 2009
How to Submit Sitemaps to Yahoo and Google
Did you know that Yahoo! has now joined Google's sitemaps system, giving
webmasters and bloggers the opportunity to register their sites and blogs for
free with Yahoo Sitemaps? Yahoo!, as we know, is the number two search engine
in the world after Google. Having our blog indexed on Yahoo! will increase
its visibility on the internet. In addition, being indexed by Yahoo! will
usually also improve our site's index on Google, and vice versa. Here is how
to register a blog/site on Yahoo! Sitemaps.
1. Go to Yahoo Sitemaps.
2. You will be asked to log in. Enter your email ID and password.
3. Enter your blog URL address in the box provided.
4. Once our blog is registered, there is a "Status" menu; click the
"authenticate" link underneath it.
5. You will have two choices: a meta tag, or a script file you have to
download. Copy/paste the code and put the meta tag in the header of your
blog.
5. A. The Yahoo Sitemaps meta tag looks, for example, like this:
<META Name="y_key" content="xxxxxxxxxxxxxxxxx"/>
5. B. The top of the header in a (blogspot/blogger) template looks like this:
<head>
<title><$BlogPageTitle$></title>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
6. Put the Yahoo! Sitemaps meta tag below the code
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
7. Click SAVE template.
8. Go back to the Yahoo Sitemaps menu and click "Ready to Authenticate".
9. Done.
Note (update): as an example of a blog page picked up by search engines,
this article is indexed by Google on the first page for the keyword
"Installing Yahoo meta tags". To check, please see here.
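A hedged way to double-check the result is to fetch your homepage and grep
for the verification tag. The file below is a stand-in for a real
`curl -s http://yourblog.blogspot.com/` (the URL and key are placeholders):

```shell
# Stand-in for the page curl would fetch from your blog.
cat > homepage.html <<'EOF'
<head>
<title>My Blog</title>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<META Name="y_key" content="xxxxxxxxxxxxxxxxx"/>
</head>
EOF

# 1 means the Yahoo verification tag survived the template edit;
# against the live site this would be:
#   curl -s http://yourblog.blogspot.com/ | grep -ci "y_key"
grep -ci "y_key" homepage.html   # prints 1
```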
===================================================
2. Meta tags with Google Sitemaps
How:
1. Log in with your GMail email ID at http://google.com
2. In the box at the top, ADD SITE -> enter your blog -> Example: http://trik-tipsblog.blogspot.com
3. Click OK.
4. After that, there is an ADD SITEMAP command in the field. Click it.
5. Select ADD GENERAL WEB SITEMAP.
6. Tick/check the menu items: (a) I've created, ff, (b) I've Uploaded, ff, (c) My Sitemap .. ff.
7. In the MY SITEMAP URL IS menu, fill in your blog feed.
Example:
http://trik-tipsblog.blogspot.com/atom.xml
8. Click ADD WEB SITEMAP.
9. The menu will show information that our sitemap is pending.
10. Click the Diagnostic menu at the top.
11. Click Verify. There is information that the Verification Status is: NOT verified.
12. In the "Choose verification method" menu -> select ADD A META tag.
13. We are given a code like
<Meta name = "dst ...">
14. Enter that meta code in the HEAD section of our blog template, between
<HEAD> and </HEAD>, preferably under <title><BlogPageTitle></title>
15. After you SAVE and republish the blog template, go back to the sitemap
and tick/check the menu "I've added the META tag in the homepage."
16. Click Verify.
17. Finished.
Important notes:
1. This how-to/tip is only for free blogs like blogger/blogspot, blogdrive, etc.
2. For a blog on your own top-level domain and hosting, the method can still
work, but it is a little more complicated unless you have html/xml
experience. A description can be seen below:
http://www.google.com
Posted by paij0's at 2:53 PM 0 comments
My Blogspot Not Verified under Google Webmaster Tools
After tinkering and staring at the HTML code in the Blogger layout, the
mistake was simply where I put the tag: before versus after the title tag...
it turns out
so if you put the generator meta tag (the one given by Google Webmaster
Tools) after the title tag (
Posted by paij0's at 10:59 AM 0 comments
Wednesday, September 2, 2009
It Is Easy to Use Flock for Blog Post
This is a quick, simple post using the best browser for social networking:
Flock, which has a nice, quick setup for blog posting via the standard
built-in Blog Editor feature in the Tools tab....
and this is the test....
Friday, May 8, 2009
Top Malware Attacks in Indonesia
Net-Worm.Win32.Kido.ih 49.4841%
HEUR:Trojan.Win32.Generic 8.0257%
Heur.Win32.Trojan.Generic 6.3747%
Trojan-Dropper.Win32.Small.axv 3.2791%
Trojan-Dropper.Win32.Agent.zje 2.7517%
Trojan-Downloader.Win32.Agent.wxq 2.7058%
Trojan-Downloader.Win32.Agent.ansh 2.7058%
Heur.Win32.Invader 2.0179%
Trojan-DDoS.Win32.Agent.ei 1.5363%
HEUR:Trojan.Win32.Invader 1.4676%
Net-Worm.Win32.Kido.eo 1.1236%
Backdoor.Win32.VB.iqo 1.1236%
Trojan.Win32.Agent.ceaj 0.986%
Trojan-Downloader.Win32.FraudLoad.ehp 0.9172%
Trojan.Win32.Agent.ceau 0.8255%
HEUR:Trojan-Downloader.Win32.Generic 0.7567%
Trojan-Downloader.Win32.Injecter.crw 0.6191%
Trojan-Downloader.Win32.Injecter.cqd 0.5733%
Trojan-Downloader.Win32.FraudLoad.eiq 0.5274%
not-a-virus:AdWare.Win32.Shopper.ar 0.5274%
Trojan-Downloader.Win32.Agent.bjts 0.4815%
Heur.Win32.Downloader 0.4357%
Trojan-Downloader.Win32.CodecPack.fus 0.4127%
Rootkit.Win32.Podnuha.bsd 0.3898%
Trojan.Win32.Agent.cexo 0.321%
Trojan-Downloader.Win32.Small.jqz 0.2981%
Trojan-Downloader.Win32.Agent.bjhd 0.2752%
Trojan-Downloader.Win32.FraudLoad.eih 0.2752%
not-a-virus:AdWare.Win32.Agent.lmz 0.2064%
Trojan-Dropper.Win32.VB.yab 0.1834%
Trojan-Downloader.Win32.FraudLoad.eik 0.1834%
Trojan-Spy.Win32.Ardamax.e 0.1605%
MultiPacked.Multi.Generic 0.1605%
Net-Worm.Win32.Kido.dam.y 0.1605%
Trojan-GameThief.Win32.Magania.baex 0.1376%
Trojan-Downloader.Win32.FraudLoad.ehz 0.1376%
Trojan.Win32.Agent2.ils 0.1376%
Net-Worm.Win32.Kido.fc 0.1376%
Trojan.Win32.Buzus.fit 0.1376%
Trojan-PSW.Win32.IMMultiPass.od 0.1376%
not-a-virus:AdWare.Win32.Shopper.v 0.1147%
Trojan-Downloader.MSIL.Agent.bn 0.1147%
Trojan-GameThief.Win32.WOW.bie 0.1147%
Trojan.Win32.Tdss.abzl 0.1147%
Virus.Win32.Sality.z 0.1147%
not-a-virus:FraudTool.Win32.MalwareDoctor.d 0.1147%
Trojan.Win32.Buzus.axkt 0.0917%
Backdoor.Win32.Agent.aexg 0.0917%
Trojan-Dropper.Win32.Small.ayg 0.0917%
Packed.Win32.Black.d 0.0917%
Trojan.Win32.Pakes.lmb 0.0917%
Trojan.Win32.Agent.cbst 0.0917%
not-a-virus:AdWare.Win32.OneStep.z 0.0917%
Trojan-Downloader.Win32.Geral.hu 0.0917%
Hoax.MSIL.BadJoke.Agent.s 0.0917%
SuspiciousPacker.Multi.Generic 0.0917%
Trojan.Win32.Monder.cavv 0.0688%
Trojan.Win32.Buzus.avhh 0.0688%
Trojan-Downloader.Win32.FraudLoad.eil 0.0688%
Trojan-Downloader.Win32.Agent.bvpa 0.0688%
Trojan-Downloader.Win32.Horst.bc 0.0688%
Backdoor.Win32.Inject.aav 0.0688%
Trojan.Win32.VB.odh 0.0688%
Trojan-Spy.Win32.Ardamax.n 0.0688%
Trojan.Win32.Dialer.ext 0.0688%
Trojan-Downloader.JS.Agent.gj 0.0688%
Trojan.Win32.Agent.arjp 0.0688%
Trojan-Dropper.Win32.Microjoin.ap 0.0688%
not-a-virus:Client-IRC.Win32.mIRC.631 0.0688%
HEUR:Worm.Win32.Generic 0.0688%
not-a-virus:AdWare.Win32.SaveNow.cp 0.0688%
Trojan-Dropper.Win32.Agent.doi 0.0688%
Trojan.Win32.Agent2.hmp 0.0688%
Virus.Win32.Sality.aa 0.0688%
HEUR:Trojan.Win32.StartPage 0.0688%
Heur.Win32.Backdoor.Generic 0.0688%
Backdoor.Win32.Hupigon.aovn 0.0688%
Trojan-GameThief.Win32.Lmir.cny 0.0459%
Trojan-Dropper.Win32.ExeBinder.e 0.0459%
not-a-virus:FraudTool.Win32.PrivacyCenter.k 0.0459%
Packed.Win32.Black.a 0.0459%
not-a-virus:FraudTool.Win32.ErrorDoctor.d 0.0459%
Trojan.Win32.Tdss.abwc 0.0459%
Trojan-Downloader.Win32.Agent.akwa 0.0459%
Virus.Win32.Alman.b 0.0459%
Trojan-Banker.Win32.Banker.adks 0.0459%
not-a-virus:AdWare.Win32.Sahat.as 0.0459%
Packed.Win32.Krap.n 0.0459%
Backdoor.Win32.Bifrose.foo 0.0459%
not-a-virus:FraudTool.Win32.InternetAntivirus 0.0459%
Backdoor.Win32.Rbot.jxa 0.0459%
Trojan-Dropper.Win32.Tiny.cf 0.0459%
Trojan-PSW.Win32.LdPinch.afeb 0.0459%
Trojan-Dropper.Win32.VB.jue 0.0459%
not-a-virus:NetTool.Win32.Portscan.c 0.0459%
Trojan.Win32.Monder.byqu 0.0459%
Trojan.Win32.Delf.kym 0.0459%
Net-Worm.Win32.Koobface.in 0.0459%
Trojan.Win32.Agent.ccpe 0.0459%
Virus.Win32.VB.ki 0.0459%
Posted by paij0's at 12:56 AM
A Number of Oddities in the Antasari Case
by Riza Fahriza
Jakarta, 8/5 (ANTARA) - For the past week, coverage of nonactive Corruption Eradication Commission (KPK) Chairman Antasari Azhar, now a suspect in the alleged murder of the director of PT Putra Rajawali Banjaran (PRB), has consistently occupied the front pages of the print media.
The same is true of the electronic media. One national private television station even created a special logo for its coverage of the case engulfing the man known as the "champion of corruption eradication."
On the other hand, complaints have emerged, among them the claim that coverage of the case amounts to character assassination of a man with a remarkable record of exposing corruption in the country.
The flood of news about Antasari can be said to have begun when a number of journalists received a text message from an unknown number reading: "Ass.ww. Dear First Lady, the shooter of Nasrudin, director of a subsidiary of RNI, has been arrested and has confessed to being paid by Antasari; we urge the government to announce this immediately and promptly arrest the KPK Chairman."
Journalists received that text message on Thursday (30/4). The next day, reports on the matter began to fill the mass media, without exception.
On the very day the big story broke, the media were handed fresh "bait": the Attorney General's Office announced to the press that Antasari Azhar had been named a suspect in the case. That status, it said, was based on a letter from the National Police's Criminal Investigation Agency.
At the time, the Head of the Legal Information Center at the Attorney General's Office, Jasman Pandjaitan, said the letter from National Police Headquarters was classified. Journalists whispered among themselves about a "secret" letter whose contents had just been announced in the open.
The Attorney General's Office used that letter as the basis for imposing a travel ban on Antasari Azhar.
Normally, announcing that someone is a suspect is the prerogative of the police. For journalists accustomed to covering legal cases, this was the second oddity after the text message from an unknown sender.
According to Indonesia Corruption Watch (ICW) researcher Febri Diansyah, announcing the naming of a suspect is the authority of the investigator, namely the police.
Before the announcement, a number of senior National Police officials visited the office of Attorney General Hendarman Supandji on Friday morning (1/5).
Yet every prosecution official kept silent when asked about that meeting.
The Attorney General's Office also stayed silent when journalists asked about the legal basis, or the reasons, that led the prosecutors to get ahead of the police in naming Antasari a suspect.
At that Friday announcement, Jasman Pandjaitan stated that police investigators were already investigating the premeditated murder of Nasrudin, committed in Tangerang on 14 March 2009. The announcement also named "AA" as the intellectual actor behind the murder.
As journalists sensed the police's caution in the case, the story kept developing with new material suggesting a love affair behind the murder.
The name of Rhani Juliani surfaced, a caddy at the Modernland Golf Course in Tangerang who was said to have ties to both Antasari and Nasrudin.
Finally, on Monday (4/5), three days after the announcement at the Attorney General's Office, the police named Antasari Azhar a suspect.
However, no explanation was given of the motive for the murder.
The announcement came at midday, after police had questioned Antasari that morning.
Antasari Azhar was then detained at the narcotics detention center of the Jakarta Metropolitan Police (Polda Metro Jaya). He faces up to life imprisonment under Article 340 of the Criminal Code (KUHP) on premeditated murder.
Coverage of Antasari Azhar continued to swell.
Meeting denied
Meanwhile, Attorney General Hendarman Supandji denied that any special meeting had been held ahead of the detention of nonactive KPK Chairman Antasari Azhar.
"No," he said when asked to confirm whether such a meeting had taken place. That brief answer was all he gave when met after attending a coordination meeting on handling election-result disputes at the Constitutional Court (MK) building in Jakarta on Thursday.
The report he denied claimed that, before Antasari's detention, a meeting had been held with a number of parties connected to the alleged murder of PT Putra Rajawali Banjaran (PRB) Director Nasrudin Zulkarnaen.
The Attorney General then stated that his office had received a Notice of Commencement of Investigation (SPDP) from the police.
"I only received notice that an investigation had begun. Since the locus delicti (scene of the crime) lies within the jurisdiction of the Banten High Prosecutor's Office, I asked that prosecutors from the Banten office be assigned," he said.
When journalists asked whether the prosecutors' announcement of Antasari Azhar as a suspect, ahead of any police statement, was a form of rivalry with the KPK, the Attorney General replied, "Retaliation would mean being hit and hitting back; there is none of that here."
He also claimed that his office had disclosed Antasari Azhar's suspect status in the murder case only because journalists asked.
That account contradicts what happened at the press conference on Friday (1/5). It was there that the name "AA," described as the intellectual actor, came from the mouth of Jasman Pandjaitan, Head of the Legal Information Center at the Attorney General's Office.
Jasman was at that moment announcing the classified police letter received by the Attorney General's Office, not answering journalists' questions.
Juniver Girsang SH, one of Antasari Azhar's defense lawyers, said there was a grand scenario behind the murder of Nasrudin Zulkarnaen.
"Another party wants to steer things so that Antasari becomes a suspect," Juniver Girsang said.
He said the coverage of Antasari in connection with Nasrudin's murder was considered excessive, at times getting ahead of the investigators, with some reports already declaring Antasari a suspect.
According to Girsang, it cannot be ruled out that Antasari is being steered into suspect status because he has often exposed large-scale corruption cases.
Another of his lawyers, Ari Yusuf Amir, deplored the prosecutors' move to announce his client's status as a suspect.
"We deplore the Attorney General's Office's action, because that is not its authority," he told ANTARA.
He said the office had been too quick to draw conclusions.
Public suspicion
The naming of Antasari Azhar as a suspect has also raised questions among members of the public.
"I don't believe the accusations against Antasari Azhar. He has been fighting corruption so doggedly that he must have many enemies," said a resident who came to Polda Metro Jaya shortly before Antasari Azhar's questioning.
Antasari Azhar's family likewise refused to believe the accusation.
"I am one hundred percent sure; Antasari Azhar could not possibly do something so foolish," said Ariman Azhar, Antasari Azhar's older brother.
He explained that his younger brother has two daughters, both already doctors, which makes such an act inconceivable to him.
Asked whether his brother was the victim of a conspiracy in this case, he answered, "No comment."
A similar view came from Yuniar, a friend from Antasari Azhar's master's (S2) program, who said she did not believe the allegations against him.
"I don't believe it. There is a conspiracy here. Especially since, as KPK Chairman, he handled so many corruption cases. I know him personally," said Antasari's graduate-school classmate. ***4***
(T.R021/B/s018/s018) 08-05-2009 10:13:31
Posted by paij0's at 12:49 AM
Labels: interesting
Thursday, May 7, 2009
What Are Widgets?
What is a widget, and where can you download free widgets?
What exactly are widgets?
Widgets can basically be thought of as shareable gadgets. A web widget, then, is a portable piece of code that can be used easily on numerous websites. To display a widget on your own website, you simply insert a snippet of code into your HTML. The widget's main programming code is stored on the original developer's web host, but its graphical interface is displayed on your site.
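Embedding one is usually a copy-and-paste job. As a minimal sketch (the exact snippet always comes from the widget provider, so the script URL and element id below are purely illustrative):

```html
<!-- Placeholder element where the widget's interface will render -->
<div id="calorie-widget"></div>

<!-- The widget's logic loads from the provider's server.
     This script URL is illustrative, not a real endpoint. -->
<script type="text/javascript"
        src="http://widgets.example.com/calorie-calculator.js"></script>
```

You paste the snippet wherever the widget should appear; the script draws its interface into your page while the heavy lifting stays on the provider's server.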
For example, LabPixies has a Calorie Calculator widget (gadget). For webmasters of health sites, this type of gadget is ideal for keeping visitors coming back each day.
Note for webmasters: it's important to remember that the widget is served from a third party and will only load as fast as their web server allows. If you use a popular widget on your website, it could slow down your page load time.
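One common way to soften this, sketched below with an illustrative URL, is to load the widget script with the async attribute, so the browser keeps rendering the rest of the page while the third-party file downloads:

```html
<!-- "async" downloads the third-party script without blocking
     the rest of the page; the script URL here is illustrative -->
<script async type="text/javascript"
        src="http://widgets.example.com/calorie-calculator.js"></script>
```

The trade-off is that the widget may appear a moment after the surrounding content, which is usually preferable to the whole page waiting on someone else's server.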
Listed below are some great sites where you can download free web widgets:
LabPixies
has some of the coolest gadgets / widgets on the web, including games such as Invaders, Backgammon, Marbles and many others. Their entertainment widgets include Flickr slides, YouTube top feeds, Yahoo Music and iTunes feeds, while their tools category includes widget clocks, calculators, to-do lists and many others.
Widgipedia
also has some terrific web widgets, including Valentine's Day gadgets, Flash buttons, questions and answers, support forms, order forms and many others. Every widget also states whether it can be used on MySpace profiles.
Not all widgets are for use on websites. Yahoo Widgets, for example, offers a huge collection of Windows and Mac widgets that you can download to your desktop: news feeds, cam viewers, system utilities, games and heaps of others.
Last but not least, we can't go past Google Gadgets, which offers over 3,000 widgets. You're sure to find a few to add to your website to help keep those visitors returning often. (from www.hypergurl.com)
Posted by paij0's at 2:07 AM