from __future__ import absolute_import, division, unicode_literals
from future.builtins import str
from future.backports import urllib
from future.backports.urllib import parse as _parse, request as _request
urllib.parse = _parse
urllib.request = _request

__all__ = ["RobotFileParser"]


class RobotFileParser(object):
    """ This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file.
    """

    def __init__(self, url=''):
        self.entries = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically.
        """
        return self.last_checked

    def modified(self):
        """Sets the time the robots.txt file was last fetched to the
        current time.
        """
        import time
        self.last_checked = time.time()

    def set_url(self, url):
        """Sets the URL referring to a robots.txt file."""
        self.url = url
        self.host, self.path = urllib.parse.urlparse(url)[1:3]

    def read(self):
        """Reads the robots.txt URL and feeds it to the parser."""
        try:
            f = urllib.request.urlopen(self.url)
        except urllib.error.HTTPError as err:
            if err.code in (401, 403):
                self.disallow_all = True
            elif err.code >= 400:
                self.allow_all = True
        else:
            raw = f.read()
            self.parse(raw.decode("utf-8").splitlines())

    def _add_entry(self, entry):
        if "*" in entry.useragents:
            # the default entry is considered last
            if self.default_entry is None:
                # the first default entry wins
                self.default_entry = entry
        else:
            self.entries.append(entry)

    def parse(self, lines):
        """Parse the input lines from a robots.txt file.

        We allow that a user-agent: line is not preceded by
        one or more blank lines.
        """
        # states:
        #   0: start state
        #   1: saw user-agent line
        #   2: saw an allow or disallow line
        state = 0
        entry = Entry()

        for line in lines:
            if not line:
                if state == 1:
                    entry = Entry()
                    state = 0
                elif state == 2:
                    self._add_entry(entry)
                    entry = Entry()
                    state = 0
            # remove optional comment and strip line
            i = line.find('#')
            if i >= 0:
                line = line[:i]
            line = line.strip()
            if not line:
                continue
            line = line.split(':', 1)
            if len(line) == 2:
                line[0] = line[0].strip().lower()
                line[1] = urllib.parse.unquote(line[1].strip())
                if line[0] == "user-agent":
                    if state == 2:
                        self._add_entry(entry)
                        entry = Entry()
                    entry.useragents.append(line[1])
                    state = 1
                elif line[0] == "disallow":
                    if state != 0:
                        entry.rulelines.append(RuleLine(line[1], False))
                        state = 2
                elif line[0] == "allow":
                    if state != 0:
                        entry.rulelines.append(RuleLine(line[1], True))
                        state = 2
        if state == 2:
            self._add_entry(entry)

    def can_fetch(self, useragent, url):
        """using the parsed robots.txt decide if useragent can fetch url"""
        if self.disallow_all:
            return False
        if self.allow_all:
            return True
        # search for given user agent matches
        # the first match counts
        parsed_url = urllib.parse.urlparse(urllib.parse.unquote(url))
        url = urllib.parse.urlunparse(('', '', parsed_url.path,
                                       parsed_url.params, parsed_url.query,
                                       parsed_url.fragment))
        url = urllib.parse.quote(url)
        if not url:
            url = "/"
        for entry in self.entries:
            if entry.applies_to(useragent):
                return entry.allowance(url)
        # try the default entry last
        if self.default_entry:
            return self.default_entry.allowance(url)
        # agent not found ==> access granted
        return True

    def __str__(self):
        return ''.join([str(entry) + "\n" for entry in self.entries])


class RuleLine(object):
    """A rule line is a single "Allow:" (allowance==True) or "Disallow:"
    (allowance==False) followed by a path."""

    def __init__(self, path, allowance):
        if path == '' and not allowance:
            # an empty value means allow all
            allowance = True
        self.path = urllib.parse.quote(path)
        self.allowance = allowance

    def applies_to(self, filename):
        return self.path == "*" or filename.startswith(self.path)

    def __str__(self):
        return (self.allowance and "Allow" or "Disallow") + ": " + self.path


class Entry(object):
    """An entry has one or more user-agents and zero or more rulelines"""

    def __init__(self):
        self.useragents = []
        self.rulelines = []

    def __str__(self):
        ret = []
        for agent in self.useragents:
            ret.extend(["User-agent: ", agent, "\n"])
        for line in self.rulelines:
            ret.extend([str(line), "\n"])
        return ''.join(ret)

    def applies_to(self, useragent):
        """check if this entry applies to the specified agent"""
        # split the name token and make it lower case
        useragent = useragent.split("/")[0].lower()
        for agent in self.useragents:
            if agent == '*':
                # we have the catch-all agent
                return True
            agent = agent.lower()
            if agent in useragent:
                return True
        return False

    def allowance(self, filename):
        """Preconditions:
        - our agent applies to this entry
        - filename is URL decoded"""
        for line in self.rulelines:
            if line.applies_to(filename):
                return line.allowance
        return True
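A minimal usage sketch of the `RobotFileParser` API (`parse()` / `can_fetch()`). It is shown against the stdlib `urllib.robotparser`, the Python 3 module this file backports; note the sketch calls `modified()` before querying, because recent CPython versions refuse `can_fetch()` until `last_checked` is non-zero, a guard this older backport does not have.

```python
# Usage sketch (assumption: stdlib urllib.robotparser mirrors the API above).
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.modified()  # stamp last_checked; recent CPython gates can_fetch() on it
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("MyBot", "/private/secret.html"))  # -> False
print(rp.can_fetch("MyBot", "/docs/index.html"))      # -> True
```

Because "MyBot" matches no named entry, the catch-all `*` entry applies, and the first matching rule line decides the answer; a path with no matching rule is allowed by default.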