# [Recovered from a compiled Python 2.7 module (jinja2/lexer.pyc); binary
#  bytecode and a stray JPEG header have been stripped. Only the docstrings
#  survive legibly; function and method bodies are elided.]

"""
    jinja2.lexer
    ~~~~~~~~~~~~

    This module implements a Jinja / Python combination lexer. The
    `Lexer` class provided by this module is used to do some preprocessing
    for Jinja.

    On the one hand it filters out invalid operators like the bitshift
    operators we don't allow in templates. On the other hand it separates
    template code and python code in expressions.

    :copyright: (c) 2017 by the Jinja Team.
    :license: BSD, see LICENSE for more details.
"""


def describe_token(token):
    """Returns a description of the token."""
    ...


def describe_token_expr(expr):
    """Like `describe_token` but for token expressions."""
    ...


def count_newlines(value):
    """Count the number of newline characters in the string.  This is
    useful for extensions that filter a stream.
    """
    ...


def compile_rules(environment):
    """Compiles all the rules from the environment into a list of rules."""
    ...


class Failure(object):
    """Class that raises a `TemplateSyntaxError` if called.
    Used by the `Lexer` to specify known errors.
    """


class Token(tuple):
    """Token class."""

    def test(self, expr):
        """Test a token against a token expression.  This can either be a
        token type or ``'token_type:token_value'``.  This can only test
        against string values and types.
        """
        ...

    def test_any(self, *iterable):
        """Test against multiple token expressions."""
        ...


class TokenStreamIterator(object):
    """The iterator for token streams.  Iterate over the stream
    until the eof token is reached.
    """


class TokenStream(object):
    """A token stream is an iterable that yields :class:`Token`\\s.  The
    parser however does not iterate over it but calls :meth:`next` to go
    one token ahead.  The current active token is stored as :attr:`current`.
    """

    def push(self, token):
        """Push a token back to the stream."""
        ...

    def look(self):
        """Look at the next token."""
        ...

    def skip(self, n=1):
        """Go n tokens ahead."""
        ...

    def next_if(self, expr):
        """Perform the token test and return the token if it matched.
        Otherwise the return value is `None`.
        """
        ...

    def skip_if(self, expr):
        """Like :meth:`next_if` but only returns `True` or `False`."""
        ...

    def __next__(self):
        """Go one token ahead and return the old one.  Use the built-in
        :func:`next` instead of calling this directly.
        """
        ...

    def close(self):
        """Close the stream."""
        ...

    def expect(self, expr):
        """Expect a given token type and return it.  This accepts the same
        argument as :meth:`jinja2.lexer.Token.test`.
        """
        ...


class Lexer(object):
    """The lexer for a given environment."""

    def _normalize_newlines(self, value):
        """Called for strings and template data to normalize it to unicode."""
        ...

    def tokenize(self, source, name=None, filename=None, state=None):
        """Calls tokeniter + tokenize and wraps it in a token stream."""
        ...

    def wrap(self, stream, name=None, filename=None):
        """This is called with the stream as returned by `tokenize` and wraps
        every token in a :class:`Token` and converts the value.
        """
        ...

    def tokeniter(self, source, name, filename=None, state=None):
        """This method tokenizes the text and returns the tokens in a
        generator.  Use this method if you just want to tokenize a template.
        """
        ...
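The core idea of this module — scanning a template and separating plain
template data from `{{ ... }}` expressions and `{% ... %}` statement blocks —
can be illustrated with a minimal toy tokenizer. This is a hedged sketch of the
concept only, not jinja2's actual `Lexer` (which also handles whitespace
control, raw blocks, line statements, and configurable delimiters); the
`mini_tokenize` name and its regex are invented for this example.

```python
import re

# Toy tokenizer: split a template into (token_type, value) pairs for
# raw data, {{ ... }} variable expressions, and {% ... %} blocks.
# Alternatives are ordered so delimiters win over plain data.
TOKEN_RE = re.compile(
    r"(?s)(?P<variable>\{\{.*?\}\})"
    r"|(?P<block>\{%.*?%\})"
    r"|(?P<data>.+?(?=\{\{|\{%|$))"
)


def mini_tokenize(source):
    """Yield (token_type, value) pairs for a template string."""
    for match in TOKEN_RE.finditer(source):
        kind = match.lastgroup
        value = match.group()
        if kind == "variable":
            yield "variable", value[2:-2].strip()   # inner expression
        elif kind == "block":
            yield "block", value[2:-2].strip()      # inner statement
        else:
            yield "data", value                     # plain template text


tokens = list(mini_tokenize("Hello {{ name }}!{% if x %}yes{% endif %}"))
```

Running this yields alternating data/expression/block tokens, mirroring how the
real lexer's `tokeniter` produces a flat token stream that the parser then
consumes via `TokenStream`.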