# -*- coding: utf-8 -*-
"""
    pygments.lexer
    ~~~~~~~~~~~~~~

    Base lexer classes.

    :copyright: Copyright 2006-2010 by the Pygments team, see AUTHORS.
    :license: BSD, see LICENSE for details.
"""
import re

from pygments.filter import apply_filters, Filter
from pygments.filters import get_filter_by_name
from pygments.token import Error, Text, Other, _TokenType
from pygments.util import get_bool_opt, get_int_opt, get_list_opt, \
     make_analysator


__all__ = ['Lexer', 'RegexLexer', 'ExtendedRegexLexer', 'DelegatingLexer',
           'LexerContext', 'include', 'bygroups', 'using', 'this']


_default_analyse = staticmethod(lambda x: 0.0)


class LexerMeta(type):
    """
    This metaclass automagically converts ``analyse_text`` methods into
    static methods which always return float values.
    """

    def __new__(cls, name, bases, d):
        if 'analyse_text' in d:
            d['analyse_text'] = make_analysator(d['analyse_text'])
        return type.__new__(cls, name, bases, d)


class Lexer(object):
    """
    Lexer for a specific language.

    Basic options recognized:

    ``stripnl``
        Strip leading and trailing newlines from the input (default: True).
    ``stripall``
        Strip all leading and trailing whitespace from the input
        (default: False).
    ``tabsize``
        If given and greater than 0, expand tabs in the input (default: 0).
    ``encoding``
        Encoding used to convert byte-string input to Unicode
        (default: ``'latin1'``).  Can also be ``'guess'`` for a simple
        UTF-8 / Latin-1 detection, or ``'chardet'`` to use the chardet
        library, if it is installed.
    ``filters``
        A list of filters (or filter names) to add at construction time.
    """

    #: Name of the lexer
    name = None

    #: Shortcuts for the lexer
    aliases = []

    #: fn match rules
    filenames = []

    #: fn alias filenames
    alias_filenames = []

    #: mime types
    mimetypes = []

    __metaclass__ = LexerMeta

    def __init__(self, **options):
        self.options = options
        self.stripnl = get_bool_opt(options, 'stripnl', True)
        self.stripall = get_bool_opt(options, 'stripall', False)
        self.tabsize = get_int_opt(options, 'tabsize', 0)
        self.encoding = options.get('encoding', 'latin1')
        self.filters = []
        for filter_ in get_list_opt(options, 'filters', ()):
            self.add_filter(filter_)

    def __repr__(self):
        if self.options:
            return '<pygments.lexers.%s with %r>' % (self.__class__.__name__,
                                                     self.options)
        else:
            return '<pygments.lexers.%s>' % self.__class__.__name__

    def add_filter(self, filter_, **options):
        """
        Add a new stream filter to this lexer.
        """
        if not isinstance(filter_, Filter):
            filter_ = get_filter_by_name(filter_, **options)
        self.filters.append(filter_)

    def analyse_text(text):
        """
        Has to return a float between ``0`` and ``1`` that indicates
        if a lexer wants to highlight this text. Used by ``guess_lexer``.
        If this method returns ``0`` it won't highlight it in any case, if
        it returns ``1`` highlighting with this lexer is guaranteed.

        The `LexerMeta` metaclass automatically wraps this function so
        that it works like a static method (no ``self`` or ``cls``
        parameter) and the return value is automatically converted to
        `float`. If the return value is an object that is boolean `False`
        it's the same as if the return values was ``0.0``.
        """

    def get_tokens(self, text, unfiltered=False):
        """
        Return an iterable of (tokentype, value) pairs generated from
        `text`. If `unfiltered` is set to `True`, the filtering mechanism
        is bypassed even if filters are defined.

        Also preprocess the text, i.e. expand tabs and strip it if
        wanted and applies registered filters.
        """
        if not isinstance(text, unicode):
            if self.encoding == 'guess':
                try:
                    text = text.decode('utf-8')
                    if text.startswith(u'\ufeff'):
                        text = text[len(u'\ufeff'):]
                except UnicodeDecodeError:
                    text = text.decode('latin1')
            elif self.encoding == 'chardet':
                try:
                    import chardet
                except ImportError:
                    raise ImportError('To enable chardet encoding guessing, '
                                      'please install the chardet library '
                                      'from http://chardet.feedparser.org/')
                enc = chardet.detect(text)
                text = text.decode(enc['encoding'])
            else:
                text = text.decode(self.encoding)
        # text now *is* a unicode string
        text = text.replace('\r\n', '\n')
        text = text.replace('\r', '\n')
        if self.stripall:
            text = text.strip()
        elif self.stripnl:
            text = text.strip('\n')
        if self.tabsize > 0:
            text = text.expandtabs(self.tabsize)
        if not text.endswith('\n'):
            text += '\n'

        def streamer():
            for i, t, v in self.get_tokens_unprocessed(text):
                yield t, v
        stream = streamer()
        if not unfiltered:
            stream = apply_filters(stream, self.filters, self)
        return stream

    def get_tokens_unprocessed(self, text):
        """
        Return an iterable of (tokentype, value) pairs.
        In subclasses, implement this method as a generator to
        maximize effectiveness.
        """
        raise NotImplementedError
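

# --- Illustrative sketch (not part of the original module) ------------------
# The base `Lexer` contract: a subclass implements `get_tokens_unprocessed()`
# as a generator of (index, tokentype, value) triples and inherits
# `get_tokens()`, which layers input decoding, whitespace normalization and
# filter application on top of it.  The class below is a hypothetical toy
# used only to demonstrate that contract.

class _WholeTextLexer(Lexer):
    """Toy lexer that emits its whole input as a single Text token."""
    name = 'WholeTextExample'
    aliases = []

    def get_tokens_unprocessed(self, text):
        # one (startindex, tokentype, value) triple covering all of `text`
        yield 0, Text, text

# Typical use (yields (tokentype, value) pairs after filtering):
#     for ttype, value in _WholeTextLexer(stripnl=False).get_tokens('x\n'):
#         ...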


class DelegatingLexer(Lexer):
    """
    This lexer takes two lexer as arguments. A root lexer and
    a language lexer. First everything is scanned using the language
    lexer, afterwards all ``Other`` tokens are lexed using the root
    lexer.

    The lexers from the ``template`` lexer package use this base lexer.
    """

    def __init__(self, _root_lexer, _language_lexer, _needle=Other, **options):
        self.root_lexer = _root_lexer(**options)
        self.language_lexer = _language_lexer(**options)
        self.needle = _needle
        Lexer.__init__(self, **options)

    def get_tokens_unprocessed(self, text):
        buffered = ''
        insertions = []
        lng_buffer = []
        for i, t, v in self.language_lexer.get_tokens_unprocessed(text):
            if t is self.needle:
                if lng_buffer:
                    insertions.append((len(buffered), lng_buffer))
                    lng_buffer = []
                buffered += v
            else:
                lng_buffer.append((i, t, v))
        if lng_buffer:
            insertions.append((len(buffered), lng_buffer))
        return do_insertions(insertions,
                             self.root_lexer.get_tokens_unprocessed(buffered))


class include(str):
    """
    Indicates that a state should include rules from another state.
    """
    pass


class combined(tuple):
    """
    Indicates a state combined from multiple states.
    """

    def __new__(cls, *args):
        return tuple.__new__(cls, args)

    def __init__(self, *args):
        # tuple.__init__ doesn't do anything
        pass


class _PseudoMatch(object):
    """
    A pseudo match object constructed from a string.
    """

    def __init__(self, start, text):
        self._text = text
        self._start = start

    def start(self, arg=None):
        return self._start

    def end(self, arg=None):
        return self._start + len(self._text)

    def group(self, arg=None):
        if arg:
            raise IndexError('No such group')
        return self._text

    def groups(self):
        return (self._text,)

    def groupdict(self):
        return {}


def bygroups(*args):
    """
    Callback that yields multiple actions for each group in the match.
    """
    def callback(lexer, match, ctx=None):
        for i, action in enumerate(args):
            if action is None:
                continue
            elif type(action) is _TokenType:
                data = match.group(i + 1)
                if data:
                    yield match.start(i + 1), action, data
            else:
                if ctx:
                    ctx.pos = match.start(i + 1)
                for item in action(lexer,
                                   _PseudoMatch(match.start(i + 1),
                                                match.group(i + 1)), ctx):
                    if item:
                        yield item
        if ctx:
            ctx.pos = match.end()
    return callback


class _This(object):
    """
    Special singleton used for indicating the caller class.
    Used by ``using``.
    """

this = _This()


def using(_other, **kwargs):
    """
    Callback that processes the match with a different lexer.

    The keyword arguments are forwarded to the lexer, except `state` which
    is handled separately: it specifies the state (a name, or a list/tuple
    of state names forming the initial stack) the new lexer starts in.
    """
    gt_kwargs = {}
    if 'state' in kwargs:
        s = kwargs.pop('state')
        if isinstance(s, (list, tuple)):
            gt_kwargs['stack'] = s
        else:
            gt_kwargs['stack'] = ('root', s)

    if _other is this:
        def callback(lexer, match, ctx=None):
            # if keyword arguments are given the callback
            # function has to create a new lexer instance
            if kwargs:
                # XXX: cache that somehow
                kwargs.update(lexer.options)
                lx = lexer.__class__(**kwargs)
            else:
                lx = lexer
            s = match.start()
            for i, t, v in lx.get_tokens_unprocessed(match.group(), **gt_kwargs):
                yield i + s, t, v
            if ctx:
                ctx.pos = match.end()
    else:
        def callback(lexer, match, ctx=None):
            # XXX: cache that somehow
            kwargs.update(lexer.options)
            lx = _other(**kwargs)

            s = match.start()
            for i, t, v in lx.get_tokens_unprocessed(match.group(), **gt_kwargs):
                yield i + s, t, v
            if ctx:
                ctx.pos = match.end()
    return callback
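

# --- Illustrative sketch (not part of the original module) ------------------
# How `bygroups` and `using` are typically combined inside a `tokens` table:
# `bygroups` splits one regex match into one token per capturing group, and
# `using(this)` re-lexes a captured group with the current lexer (resolved
# lazily through the `this` singleton above).  The helper name
# `_example_assignment_rule` and the token choices are hypothetical.

def _example_assignment_rule():
    """Return a rule tuple shaped like an entry in a ``tokens`` table."""
    from pygments.token import Name, Operator, Whitespace
    return (r'(\w+)(\s*)(=)(\s*)(.*)',
            bygroups(Name.Variable, Whitespace, Operator, Whitespace,
                     using(this)))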


class RegexLexerMeta(LexerMeta):
    """
    Metaclass for RegexLexer, creates the self._tokens attribute from
    self.tokens on the first instantiation.
    """

    def _process_state(cls, unprocessed, processed, state):
        assert type(state) is str, "wrong state name %r" % state
        assert state[0] != '#', "invalid state name %r" % state
        if state in processed:
            return processed[state]
        tokens = processed[state] = []
        rflags = cls.flags
        for tdef in unprocessed[state]:
            if isinstance(tdef, include):
                # it's a state reference
                assert tdef != state, "circular state reference %r" % state
                tokens.extend(cls._process_state(unprocessed, processed,
                                                 str(tdef)))
                continue

            assert type(tdef) is tuple, "wrong rule def %r" % (tdef,)

            try:
                rex = re.compile(tdef[0], rflags).match
            except Exception, err:
                raise ValueError("uncompilable regex %r in state %r of %r: %s"
                                 % (tdef[0], state, cls, err))

            assert type(tdef[1]) is _TokenType or callable(tdef[1]), \
                   'token type must be simple type or callable, not %r' \
                   % (tdef[1],)

            if len(tdef) == 2:
                new_state = None
            else:
                tdef2 = tdef[2]
                if isinstance(tdef2, str):
                    # an existing state
                    if tdef2 == '#pop':
                        new_state = -1
                    elif tdef2 in unprocessed:
                        new_state = (tdef2,)
                    elif tdef2 == '#push':
                        new_state = tdef2
                    elif tdef2[:5] == '#pop:':
                        new_state = -int(tdef2[5:])
                    else:
                        assert False, 'unknown new state %r' % tdef2
                elif isinstance(tdef2, combined):
                    # combine a new anonymous state from existing ones
                    new_state = '_tmp_%d' % cls._tmpname
                    cls._tmpname += 1
                    itokens = []
                    for istate in tdef2:
                        assert istate != state, 'circular state ref %r' % istate
                        itokens.extend(cls._process_state(unprocessed,
                                                          processed, istate))
                    processed[new_state] = itokens
                    new_state = (new_state,)
                elif isinstance(tdef2, tuple):
                    # push more than one state
                    for state_ in tdef2:
                        assert (state_ in unprocessed or
                                state_ in ('#pop', '#push')), \
                               'unknown new state ' + state_
                    new_state = tdef2
                else:
                    assert False, 'unknown new state def %r' % tdef2
            tokens.append((rex, tdef[1], new_state))
        return tokens

    def process_tokendef(cls, name, tokendefs=None):
        processed = cls._all_tokens[name] = {}
        tokendefs = tokendefs or cls.tokens[name]
        for state in tokendefs.keys():
            cls._process_state(tokendefs, processed, state)
        return processed

    def __call__(cls, *args, **kwds):
        if not hasattr(cls, '_tokens'):
            cls._all_tokens = {}
            cls._tmpname = 0
            if hasattr(cls, 'token_variants') and cls.token_variants:
                # don't process yet
                pass
            else:
                cls._tokens = cls.process_tokendef('', cls.tokens)

        return type.__call__(cls, *args, **kwds)


class RegexLexer(Lexer):
    """
    Base for simple stateful regular expression-based lexers.
    Simplifies the lexing process so that you need only
    provide a list of states and regular expressions.
    """
    __metaclass__ = RegexLexerMeta

    #: Flags for compiling the regular expressions.
    #: Defaults to MULTILINE.
    flags = re.MULTILINE

    #: Dict of ``{'state': [(regex, tokentype, new_state), ...], ...}``
    #:
    #: The initial state is 'root'.  ``new_state`` can be omitted to signify
    #: no state transition, be a state name or a tuple of state names to
    #: push onto the stack, a ``combined()`` of several existing states,
    #: ``'#push'`` to push the current state again, or ``'#pop'`` /
    #: ``'#pop:n'`` to pop one or ``n`` states.  A rule entry can also be
    #: ``include('otherstate')`` to inline the rules of another state.
    tokens = {}

    def get_tokens_unprocessed(self, text, stack=('root',)):
        """
        Split ``text`` into (tokentype, text) pairs.

        ``stack`` is the initial stack (default: ``['root']``)
        """
        pos = 0
        tokendefs = self._tokens
        statestack = list(stack)
        statetokens = tokendefs[statestack[-1]]
        while 1:
            for rexmatch, action, new_state in statetokens:
                m = rexmatch(text, pos)
                if m:
                    if type(action) is _TokenType:
                        yield pos, action, m.group()
                    else:
                        for item in action(self, m):
                            yield item
                    pos = m.end()
                    if new_state is not None:
                        # state transition
                        if isinstance(new_state, tuple):
                            for state in new_state:
                                if state == '#pop':
                                    statestack.pop()
                                elif state == '#push':
                                    statestack.append(statestack[-1])
                                else:
                                    statestack.append(state)
                        elif isinstance(new_state, int):
                            # pop
                            del statestack[new_state:]
                        elif new_state == '#push':
                            statestack.append(statestack[-1])
                        else:
                            assert False, "wrong state def: %r" % new_state
                        statetokens = tokendefs[statestack[-1]]
                    break
            else:
                try:
                    if text[pos] == '\n':
                        # at EOL, reset state to "root"
                        pos += 1
                        statestack = ['root']
                        statetokens = tokendefs['root']
                        yield pos, Text, u'\n'
                        continue
                    yield pos, Error, text[pos]
                    pos += 1
                except IndexError:
                    break


class LexerContext(object):
    """
    A helper object that holds lexer position data.
    """

    def __init__(self, text, pos, stack=None, end=None):
        self.text = text
        self.pos = pos
        self.end = end or len(text)
        self.stack = stack or ['root']

    def __repr__(self):
        return 'LexerContext(%r, %r, %r)' % (
            self.text, self.pos, self.stack)


class ExtendedRegexLexer(RegexLexer):
    """
    A RegexLexer that uses a context object to store its state.
    """

    def get_tokens_unprocessed(self, text=None, context=None):
        """
        Split ``text`` into (tokentype, text) pairs.
        If ``context`` is given, use this lexer context instead.
        """
        tokendefs = self._tokens
        if not context:
            ctx = LexerContext(text, 0)
            statetokens = tokendefs['root']
        else:
            ctx = context
            statetokens = tokendefs[ctx.stack[-1]]
            text = ctx.text
        while 1:
            for rexmatch, action, new_state in statetokens:
                m = rexmatch(text, ctx.pos, ctx.end)
                if m:
                    if type(action) is _TokenType:
                        yield ctx.pos, action, m.group()
                        ctx.pos = m.end()
                    else:
                        for item in action(self, m, ctx):
                            yield item
                        if not new_state:
                            # altered the state stack?
                            statetokens = tokendefs[ctx.stack[-1]]
                    # CAUTION: callback must set ctx.pos!
                    if new_state is not None:
                        # state transition
                        if isinstance(new_state, tuple):
                            ctx.stack.extend(new_state)
                        elif isinstance(new_state, int):
                            # pop
                            del ctx.stack[new_state:]
                        elif new_state == '#push':
                            ctx.stack.append(ctx.stack[-1])
                        else:
                            assert False, "wrong state def: %r" % new_state
                        statetokens = tokendefs[ctx.stack[-1]]
                    break
            else:
                try:
                    if ctx.pos >= ctx.end:
                        break
                    if text[ctx.pos] == '\n':
                        # at EOL, reset state to "root"
                        ctx.pos += 1
                        ctx.stack = ['root']
                        statetokens = tokendefs['root']
                        yield ctx.pos, Text, u'\n'
                        continue
                    yield ctx.pos, Error, text[ctx.pos]
                    ctx.pos += 1
                except IndexError:
                    break
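

# --- Illustrative sketch (not part of the original module) ------------------
# A minimal `RegexLexer` subclass showing the `tokens` table machinery:
# `include()` pulls in another state's rules, a third rule element pushes a
# state, and '#pop' returns to the previous one.  The class, its token
# choices and the INI-like grammar are hypothetical, for demonstration only.

from pygments.token import Comment, Name, Operator, Punctuation, String, \
     Whitespace

class _ExampleIniLexer(RegexLexer):
    """Toy lexer for an INI-like syntax: [sections], ; comments, key = value."""
    name = 'ExampleIni'
    aliases = []

    tokens = {
        'whitespace': [
            (r'\s+', Whitespace),
        ],
        'root': [
            include('whitespace'),                # reuse another state's rules
            (r';[^\n]*', Comment.Single),         # comment to end of line
            (r'\[', Punctuation, 'section'),      # push the 'section' state
            (r'([^=\s]+)(\s*)(=)(\s*)([^\n]*)',   # key = value
             bygroups(Name.Attribute, Whitespace, Operator, Whitespace,
                      String)),
            (r'[^\n]+', Text),                    # fallback
        ],
        'section': [
            (r'[^\]\n]+', Name.Namespace),
            (r'\]', Punctuation, '#pop'),         # back to 'root'
        ],
    }

# Typical use:
#     list(_ExampleIniLexer().get_tokens(u'[core]\nname = demo\n'))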


def do_insertions(insertions, tokens):
    """
    Helper for lexers which must combine the results of several
    sublexers.

    ``insertions`` is a list of ``(index, itokens)`` pairs.
    Each ``itokens`` iterable should be inserted at position
    ``index`` into the token stream given by the ``tokens``
    argument.

    The result is a combined token stream.

    TODO: clean up the code here.
    """
    insertions = iter(insertions)
    try:
        index, itokens = insertions.next()
    except StopIteration:
        # no insertions
        for item in tokens:
            yield item
        return

    realpos = None
    insleft = True

    # iterate over the token stream where we want to insert
    # the tokens from the insertion list.
    for i, t, v in tokens:
        # first iteration. store the position of first item
        if realpos is None:
            realpos = i
        oldi = 0
        while insleft and i + len(v) >= index:
            tmpval = v[oldi:index - i]
            yield realpos, t, tmpval
            realpos += len(tmpval)
            for it_index, it_token, it_value in itokens:
                yield realpos, it_token, it_value
                realpos += len(it_value)
            oldi = index - i
            try:
                index, itokens = insertions.next()
            except StopIteration:
                insleft = False
                break  # not strictly necessary
        yield realpos, t, v[oldi:]
        realpos += len(v) - oldi

    # leftover tokens
    while insleft:
        # no normal tokens, set realpos to zero
        realpos = realpos or 0
        for p, t, v in itokens:
            yield realpos, t, v
            realpos += len(v)
        try:
            index, itokens = insertions.next()
        except StopIteration:
            insleft = False
            break  # not strictly necessary
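

# --- Illustrative sketch (not part of the original module) ------------------
# `do_insertions` splices a secondary token stream into a primary one at a
# given character offset and renumbers the indices so the combined stream
# stays consistent; `DelegatingLexer` above uses it to weave root-lexer
# tokens back into the template tokens.  The helper below is hypothetical
# and only demonstrates the call shape.

def _example_do_insertions():
    """Insert an ``Other`` run at character offset 6 of a plain Text stream."""
    main_stream = [(0, Text, u'before '), (7, Text, u'after\n')]
    inserted = [(0, Other, u'<tag>')]
    # one (index, itokens) pair: splice `inserted` in at offset 6
    return list(do_insertions([(6, inserted)], main_stream))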