python - How to handle tokenization errors?


Below is the piece of code I am using to tokenize the string:

  strlist = [token[STRING] for token in generate_tokens(StringIO(line).readline) if token[STRING]]

This throws errors such as:

  tokenize.TokenError: ('EOF in multi-line statement', (2, 0))
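For reference, a minimal sketch of the kind of input that can trigger this error (the sample line and variable names here are only illustrative, not from my actual data):

  import tokenize
  from io import StringIO

  # Illustrative line with an unclosed parenthesis: the tokenizer reaches
  # end of input while still inside the bracketed expression.
  line = "values = (1, 2,"

  try:
      tokens = list(tokenize.generate_tokens(StringIO(line).readline))
  except tokenize.TokenError as err:
      print(err)  # e.g. ('EOF in multi-line statement', (2, 0))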

I want to ignore such errors and be able to complete the tokenization process. I have a lot of data, so I am fine with losing the parts affected by these errors. However, I'm not sure how to write the code to implement this.

Thanks.

Edit 1:

When I add

  except tokenize.TokenError: pass

I get the following error message:

  NameError: name 'tokenize' is not defined

Note that the error in your traceback is called tokenize.TokenError. That is the type of the exception, so for your code to catch it you should wrap the tokenization in a try/except block that names it. To ignore the error, you just put pass inside the except block. The NameError in your edit means the tokenize module itself has not been imported, so add import tokenize first:

  import tokenize

  try:
      strlist = [token[STRING] for token in tokenize.generate_tokens(StringIO(line).readline) if token[STRING]]
  except tokenize.TokenError:
      pass
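Since a lot of data is being processed, here is a minimal sketch of applying the same pattern per line so that only the offending lines are skipped (the lines list and the use of the TokenInfo.string attribute are illustrative assumptions, not part of the original code):

  import tokenize
  from io import StringIO

  # Illustrative input: an iterable of source lines to tokenize.
  lines = [
      'a = "hello" + "world"\n',
      'b = (1 +\n',            # unclosed parenthesis -> TokenError
      'c = 3\n',
  ]

  results = []
  for line in lines:
      try:
          # Collect the non-empty token strings of lines that tokenize cleanly.
          results.append([tok.string
                          for tok in tokenize.generate_tokens(StringIO(line).readline)
                          if tok.string])
      except tokenize.TokenError:
          # Skip lines that cannot be fully tokenized and keep going.
          continue

  print(results)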
