I have a file containing Python code (it may not be syntactically correct). Some of its functions are commented out entirely except for the signature. My goal is to detect those empty functions using a regex and clean them up.
If every line between two lines starting with `def` began with `#`, locating them would be easy, but the issue is that many functions also contain multi-line comments (actually, docstrings). If you could suggest a way to change multi-line comments into single-line comments, that would help too.
In case you are curious what this is useful for: it is part of a Python tool in which we are trying to automate some steps of code refactoring.
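One way to tackle the docstring problem first: triple-quoted blocks can be matched as a unit, so a `re.sub` pass can rewrite each one into `#` comment lines that keep the original indentation. A rough sketch (it assumes every triple-quoted string in the file is a docstring, and note that the quote lines themselves become comments too):

```python
import re

def docstrings_to_comments(src):
    # Rewrite every triple-quoted block as '#' comments with the same
    # indentation.  NOTE: this treats ANY triple-quoted string as a
    # docstring, and the quote lines themselves become comments.
    def repl(match):
        indent = match.group(1)
        return "\n".join(indent + "# " + line.strip()
                         for line in match.group(0).splitlines())
    pattern = r'^([ \t]*)(?:"""|\'\'\')[\s\S]*?(?:"""|\'\'\')[ \t]*$'
    return re.sub(pattern, repl, src, flags=re.MULTILINE)
```

After this pass, a function is empty exactly when every body line starts with optional whitespace plus `#`, which a much simpler regex or line scan can test.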
Input:
def this_function_has_stuff(f, g, K):
    """ Thisfunction has stuff in it """
    if f:
        s = 0
    else:
        u = 0
    return None
def fuly_commented_fucntion(f, g, K):
    """
    remove this empty function.
    Examples
    ========
    >>> which function is
    >>> empty
    """
def empty_annotated_fn(name: str, result: List[100]) -> List[100]:
    """
    Make some bla.
    Examples
    ========
    >>> bla bla
    >>> bla bla
    x**2 + 1
    """
def note_this_has_one_valid_line(f, K):
    """
    Make some bla.
    Examples
    ========
    >>> bla bla
    >>> bla bla
    x**2 + 1
    """
    return [K.abs(coff) for coff in f]
def empty_with_both_types_of_comment(f, K):
    """
    my bla bla
    Examples
    ========
    3
    """
    # if not f:
    # else:
    #     return max(dup_abs(f, K))
SOME_VAR = 6
Expected output:
def this_function_has_stuff(f, g, K):
    """ Thisfunction has stuff in it """
    if f:
        s = 0
    else:
        u = 0
    return None
def note_this_has_one_valid_line(f, K):
    """
    Make some bla.
    Examples
    ========
    >>> bla bla
    >>> bla bla
    x**2 + 1
    """
    return [K.abs(coff) for coff in f]
SOME_VAR = 6
My attempt so far used a negative lookahead with `(?:\n.+)+` to grab each `def` block across line breaks and read it back with `match.group(groupNum)`, but I could not get it to single out the empty functions when run on the complete code.
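For the removal itself, a single regex is fragile here: once docstrings are involved, a pattern like `def .*:(?:\n.+)+` can backtrack past a docstring's closing quotes and swallow real code between two functions. A small line scanner is easier to get right and leaves the kept functions (docstrings included) untouched. A sketch under the assumptions that target `def`s start at column 0, bodies are indented, and there are no decorators:

```python
import re

def has_real_code(body_lines):
    # True if any body line is something other than a blank line,
    # a '#' comment, or part of a docstring.
    in_docstring = False
    for line in body_lines:
        stripped = line.strip()
        if in_docstring:
            if stripped.endswith(('"""', "'''")):
                in_docstring = False
            continue
        if not stripped or stripped.startswith('#'):
            continue
        if stripped.startswith(('"""', "'''")):
            # A one-line docstring closes on the same line.
            if not (len(stripped) >= 6 and stripped.endswith(('"""', "'''"))):
                in_docstring = True
            continue
        return True
    return False

def remove_empty_functions(src):
    lines = src.splitlines()
    out, i = [], 0
    while i < len(lines):
        if not re.match(r'def\s+\w+.*:\s*$', lines[i]):
            out.append(lines[i])
            i += 1
            continue
        start = i
        i += 1
        # The body is every following indented (or blank) line.
        while i < len(lines) and (not lines[i].strip() or lines[i][0] in ' \t'):
            i += 1
        if has_real_code(lines[start + 1:i]):
            out.extend(lines[start:i])   # keep the whole function
        # otherwise drop the signature and body entirely
    return '\n'.join(out) + '\n'
```

Run over the input above, this should leave only `this_function_has_stuff`, `note_this_has_one_valid_line`, and `SOME_VAR = 6`. If the file were guaranteed to parse, `ast.parse` plus checking whether a `FunctionDef` body contains only a docstring expression would be more robust still, but the "may not be syntactically correct" constraint rules that out here.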