Removing duplicates in a Python list
I have the following list of titles:
titles = ['saw (us)', 'saw (au)', 'dear sally (se)']
How can I get the following:
titles = ['saw (us)', 'dear sally (se)']
Basically, I need to remove the duplicate titles. It doesn't matter which territory's version survives, as long as one of them does (i.e., the duplicate can be removed).
Here is what I have tried, unsuccessfully:
[title for title in localized_titles if title.split(' (')[0] not in localized_titles]
I'm not sure this is the most elegant solution, but it should work: you can use the non-territory version of the title as a dict key.
unique_titles = dict((title.rsplit(' (', 1)[0], title) for title in titles)
Or, if you need to preserve order, an OrderedDict.
unique_titles.values() then gives you the titles, territories included (one per title).
The optional second argument to rsplit limits it to one split, and rsplit starts looking for the parens from the end of the string rather than the beginning.
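For completeness, here is a minimal runnable sketch of that approach, assuming the titles list and the ' (territory)' suffix format from the question; the OrderedDict keeps the original order, and later duplicates simply overwrite earlier ones under the same key:

from collections import OrderedDict

titles = ['saw (us)', 'saw (au)', 'dear sally (se)']

# Key each title by its non-territory prefix; when a duplicate key appears,
# its value is overwritten, so exactly one localized version survives.
unique_titles = OrderedDict(
    (title.rsplit(' (', 1)[0], title) for title in titles
)

print(list(unique_titles.values()))
# ['saw (au)', 'dear sally (se)']

Note that with a plain dict or OrderedDict the last territory seen wins ('saw (au)' here); since the question says any territory is acceptable, that's fine.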