RECURRENT DONATION
Donate monthly to support
the NeculaiFantanaru.com project
What sets this book apart from the others on the market in the same field is that it defines, through examples, the skills of a leader. I never claimed it is easy to become a good leader, but if a person truly wants it...
I wrote this book to connect, in a simple way, personal development with leadership, like a puzzle whose pieces you must fit together to recreate the overall picture.
The purpose of this book is to inform you through concrete examples and to show you how to acquire the skills that make others see things the way you do.
Without claiming to be a treatise, the book presents the experience of an ordinary person - the author - who, through simple words, realities, and familiar examples, teaches ordinary people courage and hope in their personal quest to become masters of themselves and of what they know... perhaps even leaders.
You can view the full code here: Https: // passatin.com / vs xl umcl. Install Python, then install these two libraries from the command prompt (cmd) on Windows 10:

py -m pip install google-cloud-translate
py -m pip install beautifulsoup4

On every page, the script will translate the following HTML tags: <title> and <meta name="description">.
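Before running the full script, it can help to see how the tag selection works. The sketch below uses only the standard library (no BeautifulSoup) to scan a page and collect the text that would be sent to the translation API; the tag/class pairs are the ones the full script later in the article targets, and the sample page is invented for illustration.

```python
from html.parser import HTMLParser

# Tag/class pairs taken from the full script; None means "any class".
TARGETS = {
    ('title', None),
    ('h1', 'den_articol'),
    ('p', 'text_obisnuit'),
    ('p', 'text_obisnuit2'),
    ('span', 'text_obisnuit2'),
    ('li', 'text_obisnuit'),
    ('a', 'linkMare'),
    ('h4', 'text_obisnuit2'),
    ('h5', 'text_obisnuit2'),
}

class TargetCollector(HTMLParser):
    """Collect text that sits inside one of the targeted tag/class pairs."""
    def __init__(self):
        super().__init__()
        self.stack = []       # (tag, class) of currently open elements
        self.collected = []   # text that would be sent to the API

    def handle_starttag(self, tag, attrs):
        self.stack.append((tag, dict(attrs).get('class')))

    def handle_endtag(self, tag):
        if self.stack:
            self.stack.pop()

    def handle_data(self, data):
        if data.strip() and any((t, c) in TARGETS for t, c in self.stack):
            self.collected.append(data.strip())

page = ('<html><title>My page</title>'
        '<p class="text_obisnuit">Hello</p>'
        '<p class="other">Skip me</p></html>')
collector = TargetCollector()
collector.feed(page)
print(collector.collected)  # → ['My page', 'Hello']
```

Only `My page` and `Hello` are collected; the paragraph with an unlisted class is left alone, which is exactly the behavior you want when menus and footers must stay untranslated.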
In addition, the Python code will translate the content (your text) of h1, p, span, li, a, h4, and h5 tags carrying the classes used on my site, but only if those tags are placed between a pair of HTML comment markers. Naturally, you need to replace those markers with your own.
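The marker check itself is plain string arithmetic: an element is translated only if its position in the page source falls between a start comment and an end comment. Here is a standard-library sketch; the marker names `<!-- START -->` and `<!-- FINAL -->` are placeholders, since the real marker strings did not survive in this copy of the article.

```python
# A page with a translatable region delimited by two HTML comments.
page = ('<html><h1>Site name</h1>'
        '<!-- START -->'
        '<p class="text_obisnuit">Translate me</p>'
        '<!-- FINAL -->'
        '<footer>Leave me alone</footer></html>')

begin_comment = page.index('<!-- START -->')
end_comment = page.index('<!-- FINAL -->')

def inside_markers(fragment):
    """True if the fragment occurs between the two comment markers."""
    return begin_comment < page.index(fragment) < end_comment

print(inside_markers('<p class="text_obisnuit">Translate me</p>'))  # → True
print(inside_markers('<footer>Leave me alone</footer>'))            # → False
```

The full script applies the same comparison to `str(soup)`, so anything outside the two comments (headers, footers, sidebars) keeps its original language.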
You need a file with the .json extension (I named mine secret.json), which you can obtain from https://console.cloud.google.com/. Follow this guide on how to get a Google API key. Copy the secret.json file into the same folder as the code below.

The code: copy the code below into a Python editor and run it (I use PyScripter). Don't forget to change the path in the "files_from_folder" line.

```python
from bs4 import BeautifulSoup
from bs4.formatter import HTMLFormatter
import os
import six
from google.cloud import translate_v2 as translate


class UnsortedAttributes(HTMLFormatter):
    """Serialize attributes in their original order."""
    def attributes(self, tag):
        for k, v in tag.attrs.items():
            yield k, v


def translate_text(target, text):
    """Translate text into the target language.

    Target must be an ISO 639-1 language code. See
    https://g.co/cloud/translate/v2/translate-reference#supported_languages
    """
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "secret.json"
    translate_client = translate.Client()
    if isinstance(text, six.binary_type):
        text = text.decode("utf-8")
    # Text can also be a sequence of strings, in which case the API
    # returns a sequence of results.
    result = translate_client.translate(text, target_language=target)
    return result["translatedText"]


files_from_folder = r"C:\test"   # change this to your own folder
source_language = 'EN'
use_translate_folder = False
destination_language = 'ZH'
extension_file = ".html"

# The original marker strings were stripped out of this page; replace them
# with the HTML comments that delimit the translatable region on your pages.
BEGIN_MARKER = '<!-- your-start-marker -->'
END_MARKER = '<!-- your-end-marker -->'

# Tag/attribute combinations translated only between the two markers.
TARGETS = [
    ('h1', {'itemprop': 'name', 'class_': 'den_articol'}),
    ('p', {'class_': 'text_obisnuit'}),
    ('p', {'class_': 'text_obisnuit2'}),
    ('span', {'class_': 'text_obisnuit2'}),
    ('li', {'class_': 'text_obisnuit'}),
    ('a', {'class_': 'linkMare'}),
    ('h4', {'class_': 'text_obisnuit2'}),
    ('h5', {'class_': 'text_obisnuit2'}),
]


def recursively_translate(node):
    """Replace every non-empty text child of node with its translation."""
    for x in range(len(node.contents)):
        if isinstance(node.contents[x], str):
            if node.contents[x].strip() != '':
                try:
                    newtext = translate_text(destination_language, node.contents[x])
                    node.contents[x].replaceWith(newtext)
                except Exception:
                    pass
        elif node.contents[x] is not None:
            recursively_translate(node.contents[x])


directory = os.fsencode(files_from_folder)
for file in os.listdir(directory):
    filename = os.fsdecode(file)
    print(filename)
    if filename in ('y_key_e479323ce281e459.html', 'TS_4fg4_tr78.html'):
        continue
    if not filename.endswith(extension_file):
        continue
    with open(os.path.join(files_from_folder, filename), encoding='utf-8') as html:
        # The file body is wrapped in <pre>...</pre> so the slice
        # soup[5:-6] at the end can strip the wrapper off again.
        soup = BeautifulSoup('<pre>' + html.read() + '</pre>', 'html.parser')

    # Always translated, wherever they appear:
    for title in soup.findAll('title'):
        recursively_translate(title)
    for meta in soup.findAll('meta', {'name': 'description'}):
        try:
            meta['content'] = translate_text(destination_language, meta['content'])
        except Exception:
            pass

    # Translated only between the two comment markers:
    for name, attrs in TARGETS:
        for tag in soup.findAll(name, **attrs):
            page = str(soup)
            if page.index(BEGIN_MARKER) < page.index(str(tag)) < page.index(END_MARKER):
                recursively_translate(tag)

    print(f'{filename} translated')
    output = soup.encode(formatter=UnsortedAttributes()).decode('utf-8')
    new_filename = f'{filename.split(".")[0]}_{destination_language}.html'
    if use_translate_folder:
        out_dir = os.path.join(files_from_folder, 'translated')
        os.makedirs(out_dir, exist_ok=True)
        out_path = os.path.join(out_dir, new_filename)
    else:
        out_path = os.path.join(files_from_folder, new_filename)
    with open(out_path, 'w', encoding='utf-8') as new_html:
        new_html.write(output[5:-6])  # drop the '<pre>' / '</pre>' wrapper
```

That's all, folks. If you like my code, then do me a favor: translate your website into Romanian, "ro".

See also this other translation code: the BeautifulSoup library or DeepL + API key, Version 3 or Version 4.
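A word on the `soup[5:-6]` slice in the script above: judging by the slice lengths, the file body is wrapped in `<pre>`...`</pre>` before parsing (so BeautifulSoup keeps the raw layout intact), and the slice removes that wrapper again before saving, since `'<pre>'` is 5 characters and `'</pre>'` is 6. A string-only sketch of that round trip:

```python
body = '<html><p>Salut</p></html>'          # an example page body
wrapped = '<pre>' + body + '</pre>'         # what gets handed to the parser
restored = wrapped[5:-6]                    # what gets written back to disk
print(restored == body)  # → True
```

If you change the wrapper tag, remember to adjust both slice offsets to match its opening and closing lengths.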