
Example of Python code that translates a website into other languages, with the DeepL library + API Key and BeautifulSoup

June 20, 2021, in Python Scripts Examples, by Neculai Fantanaru

You can see the full code here: https://pastebin.com/kmuu6hv

Install Python. Then install the following two libraries from the command prompt (CMD) in Windows 10:

py -m pip install pydeepl
py -m pip install beautifulsoup4

The Python code will automatically translate the following HTML tags using the DeepL library:

<title>Your Text</title>
<meta name="description" content="Your Text"/>

The Python code will also automatically translate the content of the following tags (Your Text), but only if these tags are placed between the pair of HTML comment markers that the code checks for. Of course, you will need to replace the contents with your own tags.



<h1 class="den_articol" itemprop="name">Your Text</h1>
<p class="text_obisnuit">Your Text</p>
<p class="text_obisnuit2">Your Text</p>
<span class="text_obisnuit2">Your Text</span>
<li class="text_obisnuit">Your Text</li>
<a class="linkMare" href="https://neculaifantanaru.com/rw/">Your Text</a>
<h4 class="text_obisnuit2">Your Text</h4>
<h5 class="text_obisnuit2">Your Text</h5>
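As a sketch of how the "only between the HTML comment markers" rule works: the script compares each tag's position in the serialized page against the positions of a begin and end comment. The marker names `<!-- ARTICOL START -->` and `<!-- ARTICOL FINAL -->` below are assumptions for illustration; substitute whatever comments your own pages use.

```python
from bs4 import BeautifulSoup

# Hypothetical page fragment; the marker names are an assumption
page = '''<h1 class="den_articol" itemprop="name">Outside the markers</h1>
<!-- ARTICOL START -->
<p class="text_obisnuit">Inside the markers</p>
<!-- ARTICOL FINAL -->'''

soup = BeautifulSoup(page, 'html.parser')
text = str(soup)
begin_comment = text.index('<!-- ARTICOL START -->')
end_comment = text.index('<!-- ARTICOL FINAL -->')

p = soup.find('p', class_='text_obisnuit')
h1 = soup.find('h1', class_='den_articol')
# A tag is translated only when its serialized form sits between the markers
print(begin_comment < text.index(str(p)) < end_comment)   # True  -> would be translated
print(begin_comment < text.index(str(h1)) < end_comment)  # False -> skipped
```

Anything outside the markers (menus, footers, scripts) is left untouched, which is why the markers must wrap exactly the article body.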

The code: copy and run the code below in any Python IDE (I use PyScripter). Don't forget to change the path in files_from_folder. Don't forget to change the API key (it appears 3 times in the code), and here is the list of language codes that can be used: Lang. Sign up for a DeepL API account to get an API key. The script also imports the requests and googletrans packages, so install those too if they are missing.
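For reference, DeepL's /v2/translate endpoint returns JSON of the shape below, which is why the script reads `json.loads(...)['translations'][0]['text']` everywhere. This is sample data, not a live call, so it runs without an API key:

```python
import json

# Example of the JSON shape returned by DeepL's /v2/translate (illustrative values)
sample = '{"translations": [{"detected_source_language": "EN", "text": "Salut"}]}'

# The same lookup the script uses on the real response body
print(json.loads(sample)['translations'][0]['text'])  # Salut
```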

from bs4 import BeautifulSoup
from bs4.formatter import HTMLFormatter
from googletrans import Translator  # not actually used below; translation goes through the DeepL API
import requests
import json

# A quick one-off API test; change False to True to run it
if False:
    test = requests.post('https://api-free.deepl.com/v2/translate',
                    data={'auth_key':'PUT HERE YOUR API KEY:fx',
                          'text':'hello',
                          'source_lang':'EN',
                          'target_lang':'ZH'  # translates into Chinese
                          }).content
                          }).content

    print(json.loads(test)['translations'][0]['text'])

translator = Translator()  # unused leftover; the script calls the DeepL API directly

class UnsortedAttributes(HTMLFormatter):
    # Keep attributes in their original order; BeautifulSoup's default
    # formatter sorts them alphabetically when re-serializing.
    def attributes(self, tag):
        for k, v in tag.attrs.items():
            yield k, v

files_from_folder = r"c:\test"  # change to your base path, e.g. the folder with your website's English HTML files
source_language = 'EN'  # translates from English

use_translate_folder = False  # if True, output goes into a \translated subfolder

destination_language = 'ZH'  # translates into Chinese

extension_file = ".html"

import os

directory = os.fsencode(files_from_folder)

def recursively_translate(node):
    # Walk the parse tree; translate each non-empty text node in place.
    for x in range(len(node.contents)):
        if isinstance(node.contents[x], str):
            if node.contents[x].strip() != '':
                try:
                    newtext = requests.post('https://api-free.deepl.com/v2/translate',
                    data={'auth_key':'PUT HERE YOUR API KEY:fx',
                          'text':node.contents[x],
                          'source_lang':source_language,
                          'target_lang':destination_language
                          }).content
                    node.contents[x].replace_with(json.loads(newtext)['translations'][0]['text'])
                except:
                    pass  # on any API or parse error, leave this text node untranslated
        elif node.contents[x] is not None:
            recursively_translate(node.contents[x])
    
for file in os.listdir(directory):
    filename = os.fsdecode(file)
    print(filename)
    if filename == 'y_key_e479323ce281e459.html' or filename == 'TS_4fg4_tr78.html':
        continue  # skip these site-verification files
    if filename.endswith(extension_file): 
        with open(os.path.join(files_from_folder, filename), encoding='utf-8') as html:
            # Wrap in <pre> to preserve whitespace; the wrapper is stripped
            # again at write time via soup[5:-6]
            soup = BeautifulSoup('<pre>' + html.read() + '</pre>', 'html.parser')

        for title in soup.findAll('title'):
            recursively_translate(title)

        for meta in soup.findAll('meta', {'name':'description'}):
            try:
                newtext = requests.post('https://api-free.deepl.com/v2/translate',
                data={'auth_key':'PUT HERE YOUR API KEY:fx',
                      'text':meta['content'],
                      'source_lang':source_language,
                      'target_lang':destination_language
                      }).content
                meta['content'] = json.loads(newtext)['translations'][0]['text']
            except:
                pass

        # NOTE: the original marker strings were lost in this copy of the code;
        # '<!-- ARTICOL START -->' / '<!-- ARTICOL FINAL -->' are assumptions --
        # replace them with the comment markers used in your own HTML.
        # They are recomputed in each loop because soup changes as text is translated.
        for h1 in soup.findAll('h1', {'itemprop':'name'}, class_='den_articol'):
            begin_comment = str(soup).index('<!-- ARTICOL START -->')
            end_comment = str(soup).index('<!-- ARTICOL FINAL -->')
            if begin_comment < str(soup).index(str(h1)) < end_comment:
                recursively_translate(h1)

        for p in soup.findAll('p', class_='text_obisnuit'):
            begin_comment = str(soup).index('<!-- ARTICOL START -->')
            end_comment = str(soup).index('<!-- ARTICOL FINAL -->')
            if begin_comment < str(soup).index(str(p)) < end_comment:
                recursively_translate(p)

        for p in soup.findAll('p', class_='text_obisnuit2'):
            begin_comment = str(soup).index('<!-- ARTICOL START -->')
            end_comment = str(soup).index('<!-- ARTICOL FINAL -->')
            if begin_comment < str(soup).index(str(p)) < end_comment:
                recursively_translate(p)

        for span in soup.findAll('span', class_='text_obisnuit2'):
            begin_comment = str(soup).index('<!-- ARTICOL START -->')
            end_comment = str(soup).index('<!-- ARTICOL FINAL -->')
            if begin_comment < str(soup).index(str(span)) < end_comment:
                recursively_translate(span)

        for li in soup.findAll('li', class_='text_obisnuit'):
            begin_comment = str(soup).index('<!-- ARTICOL START -->')
            end_comment = str(soup).index('<!-- ARTICOL FINAL -->')
            if begin_comment < str(soup).index(str(li)) < end_comment:
                recursively_translate(li)

        for a in soup.findAll('a', class_='linkMare'):
            begin_comment = str(soup).index('<!-- ARTICOL START -->')
            end_comment = str(soup).index('<!-- ARTICOL FINAL -->')
            if begin_comment < str(soup).index(str(a)) < end_comment:
                recursively_translate(a)

        for h4 in soup.findAll('h4', class_='text_obisnuit2'):
            begin_comment = str(soup).index('<!-- ARTICOL START -->')
            end_comment = str(soup).index('<!-- ARTICOL FINAL -->')
            if begin_comment < str(soup).index(str(h4)) < end_comment:
                recursively_translate(h4)

        for h5 in soup.findAll('h5', class_='text_obisnuit2'):
            begin_comment = str(soup).index('<!-- ARTICOL START -->')
            end_comment = str(soup).index('<!-- ARTICOL FINAL -->')
            if begin_comment < str(soup).index(str(h5)) < end_comment:
                recursively_translate(h5)

        print(f'{filename} translated')
        soup = soup.encode(formatter=UnsortedAttributes()).decode('utf-8')
        new_filename = f'{filename.split(".")[0]}_{destination_language}.html'
        if use_translate_folder:
            try:
                with open(os.path.join(files_from_folder+r'\translated', new_filename), 'w', encoding='utf-8') as new_html:
                    new_html.write(soup[5:-6])  # strip the '<pre>'/'</pre>' wrapper
            except:
                # \translated folder is missing; create it and retry
                os.mkdir(files_from_folder+r'\translated')
                with open(os.path.join(files_from_folder+r'\translated', new_filename), 'w', encoding='utf-8') as new_html:
                    new_html.write(soup[5:-6])
        else:
            with open(os.path.join(files_from_folder, new_filename), 'w', encoding='utf-8') as html:
                html.write(soup[5:-6])
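To see what the two helper pieces do without touching the DeepL API, here is an offline sketch: the same traversal as recursively_translate, with the API call swapped for an arbitrary text transform (here str.upper), serialized through UnsortedAttributes so attribute order survives. The transform swap is mine, for illustration only.

```python
from bs4 import BeautifulSoup
from bs4.formatter import HTMLFormatter

class UnsortedAttributes(HTMLFormatter):
    # Emit attributes in original order instead of bs4's default alphabetical sort
    def attributes(self, tag):
        for k, v in tag.attrs.items():
            yield k, v

def recursively_transform(node, transform):
    # Same traversal as recursively_translate, but the DeepL call is
    # replaced by an arbitrary text transform so it runs offline.
    for x in range(len(node.contents)):
        if isinstance(node.contents[x], str):
            if node.contents[x].strip() != '':
                node.contents[x].replace_with(transform(node.contents[x]))
        elif node.contents[x] is not None:
            recursively_transform(node.contents[x], transform)

soup = BeautifulSoup('<p itemprop="name" class="den_articol">Hello <b>world</b>!</p>',
                     'html.parser')
recursively_transform(soup, str.upper)
print(soup.decode(formatter=UnsortedAttributes()))
# <p itemprop="name" class="den_articol">HELLO <b>WORLD</b>!</p>
```

Note that only the text nodes change; tags and attributes pass through untouched, and itemprop stays before class because of the custom formatter.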

That's all folks.

If you like my code, then do me a favor: translate your website into Romanian, "ro".

Also see other Python translation versions: the BeautifulSoup Library one, Version 2, or the Google Translate API Key one.

 


© Neculai Fântânaru - All rights reserved