
PROGRAMMING

Python for Linguists through Natural Language Processing

Continuing education

Course number: LS20.157, 6-week course

Machine processing of natural language: Natural Language Processing (NLP) combines findings from linguistics with the latest methods from computer science and artificial intelligence.

Contents

Getting started with deep learning
Building blocks of neural networks
Introduction to Natural Language Processing
Introduction to deep learning
Introduction to PyTorch
Traditional Natural Language Processing methods: NLTK, spaCy, gensim (see the first sketch after this list)
Deep learning for computer vision
Learning representations from a language sequence using recurrent neural networks (RNN)
Improving on RNN results with more complex neural architectures such as Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRU) (see the second sketch after this list)
Exploring sequence-to-sequence models (used in translation) that read one sequence and produce another
Deep learning with sequence data and text
Complex exercises and application examples
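
To give a flavor of the "traditional NLP methods" unit, here is a minimal sketch of tokenization, tagging, and word embeddings with NLTK, spaCy, and gensim. The sample sentence, the en_core_web_sm model, and all parameters are illustrative assumptions, not course material; the libraries and their language data must be installed separately (e.g. python -m spacy download en_core_web_sm).

    # Illustrative sketch only: tokenization, tagging, and word vectors.
    import nltk
    import spacy
    from gensim.models import Word2Vec

    # Tokenizer data; "punkt_tab" is used by newer NLTK versions, "punkt" by older ones.
    nltk.download("punkt", quiet=True)
    nltk.download("punkt_tab", quiet=True)

    text = "Natural Language Processing links linguistics with artificial intelligence."

    # NLTK: plain word tokenization
    print(nltk.word_tokenize(text))

    # spaCy: full pipeline (tokenizer, POS tagger, lemmatizer, ...)
    nlp = spacy.load("en_core_web_sm")
    for token in nlp(text):
        print(token.text, token.pos_, token.lemma_)

    # gensim: train a tiny Word2Vec model on two toy "sentences"
    w2v = Word2Vec([["natural", "language"], ["language", "processing"]],
                   vector_size=50, min_count=1)
    print(w2v.wv["language"][:5])  # first 5 dimensions of the learned vector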
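
And a minimal PyTorch sketch of the RNN/LSTM unit: an LSTM-based sequence classifier over token IDs. The vocabulary size, layer dimensions, and the random dummy batch are illustrative assumptions only.

    # Illustrative sketch only: an LSTM sequence classifier in PyTorch.
    import torch
    import torch.nn as nn

    class LSTMClassifier(nn.Module):
        def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128, num_classes=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.fc = nn.Linear(hidden_dim, num_classes)

        def forward(self, token_ids):             # token_ids: (batch, seq_len)
            embedded = self.embed(token_ids)      # (batch, seq_len, embed_dim)
            _, (hidden, _) = self.lstm(embedded)  # hidden: (1, batch, hidden_dim)
            return self.fc(hidden[-1])            # logits: (batch, num_classes)

    model = LSTMClassifier()
    dummy_batch = torch.randint(0, 1000, (4, 12))  # 4 sequences of 12 token IDs
    print(model(dummy_batch).shape)                # -> torch.Size([4, 2])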

Target audience

Linguists

Prerequisites

Advanced Python skills, English

Costs

The complete course is eligible for funding

e.g. with a Bildungsgutschein (education voucher), via the Berufsförderungsdienst (BFD), or during short-time work (Kurzarbeit)


Dates at the Marburg location

Mon, 07.03.2022
Mon, 04.04.2022
Mon, 02.05.2022
Tue, 07.06.2022
Mon, 04.07.2022
Mon, 01.08.2022
Mon, 05.09.2022
Tue, 04.10.2022
Mon, 07.11.2022
Mon, 05.12.2022

Live online training

Our continuing education courses and trainings also take place online in a virtual classroom.

Your contact

Petra Schmoranz, Training Center Manager, Phone: 06421 965855

E-Mail: petra.schmoranz@futuretrainings.com

Neue Kasseler Strasse 62E, 35039 Marburg

Further information

Phone: 06421 965855, www.futuretrainings.com

Our locations

Halle (Saale), Berlin, Berlin-Neukölln, Chemnitz, Hannover, Köln, Leipzig, Reutlingen, Stuttgart, Ulm, Erfurt, Jena, Marburg, Nordhausen, Brand-Erbisdorf, Bernburg, Bitterfeld-Wolfen, Dessau-Roßlau, Lutherstadt Eisleben, Hettstedt, Köthen, Magdeburg, Merseburg, Naumburg, Quedlinburg, Sangerhausen, Weißenfels, Zerbst, Zeitz, Rostock, Aue, Annaberg-Buchholz, Dippoldiswalde, Freital, Heidenau, Bayreuth

