eng-ko term lookup table
Muhyun Kim authored and Aston Zhang committed Aug 29, 2019
1 parent 41b649a commit d4f3286
Showing 1 changed file with 155 additions and 0 deletions.
155 changes: 155 additions & 0 deletions TERMINOLOGY.md
@@ -0,0 +1,155 @@
## English-Korean Terminology Table

| English | Korean |
| ------------------------------------------------ | ----------------------------- |
| access parameters | 액세스 파라미터 |
| accuracy | 정확도 |
| activation function | 활성화 함수 |
| attention model | 어텐션 모델 |
| average pooling layer | 평균 풀링 층 |
| backpropagation | 역전파 |
| baseline | 기준선 |
| batch | 배치 |
| bias | 편향 |
| bidirectional recurrent neural network | 양방향 리커런트 뉴럴 네트워크 |
| binary classification | 이진 분류 |
| block | 블록 |
| bucketing | |
| channel | 채널 |
| class | 클래스 |
| classification | 분류 |
| classifier | 분류기 |
| co-occurrence frequency | |
| collaborative filtering | |
| concatenate | 연결 |
| context | 컨텍스트 |
| context variable | |
| context vector | |
| context window | |
| context word | |
| continuous bag-of-words (CBOW) model | |
| converge | |
| convex optimization | |
| convolutional | 컨볼루셔널 |
| convolutional layer | 컨볼루셔널 층 |
| convolutional neural network | |
| cost | |
| covariate shift | 공변량 변화 |
| cross-entropy | |
| cross-entropy loss | 크로스-엔트로피 손실 |
| data instance | |
| dataset | 데이터셋 |
| decision boundary | |
| decoder | |
| dense | |
| dimension | 차원 |
| diverge | |
| dropout | 드롭아웃 |
| eigenvalue | |
| empirical risk minimization | |
| encoder | |
| end-to-end | |
| epoch | 에포크 |
| error | 오류 |
| example | |
| exploding gradient | 그래디언트 폭발 |
| feature | 특성 |
| feature map | |
| filter | |
| forward propagation | 순전파 |
| fully connected layer | 완전 연결층 |
| Gaussian distribution | 가우시안 분포 |
| generalization | |
| generalization error | |
| gradient | 그래디언트 |
| gradient clipping | |
| gradient descent in one-dimensional space | |
| Gram matrix | |
| ground truth | |
| hidden layer | 은닉층 |
| hidden variable | |
| hyperparameter | 하이퍼파라미터 |
| hypothesis | |
| identity mapping | |
| image | |
| independent and identically distributed (i.i.d.) | |
| inference | 추론 |
| instance | |
| iterator | 이터레이터 |
| kernel | |
| label | 레이블 |
| layer | |
| learning rate | 학습 속도 |
| linear model | |
| linear regression | |
| local minimum | |
| log likelihood | 로그 가능도 |
| loss function | 손실 함수 |
| machine learning | 머신 러닝 |
| marginalization | 주변화 |
| mean | |
| mean squared error | |
| metric | |
| mini-batch | |
| mini-batch gradient | |
| model complexity | |
| model parameter | |
| momentum (method) | |
| multilayer perceptron | 다층 퍼셉트론 |
| negative sampling | |
| neural network | 뉴럴 네트워크 |
| non-convex optimization | |
| normalization | |
| numerical method | |
| object detection | |
| objective function | 목적 함수 |
| offset | |
| one-hot encoding | 원-핫-인코딩 |
| operator | |
| optimization algorithm | 최적화 알고리즘 |
| optimizer | |
| outlier | 이상치 |
| overfitting | 오버피팅 |
| padding | |
| parameter | 파라미터 |
| partial derivative | |
| perplexity | |
| pipeline | |
| pooling layer | |
| property | |
| pseudo | 의사 |
| random variable | 확률 변수 |
| receptive field | |
| recurrent neural network | |
| regression | 회귀 |
| saddle point | |
| scalar | 스칼라 |
| sentiment analysis | |
| shape | 모양 |
| skip-gram model | |
| softmax regression | |
| softmax, hierarchical softmax | |
| stochastic gradient descent | 확률적 경사 하강법 |
| stride | |
| subsample | |
| support vector machine | |
| test dataset | 테스트 데이터셋 |
| tokenizer/tokenization | |
| training dataset | 학습 데이터셋 |
| training error | |
| transform | |
| tune hyperparameter | |
| unbiased estimate | |
| underfitting | 언더피팅 |
| uniform sampling | |
| unknown token | |
| update model parameter(s) | |
| upsample | |
| validation dataset | 검증 데이터셋 |
| vanishing gradient | 그래디언트 소실 |
| variance | |
| vector | 벡터 |
| weight | |
| word embedding | |
| word vector | |
| zero tensor | |
