
Globally and Locally Consistent Image Completion — Paper Reading Notes


The role of dilated convolution:

To enlarge the receptive field while keeping computation manageable, deep networks typically rely on downsampling (pooling or strided convolution), which lowers the feature-map resolution. Dilated convolution is an alternative that enlarges the receptive field while preserving resolution. This shows clear advantages in tasks such as object detection and image segmentation: on one hand, the network can cover large objects or regions, improving detection accuracy; on the other, fine details can still be localized precisely.

Capturing multi-scale spatial relationships: the dilation rate controls the receptive field by inserting gaps (zeros) between the kernel's elements. Different dilation rates therefore yield different receptive-field sizes and capture spatial features at different scales, and such multi-scale features are important for the performance of many vision tasks.

By employing dilated convolutions at lower resolutions, the model can take a significantly larger area of the input image into account when computing each output pixel than standard convolutional layers would allow. The resulting network computes each output pixel from a 307×307-pixel region of the input image. Without dilated convolutions, only a 99×99-pixel region would be considered, making it impossible to fill holes larger than that, as illustrated in Figure 3.
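As a sanity check on such numbers, the receptive field of a convolutional stack can be computed layer by layer with the standard recursion (effective kernel size k_eff = d·(k−1)+1). The sketch below compares a plain 3×3 stack against the same stack with dilation rates 2, 4, 8, 16; the stacks are illustrative, not the paper's exact architecture.

```python
def receptive_field(layers):
    """Receptive field of a stack of conv layers.

    Each layer is (kernel_size, stride, dilation). Standard recursion:
    r <- r + (k_eff - 1) * jump, jump <- jump * stride,
    where k_eff = dilation * (kernel_size - 1) + 1.
    """
    r, jump = 1, 1
    for k, s, d in layers:
        k_eff = d * (k - 1) + 1
        r += (k_eff - 1) * jump
        jump *= s
    return r

# Four plain 3x3, stride-1 layers: the receptive field grows by only 2 per layer.
plain = [(3, 1, 1)] * 4
# The same four layers with dilation rates 2, 4, 8, 16: it grows much faster.
dilated = [(3, 1, d) for d in (2, 4, 8, 16)]

print(receptive_field(plain))    # 9
print(receptive_field(dilated))  # 61
```

At equal depth and cost, the dilated stack sees 61×61 pixels instead of 9×9, which is the mechanism behind the paper's 307×307 vs. 99×99 comparison.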


The role of pooling layers:

A downsampling layer is also called a pooling layer. Its sliding-window operation is essentially the same as a convolution's, except that the "kernel" simply takes the maximum or the average of each window (max pooling or average pooling). Pooling layers have no learnable parameters, so nothing in them is updated during backpropagation.

Mathematically, pooling reduces the spatial dimensionality of the feature maps, which cuts the number of parameters in subsequent layers and lowers computational cost — conceptually similar to dimensionality-reduction techniques such as PCA.
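A minimal numpy sketch of 2×2 max pooling, illustrating the two points above: the operation has no weights to learn, and it halves the spatial resolution of the feature map.

```python
import numpy as np

def max_pool_2x2(x):
    """2x2 max pooling with stride 2 on an (H, W) feature map.

    H and W are assumed even. There are no learnable parameters:
    each non-overlapping 2x2 window is replaced by its maximum.
    """
    h, w = x.shape
    # Reshape into (H/2, 2, W/2, 2) blocks, then take the max per block.
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

x = np.array([[1, 2, 5, 6],
              [3, 4, 7, 8],
              [9, 1, 2, 3],
              [0, 5, 4, 1]], dtype=float)
print(max_pool_2x2(x))
# [[4. 8.]
#  [9. 4.]]
```

Average pooling is the same sketch with `.mean(axis=(1, 3))` in place of `.max(...)`.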

The local discriminator

The local context discriminator follows largely the same pattern as the global discriminator, except that its input is a small patch centered around the completed region rather than the full image. Note that during training only one completed region exists in each sample; after training, however, the completion network can fill in multiple holes at once. When the input is a real (already complete) image, a random patch is selected from within it to serve as the discriminator's input.

Concatenation

Finally, the outputs of the global and local discriminators are concatenated into a single 2048-dimensional vector, which is then processed by a fully connected layer that outputs one continuous value. A sigmoid activation maps this value into [0, 1], so it represents the probability that the image is real rather than completed.
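A numpy sketch of this fusion step. The 1024-d size of each branch's output (giving the 2048-d concatenation) matches the description above, but the random weights are purely illustrative — in the real model the fully connected layer is learned:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative outputs of the two discriminator branches (1024-d each).
global_feat = rng.standard_normal(1024)
local_feat = rng.standard_normal(1024)

# Concatenate, apply a fully connected layer (2048 -> 1), then a sigmoid.
fused = np.concatenate([global_feat, local_feat])   # shape (2048,)
W = rng.standard_normal((1, 2048)) * 0.01           # illustrative, untrained
b = np.zeros(1)
logit = W @ fused + b
prob_real = 1.0 / (1.0 + np.exp(-logit))            # always in (0, 1)

print(fused.shape, float(prob_real[0]))
```

The sigmoid guarantees the output lies in (0, 1) regardless of the logit's magnitude, which is what lets it be read as a probability of "real".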

A Keras implementation:

https://github.com/neka-nat/image_completion_tf2
