Remote Sensing for Natural Resources    2022, Vol. 34 Issue (3) : 65-72     DOI: 10.6046/zrzyyg.2021316
A method for color consistency of remote sensing images based on generative adversarial networks
WANG Yiru1,2, WANG Guanghui1,2, YANG Huachao1, LIU Huijie2
1. School of Environment and Spatial Informatics, China University of Mining and Technology, Xuzhou 221116, China
2. Land Satellite Remote Sensing Application Center, Ministry of Natural Resources, Beijing 100048, China
Abstract  

Remote sensing imagery is prone to uneven brightness and inconsistent colors both within and between images. Manual color adjustment with image-processing software can no longer keep pace with the exponentially growing volume of remote sensing data. To address this, this study proposes an unsupervised channel-attention cycle-consistent generative adversarial network (CA-CycleGAN) suited to the complex ground objects of densely developed urban areas. First, a color-reference sample set was produced manually through histogram adjustment and Photoshop editing, and suitable urban images were selected as the sample set to be corrected. Both image sets were then cropped to obtain the preprocessed samples. Finally, the preprocessed images to be corrected and the color-reference images were processed with the CA-CycleGAN. Because an attention mechanism is embedded in the generator, the attention feature maps steer generation toward key regions during the adversarial training between the generator and the discriminator, improving the output quality and yielding both a color-correction model for urban imagery and the color-corrected images. The correction results and the loss curves show that the proposed method improves on the baseline CycleGAN and that the attention-augmented CycleGAN outperforms the version without attention. Compared with conventional approaches, the proposed method greatly shortens the color-correction time and produces more stable results than manual color matching. It therefore offers clear advantages for color dodging of remote sensing images and has good application prospects.
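The abstract describes channel attention added inside the CycleGAN generator but does not publish code or exact layer sizes. As a rough illustration only, the PyTorch sketch below shows a squeeze-and-excitation style channel-attention block inserted into a standard CycleGAN residual block; the reduction ratio, channel count, and block placement are assumptions, not the authors' actual CA-CycleGAN structure (Fig.2, Fig.3).

```python
# Minimal sketch: channel attention inside a CycleGAN-style ResNet block.
# Layer sizes and the squeeze-and-excitation form are assumptions.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Re-weights feature channels so generation focuses on key regions."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)        # squeeze: global spatial average
        self.fc = nn.Sequential(                   # excitation: per-channel weights
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                # apply attention map to features


class AttentionResnetBlock(nn.Module):
    """CycleGAN-style residual block with channel attention on its output."""

    def __init__(self, channels: int = 256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.ReflectionPad2d(1),
            nn.Conv2d(channels, channels, kernel_size=3),
            nn.InstanceNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.ReflectionPad2d(1),
            nn.Conv2d(channels, channels, kernel_size=3),
            nn.InstanceNorm2d(channels),
        )
        self.attention = ChannelAttention(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.attention(self.conv(x))     # residual connection


if __name__ == "__main__":
    block = AttentionResnetBlock(256)
    feats = torch.randn(2, 256, 64, 64)             # features of cropped image tiles
    print(block(feats).shape)                       # torch.Size([2, 256, 64, 64])
```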

Keywords: remote sensing; image color correction; CycleGAN; urban satellite image; attention mechanism
CLC number: P236
Corresponding author: WANG Guanghui   E-mail: 1306347915@qq.com; wanggh@lasac.cn
Issue Date: 21 September 2022
Cite this article:   
Yiru WANG, Guanghui WANG, Huachao YANG, et al. A method for color consistency of remote sensing images based on generative adversarial networks[J]. Remote Sensing for Natural Resources, 2022, 34(3): 65-72.
URL:  
https://www.gtzyyg.com/EN/10.6046/zrzyyg.2021316     OR     https://www.gtzyyg.com/EN/Y2022/V34/I3/65
Fig.1  Schematic diagram of the attention-guided CycleGAN
Fig.2  Structure of channel attention
Fig.3  Network structure of generator and discriminator
Tab.1  Comparison of image color correction effects in small areas
Fig.4  Mosaic results after color correction by different methods
Fig.5  Loss function curves of different color correction methods
Method        D_A      G_A      cycle_A   idt_A    D_B      G_B      cycle_B   idt_B
CycleGAN      0.1565   0.4592   0.2159    0.1147   0.1669   0.5085   0.3004    0.1102
CA-CycleGAN   0.1674   0.4471   0.1888    0.1028   0.1633   0.4652   0.2970    0.0845
Tab.2  Loss values of different color correction methods in the last training cycle
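The column labels in Tab.2 follow the naming convention of common CycleGAN implementations: D_A/D_B are the discriminator losses, G_A/G_B the adversarial losses of the two generators, cycle_A/cycle_B the cycle-consistency terms, and idt_A/idt_B the identity-mapping terms for the two translation directions. This reading is an interpretation based on the standard CycleGAN loss decomposition (Zhu et al., ICCV 2017), which can be stated as:

$$
\mathcal{L}(G, F, D_A, D_B) =
\mathcal{L}_{\mathrm{GAN}}(G, D_B) + \mathcal{L}_{\mathrm{GAN}}(F, D_A)
+ \lambda_{\mathrm{cyc}}\,\mathcal{L}_{\mathrm{cyc}}(G, F)
+ \lambda_{\mathrm{idt}}\,\mathcal{L}_{\mathrm{idt}}(G, F),
$$

$$
\mathcal{L}_{\mathrm{cyc}}(G, F) = \mathbb{E}_{a}\!\left[\lVert F(G(a)) - a \rVert_1\right]
+ \mathbb{E}_{b}\!\left[\lVert G(F(b)) - b \rVert_1\right],
\qquad
\mathcal{L}_{\mathrm{idt}}(G, F) = \mathbb{E}_{b}\!\left[\lVert G(b) - b \rVert_1\right]
+ \mathbb{E}_{a}\!\left[\lVert F(a) - a \rVert_1\right],
$$

where G and F are the two generators mapping between the domain to be corrected and the color-reference domain, and the weights $\lambda_{\mathrm{cyc}}$ and $\lambda_{\mathrm{idt}}$ are hyperparameters not specified on this page.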
Fig.6  Color correction effect on the verification dataset