Showing 1 - 10 of 157 results for search '"Ishida, Kei"', query time: 0.75s
  1.
    Report

    Description: Deep learning has been utilized for the statistical downscaling of climate data. Specifically, a two-dimensional (2D) convolutional neural network (CNN) has been successfully applied to precipitation estimation. This study implements a three-dimensional (3D) CNN to estimate watershed-scale daily precipitation from 3D atmospheric data and compares the results with those of a 2D CNN. The 2D CNN is extended along the time direction (3D-CNN-Time) and the vertical direction (3D-CNN-Vert). The precipitation estimates of these extended CNNs are compared with those of the 2D CNN in terms of the root-mean-square error (RMSE), Nash-Sutcliffe efficiency (NSE), and 99th percentile RMSE. It is found that both 3D-CNN-Time and 3D-CNN-Vert improve the accuracy of precipitation estimation compared to the 2D CNN. 3D-CNN-Vert provided the best estimates during the training and test periods in terms of RMSE and NSE.

    Open Access: http://arxiv.org/abs/2112.06571
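
    The three evaluation metrics named in the abstract can be written down directly. The sketch below is a plain NumPy rendering; note that the 99th-percentile RMSE here restricts the error to days at or above the observed 99th percentile, which is one plausible reading and not necessarily the paper's exact definition:

```python
import numpy as np

def rmse(obs, sim):
    """Root-mean-square error between observed and simulated series."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the mean of obs."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2))

def rmse_p99(obs, sim):
    """RMSE restricted to days whose observed value is at or above the
    observed 99th percentile (an assumed reading of '99th percentile RMSE')."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    mask = obs >= np.percentile(obs, 99)
    return rmse(obs[mask], sim[mask])
```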

  2.
    Report

    Description: An architecture consisting of a serial coupling of a one-dimensional convolutional neural network (1D-CNN) and a long short-term memory (LSTM) network, referred to as CNNsLSTM, was proposed for hourly-scale rainfall-runoff modeling in this study. In CNNsLSTM, the CNN component receives hourly meteorological time-series data for a long duration, and the LSTM component then receives the features extracted by the 1D-CNN together with hourly meteorological time-series data for a short duration. As a case study, CNNsLSTM was implemented for hourly rainfall-runoff modeling at the Ishikari River watershed, Japan. The meteorological dataset, consisting of precipitation, air temperature, evapotranspiration, and long- and short-wave radiation, was utilized as input, and river flow was used as the target data. To evaluate the performance of the proposed CNNsLSTM, its results were compared with those of the 1D-CNN, the LSTM with only hourly inputs (LSTMwHour), a parallel architecture of 1D-CNN and LSTM (CNNpLSTM), and an LSTM architecture that uses both daily and hourly input data (LSTMwDpH). CNNsLSTM showed clear improvements in estimation accuracy over the three conventional architectures (1D-CNN, LSTMwHour, and CNNpLSTM) and the recently proposed LSTMwDpH. In comparison to observed flows, the median NSE values for the test period are 0.455-0.469 for the 1D-CNN (based on NCHF = 8, 16, and 32, where NCHF is the number of channels in the feature map of the first CNN layer), 0.639-0.656 for CNNpLSTM (based on NCHF = 8, 16, and 32), 0.745 for LSTMwHour, 0.831 for LSTMwDpH, and 0.865-0.873 for CNNsLSTM (based on NCHF = 8, 16, and 32). Furthermore, the proposed CNNsLSTM reduces the median RMSE of the 1D-CNN by 50.2%-51.4%, of CNNpLSTM by 37.4%-40.8%, of LSTMwHour by 27.3%-29.5%, and of LSTMwDpH by 10.6%-13.4%.
    Comment: 18 pages, 9 figures

    Open Access: http://arxiv.org/abs/2111.04732
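
    The serial coupling described above, a 1D-CNN digesting a long hourly record whose pooled features are then fed, together with recent hourly inputs, into an LSTM, can be illustrated with a toy NumPy sketch. Everything here (kernel count, hidden size, global-average pooling, the final sum standing in for an output layer) is an assumption for illustration, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_features(x, kernels):
    """Toy 1D-CNN stage: one valid-mode convolution per kernel over the
    long-duration series, followed by global average pooling per channel."""
    return np.array([np.convolve(x, k, mode="valid").mean() for k in kernels])

def lstm_cell(x, h, c, W, U, b):
    """One LSTM cell step (gates: input, forget, output, plus candidate)."""
    n = h.size
    z = W @ x + U @ h + b
    i, f, o = (1.0 / (1.0 + np.exp(-z[k * n:(k + 1) * n])) for k in range(3))
    g = np.tanh(z[3 * n:])
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

# Long hourly record (30 days) and a short recent window (24 h) -- toy sizes.
long_series = rng.normal(size=720)
short_series = rng.normal(size=24)

kernels = rng.normal(size=(4, 48))      # 4 hypothetical CNN channels
feats = conv1d_features(long_series, kernels)

n_hidden = 8
n_in = feats.size + 1                   # CNN features + one hourly input
W = rng.normal(size=(4 * n_hidden, n_in)) * 0.1
U = rng.normal(size=(4 * n_hidden, n_hidden)) * 0.1
b = np.zeros(4 * n_hidden)

h, c = np.zeros(n_hidden), np.zeros(n_hidden)
for x_t in short_series:                # serial stage: LSTM sees CNN features
    h, c = lstm_cell(np.concatenate([feats, [x_t]]), h, c, W, U, b)

discharge = h.sum()                     # stand-in for a learned output layer
```

The key point of the serial (as opposed to parallel) coupling is that the LSTM's per-step input already contains the long-duration summary, so the recurrent part only has to unroll over the short window.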

  3.
    Report

    Description: This study investigates the relationships that deep learning methods can identify between input and output data. As a case study, rainfall-runoff modeling in a snow-dominated watershed by means of a long short-term memory (LSTM) network is selected. Daily precipitation and mean air temperature were used as model input to estimate daily flow discharge. After model training and verification, two experimental simulations were conducted with hypothetical inputs instead of observed meteorological data to clarify the response of the trained model to the inputs. The first numerical experiment showed that, even without input precipitation, the trained model generated flow discharge, particularly winter low flow and high flow during the snow-melting period. The effects of warmer and colder conditions on the flow discharge were also replicated by the trained model without precipitation. Additionally, the model reflected only 17-39% of the total precipitation mass during the snow accumulation period in the total annual flow discharge, revealing a strong lack of water mass conservation. The results of this study indicate that deep learning methods may not properly learn the explicit physical relationships between input and target variables, even though they are still capable of maintaining strong goodness-of-fit results.
    Comment: 8 pages, 5 figures

    Open Access: http://arxiv.org/abs/2106.07963
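
    The mass-conservation check described above amounts to running the trained model twice, once with observed forcing and once with precipitation zeroed out, and relating the difference in total flow to the total precipitation mass. A minimal sketch, with a hypothetical toy_model standing in for the trained LSTM (the real experiment used the trained network, not a closed-form rule):

```python
import numpy as np

def mass_balance_ratio(model, precip, temp):
    """Fraction of the precipitation mass that appears in simulated flow:
    compare a run with observed precipitation against a zero-precipitation run."""
    q_with = model(precip, temp)
    q_zero = model(np.zeros_like(precip), temp)
    return float((q_with.sum() - q_zero.sum()) / precip.sum())

# Hypothetical stand-in model: flow = 40% of precipitation plus a
# temperature-driven baseflow term (so the "true" ratio is 0.4 by design).
def toy_model(precip, temp):
    return 0.4 * precip + 0.1 * np.maximum(temp, 0.0)

precip = np.full(365, 3.0)                 # daily precipitation [mm]
temp = np.linspace(-5.0, 25.0, 365)        # daily mean air temperature [degC]
ratio = mass_balance_ratio(toy_model, precip, temp)
```

A physically consistent model would return a ratio near the watershed's runoff coefficient; the paper's finding is that the trained LSTM returned only 17-39% for snow-season precipitation.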

  4.
    Report

    Description: This study proposes two straightforward yet effective approaches to reducing the computational time required to train a recurrent neural network (RNN) for time-series modeling with multi-time-scale data as input. One approach provides coarse and fine temporal resolutions of the input time series to the RNN in parallel. The other concatenates the coarse and fine temporal resolutions of the input time series over time before using them as the RNN input. In both approaches, the finer-resolution data are used to learn the fine temporal-scale behavior of the target data, while the coarser-resolution data are expected to capture long-duration dependencies between the input and target variables. The proposed approaches were implemented for hourly rainfall-runoff modeling at a snow-dominated watershed by employing a long short-term memory (LSTM) network, a newer type of RNN. Daily and hourly meteorological data were utilized as the input, and hourly flow discharge was considered as the target data. The results confirm that both proposed approaches significantly reduce the computational time for training the RNN (by a factor of up to 32.4), and one of them also improves the estimation accuracy.
    Comment: 11 pages, 5 figures

    Open Access: http://arxiv.org/abs/2103.10932
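
    The two input arrangements described above can be sketched for a single hourly series: a coarse daily-mean view plus a short fine-resolution window, delivered either as two parallel branches or concatenated over time into one shorter sequence. The window lengths below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def coarse_fine_inputs(hourly, fine_hours=72):
    """Build the two multi-time-scale views of an hourly series:
    a coarse daily-mean sequence plus the most recent fine (hourly) window."""
    daily = hourly.reshape(-1, 24).mean(axis=1)    # coarse resolution
    fine = hourly[-fine_hours:]                    # fine resolution
    parallel = (daily, fine)                       # fed to two RNN branches
    concatenated = np.concatenate([daily, fine])   # fed as one sequence
    return parallel, concatenated

hourly = np.arange(240, dtype=float)               # 10 days of hourly data
(par_daily, par_fine), concat = coarse_fine_inputs(hourly)
```

The training speed-up comes from sequence length: here the RNN unrolls over 10 + 72 = 82 steps instead of all 240 hourly steps, while the daily means still carry the long-duration signal.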

  5.
    Academic Journal

    Contributors: Japan Society for the Promotion of Science, Japan Agency for Medical Research and Development

    Source: Frontiers in Immunology; volume 14; ISSN 1664-3224

    Description: Macrophages manifest as various subtypes that play diverse and important roles in immunosurveillance and the maintenance of immunological homeostasis in various tissues. Many in vitro studies divide macrophages into two broad groups: M1 macrophages induced by lipopolysaccharide (LPS), and M2 macrophages induced by interleukin 4 (IL-4). However, considering the complex and diverse microenvironment in vivo, the M1/M2 concept is not sufficient to explain the diversity of macrophages. In this study, we analyzed the functions of macrophages induced by simultaneous stimulation with LPS and IL-4 (termed LPS/IL-4-induced macrophages). LPS/IL-4-induced macrophages were a homogeneous population showing a mixture of the characteristics of M1 and M2 macrophages. In LPS/IL-4-induced macrophages, expression of the cell-surface M1 marker I-A^b was higher than in M1 macrophages, whereas expression of iNOS was lower and expression of the M1-associated genes Tnfα and Il12p40 was decreased in comparison to M1 macrophages. Conversely, expression of the cell-surface M2 marker CD206 was lower on LPS/IL-4-induced macrophages than on M2 macrophages, and expression of the M2-associated genes Arg1, Chi3l3, and Fizz1 varied: Arg1 was greater than, Fizz1 lower than, and Chi3l3 comparable to that in M2 macrophages. Glycolysis-dependent phagocytic activity of LPS/IL-4-induced macrophages was strongly enhanced, as was that of M1 macrophages; however, the energy metabolism of LPS/IL-4-induced macrophages, such as the activation states of glycolysis and oxidative phosphorylation, was quite different from that of M1 or M2 macrophages. These results indicate that macrophages induced by LPS and IL-4 have unique properties.

  6.
    Thesis

    Authors: Ishida, Kei

    Thesis advisors: 河地, 利彦, 村上, 章, 川島, 茂人, 石田, 桂, イシダ, ケイ

    Subject terms: 610

    Description: Kyoto University (京都大学)
    0048
    甲第16899号
    農博第1915号
    新制||農||997(附属図書館)
    学位論文||H24||N4660(農学部図書室)
    29574
    学位規則第4条第1項該当

    File description: application/pdf

  7.
    Academic Journal
  8.
    Academic Journal
  9.
    Academic Journal
  10.
    Academic Journal